Hyunchul Kim is a researcher, developer, and interaction designer exploring the intersection of technology, society, and art. He is currently pursuing a PhD at KAIST’s Graduate School of Culture Technology under the supervision of Prof. Jeongmi Lee. His research focuses on embodied communication experiences—mediated or generated by computers—using behavioral and physiological data. In addition to his academic work, he has contributed as a developer to various art projects. He is also training for his fourth marathon, which he plans to run in March 2025.
Quantifying Social Connection With Verbal and Non-Verbal Behaviors in Virtual Reality Conversations, Hyunchul Kim and Jeongmi Lee, CHI 2025
This study identifies verbal and nonverbal behavioral indices of perceived social experience in virtual conversations. Analyzing turn-taking patterns and behavioral data from dyadic VR conversations, we find that faster response times, longer speech and gaze durations, and increased nodding predict dynamic changes in social experience. These findings offer objective, unobtrusive measures for enhancing social VR interactions.
Quantifying Proactive and Reactive Button Input, Hyunchul Kim, Kasper Hornbæk, and Byungjoo Lee, CHI 2022
We propose a technique to quantify users' reactiveness and proactiveness in button inputs by modeling the likelihood distribution of input-output timing. Using only screen recordings and input logs, our method estimates the probability of each strategy and its parameters. Empirical studies demonstrate its application in designing animated transitions and predicting player scores in real-time games.
Automated Playtesting with a Cognitive Model of Sensorimotor Coordination, Injung Lee, Hyunchul Kim, and Byungjoo Lee, ACM Multimedia 2021
We propose an automated playtesting technique based on a cognitive model of sensorimotor coordination to predict game difficulty for players with different skill levels. Applying this method to two MTA games, we show that difficulty can be estimated at the population level (e.g., for seniors).
Posters & Demos
Turn-taking Patterns Reflect Social Connection in Virtual Reality Conversations, Hyunchul Kim and Jeongmi Lee, ISMAR 2024 Poster, Best Poster Award ୧( “̮ )୨✧
Chameleon: VR Drawing Application, Hyunchul Kim, Nayeon Kim, Sunghye Kwon, and Shinho Moon, VR/AR Grand Challenge 2018, Minister of Science and ICT Award ヽ( ´¬`)ノ
Predicting the Attentional State under Varying Cognitive Load Levels Using Physiological Data, Hyunchul Kim¹, Eugene Hwang¹, Hyunyoung Jang, and Jeongmi Lee, The Annual Conference of the KSCBP 2025
Quantifying Social Connection With Verbal and Non-Verbal Behaviors in Virtual Reality Conversations, Hyunchul Kim and Jeongmi Lee, The Annual Conference of the KSCBP 2025, Outstanding Presentation Award (*ˊᵕˋ*)ノ
The Effect of Cognitive Demand on the Strategy of Utilizing Peripheral Vision, Hyunchul Kim, Dawon Lee, Junyong Noh, and Jeongmi Lee, The Annual Conference of the KSCBP 2023, Outstanding Presentation Award ง •̀_•́)ง
The Effect of Target Speed and Uncertainty on Shooting Game Performance, Hyunchul Kim and Jeongmi Lee, The Annual Conference of the KSCBP 2022
Unpublished Works
LeverPointing: a VR pointing technique for distant objects. Collaboration with Soomin Park, Maria Kostelova, Taehee Kim, Sunghyun Kim, Inseo Jang, and Byungjoo Lee
StringFab. Collaboration with Chaeeun Lee, Jaehoon Pyo, and Aigerim Shunayeva
CURVEilance: a Multi-user-based Interactive Media Wall, SIGGRAPH Asia 2020 Art Gallery (Online, Tokyo). Collaboration with Seonghyeon Kim, Jiyoung Jun, and Jooyoung Oh
____ in the loop, PLAY ON AI 2024, Art Center Nabi, Seoul. Collaboration with Chanu Lee, Michael Park, and Soomin Park
The Skin, Daejeon AI Biennale: Sunshine Misses Windows, Daejeon Museum of Art, Daejeon. Collaboration with Sanghwa Hong*, Seonghyeon Kim, and Byungjoo Lee