Enhancing Sense of Embodiment through Automated Rubber Hand Illusion and Its Assessment
Project source
Principal investigator
Host institution
Project number
Year of approval
Approval date
Project level
Research period
Funding amount
Discipline
Discipline code
Funding category
Keywords
Participants
Participating institutions
1. Electro-Oculography and Proprioceptive Calibration Enable Horizontal and Vertical Gaze Estimation, Even with Eyes Closed
- Keywords:
- electro-oculography; signal processing; eyes closed; gaze direction estimation; MOVEMENTS; PERFORMANCE; TRACKING; SLEEP
- Wei, Xin; Dollack, Felix; Kiyokawa, Kiyoshi; Perusquia-Hernandez, Monica
- 《SENSORS》
- 2025
- Volume 25
- Issue 21
- Journal article
Eye movement is an important tool for investigating cognition. It also serves as input in human-computer interfaces for assistive technology. It can be measured with camera-based eye tracking and electro-oculography (EOG). EOG does not rely on eye visibility and can be measured even when the eyes are closed. We investigated the feasibility of estimating gaze direction from EOG with the eyes closed. A total of 15 participants performed a proprioceptive calibration task with open and closed eyes, while their eye movement was recorded with a camera-based eye tracker and with EOG. The calibration was guided by the participants' hand motions following a pattern of felt dots on cardboard. Our cross-correlation analysis revealed reliable temporal synchronization between gaze-related signals and the instructed trajectory across all conditions. Statistical comparison tests and equivalence tests demonstrated that EOG tracking was statistically equivalent to the camera-based eye tracker's gaze direction during the eyes-open condition. The camera-based eye-tracking glasses do not support tracking with closed eyes; therefore, we evaluated the EOG-based gaze estimates during the eyes-closed trials by comparing them to the instructed trajectory. The results showed that EOG signals, guided by proprioceptive cues, followed the instructed path and achieved significantly greater accuracy than shuffled control data, which represented chance-level performance. This demonstrates the advantage of EOG when camera-based eye tracking is infeasible, and it paves the way for the development of eye-movement input interfaces for blind people, research on eye movement direction when the eyes are closed, and the early detection of diseases.
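The abstract's cross-correlation analysis — aligning a gaze-related signal with the instructed trajectory to check temporal synchronization — can be sketched as below. This is a minimal illustration of the general technique, not the authors' pipeline; the signal shapes, sampling, and the 5-sample delay are made-up assumptions for demonstration.

```python
import numpy as np

def lag_via_cross_correlation(gaze, trajectory):
    """Estimate the temporal lag (in samples) between a gaze-related signal
    (e.g., horizontal EOG) and an instructed trajectory by locating the peak
    of their normalized cross-correlation."""
    g = (gaze - np.mean(gaze)) / (np.std(gaze) + 1e-12)
    t = (trajectory - np.mean(trajectory)) / (np.std(trajectory) + 1e-12)
    corr = np.correlate(g, t, mode="full")
    # For mode="full", output index 0 corresponds to lag -(len(t) - 1).
    lags = np.arange(-len(t) + 1, len(g))
    best_lag = int(lags[np.argmax(corr)])
    peak = float(corr.max()) / len(g)  # rough normalized peak correlation
    return best_lag, peak

# Synthetic check: a trajectory delayed by 5 samples should yield lag 5.
rng = np.random.default_rng(0)
traj = np.sin(np.linspace(0, 4 * np.pi, 200))
gaze = np.roll(traj, 5) + 0.05 * rng.standard_normal(200)
lag, peak = lag_via_cross_correlation(gaze, traj)
```

A positive `lag` means the gaze signal trails the instruction; the normalized peak gives a rough measure of how reliably the two traces co-vary.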
2. Large Language Models as Perceivers of Dynamic Full-Body Expressions of Emotion
- Keywords:
- Behavioral research; Human computer interaction; Affective computing; Body motions; Full body; Human emotion; Human judgments; Human motions; Human perception; Language model; Large language model; Motion description
- Liu, Huakun; Cheng, Miao; Wei, Xin; Dollack, Felix; Schneider, Victor; Uchiyama, Hideaki; Kitamura, Yoshifumi; Kiyokawa, Kiyoshi; Perusquia-Hernandez, Monica
- 《27th International Conference on Multimodal Interaction, ICMI 2025》
- 2025
- October 13, 2025 - October 17, 2025
- Canberra, ACT, Australia
- Conference paper
Human emotion expressed through body motion is often interpreted differently by different perceivers. We explored the use of large language models (LLMs) to simulate humans' judgment variability and model the distribution of perceived emotions from body motion. Given textual motion descriptions derived from a multi-actor dataset covering diverse emotional contexts, we prompted LLMs to generate emotion probability distributions. We compared LLM-generated outputs with human perception distributions and performer-intended emotion labels. The model showed strong perceptual alignment with human perceivers and achieved an accuracy of 63.2% in identifying the intended emotions, comparable to that of human perceivers. We further investigated the effects of context and scoring constraints, showing that both factors influence the model's scoring. Our findings suggest that LLMs can serve as effective proxies for modeling emotion perception distributions from motion, supporting scalable evaluation and generation of expressive behaviors grounded in human perception.
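The evaluation described above — comparing an LLM-generated emotion distribution against a human perception distribution, and scoring top-1 accuracy against performer-intended labels — could be implemented along these lines. This is a hedged sketch: the seven-emotion label set, the distributions, and the use of Jensen-Shannon divergence as the similarity measure are illustrative assumptions, not details from the paper.

```python
import math

# Hypothetical emotion label set (assumption, not from the paper).
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence (base 2) between two discrete distributions;
    0 for identical distributions, 1 for fully disjoint ones."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    def kl(a, b):
        return sum(ai * math.log2((ai + eps) / (bi + eps)) for ai, bi in zip(a, b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def top1_accuracy(pred_dists, intended_labels):
    """Fraction of samples whose highest-probability emotion matches
    the performer-intended label."""
    hits = sum(
        1
        for dist, label in zip(pred_dists, intended_labels)
        if EMOTIONS[max(range(len(dist)), key=dist.__getitem__)] == label
    )
    return hits / len(intended_labels)

# Illustrative (made-up) distributions for one motion clip.
llm = [0.05, 0.02, 0.03, 0.60, 0.10, 0.15, 0.05]   # hypothetical LLM scores
human = [0.04, 0.03, 0.05, 0.55, 0.12, 0.16, 0.05]  # hypothetical human ratings
divergence = js_divergence(llm, human)
accuracy = top1_accuracy([llm], ["happiness"])
```

Lower divergence indicates closer alignment between model and human perception distributions, while top-1 accuracy captures agreement with the performer's intended emotion — the two complementary measures the abstract reports.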
...
