Enhancing Sense of Embodiment through Automated Rubber Hand Illusion and Its Assessment
1. Large Language Models as Perceivers of Dynamic Full-Body Expressions of Emotion
- Keywords: Behavioral research; Human-computer interaction; Affective computing; Body motions; Full body; Human emotion; Human judgments; Human motions; Human perception; Language model; Large language model; Motion description
- Authors: Liu, Huakun; Cheng, Miao; Wei, Xin; Dollack, Felix; Schneider, Victor; Uchiyama, Hideaki; Kitamura, Yoshifumi; Kiyokawa, Kiyoshi; Perusquia-Hernandez, Monica
- 27th International Conference on Multimodal Interaction, ICMI 2025
- 2025
- October 13, 2025 - October 17, 2025
- Canberra, ACT, Australia
- Conference
Human emotion expressed through body motions is often interpreted differently by different perceivers. We explored the use of large language models (LLMs) to simulate humans' judgment variability and model the distribution of perceived emotions from body motion. Given textual motion descriptions derived from a multi-actor dataset covering diverse emotional contexts, we prompted LLMs to generate emotion probability distributions. We compared LLM-generated outputs with human perception distributions and performer-intended emotion labels. The model showed strong perceptual alignment with human perceivers and achieved an accuracy of 63.2% in identifying the intended emotions, comparable to that of human perceivers. We further investigated the effects of context and scoring constraints, showing that both factors influence the model's scoring. Our findings suggest that LLMs can serve as effective proxies for modeling emotion perception distributions from motions, supporting scalable evaluation and generation of expressive behaviors grounded in human perception. © 2025 Copyright held by the owner/author(s)
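The evaluation described in the abstract pairs two measures: agreement between an LLM-generated emotion distribution and the human-perception distribution, and accuracy against the performer-intended label. The paper does not specify its comparison metric, so the sketch below is a hedged illustration using Jensen-Shannon divergence; the emotion label set and the example distributions are hypothetical, not from the dataset.

```python
import math

def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2) between two discrete distributions."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    def kl(a, b):
        # KL divergence, skipping zero-probability terms.
        return sum(ai * math.log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical emotion categories (the actual label set is dataset-specific).
EMOTIONS = ["anger", "fear", "happiness", "sadness"]

# Illustrative distributions: one elicited from an LLM prompted with a motion
# description, one aggregated from human perceiver ratings.
llm_dist = [0.10, 0.15, 0.60, 0.15]
human_dist = [0.05, 0.20, 0.55, 0.20]

# Perceptual alignment: lower divergence = closer to human perception.
divergence = js_divergence(llm_dist, human_dist)

# Intended-emotion accuracy: does the model's top-scoring category match
# the performer's intended label?
intended = "happiness"
predicted = EMOTIONS[llm_dist.index(max(llm_dist))]
correct = predicted == intended
```

Averaging the top-1 correctness over all motion clips would yield an accuracy figure comparable to the 63.2% reported, while averaging the divergence would summarize distribution-level alignment.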
...
