What Sensors See, What People Feel: Exploring Subjective Collaboration Perception in Mixed Reality

Source: arXiv
Abstract

Mixed Reality (MR) enables rich, embodied collaboration, yet it is unclear whether sensor and system-logged behavioral signals capture how users actually experience that collaboration. This disconnect stems from a fundamental gap: behavioral signals are observable and continuous, while collaboration is interpreted subjectively, shaped by internal states such as presence, cognitive availability, and social awareness. Our core insight is that sensor signals serve as observable manifestations of subjective experiences in MR collaboration; these signals include shared gaze, speech, spatial movement, and other system-logged performance metrics. We propose the Sensor-to-Subjective (S2S) Mapping Framework, a conceptual model that links observable interaction patterns to users' subjective perceptions of collaboration and internal cognitive states through sensor-based indicators and task performance metrics. To validate this model, we conducted a study with 48 participants across 12 MR groups engaged in a collaborative image-sorting task. Our findings show a correlation between sensed behavior and perceived collaboration, particularly through shared attention and proximity.
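The validation described above boils down to correlating a sensor-derived indicator with subjective survey ratings. A minimal sketch of that analysis step, using entirely hypothetical numbers (the variable names and values below are illustrative assumptions, not data from the paper):

```python
# Hypothetical S2S-style check: correlate a sensor-based indicator
# (e.g., per-group shared-attention ratio) with subjective collaboration
# ratings. All numbers are made up for illustration.
from statistics import mean, stdev

def pearson(xs, ys):
    """Sample Pearson correlation between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# Fraction of task time with overlapping gaze targets, one value per group.
shared_attention = [0.42, 0.55, 0.31, 0.68, 0.50, 0.74]
# Mean post-task Likert rating of perceived collaboration, per group.
perceived_collab = [3.4, 3.8, 2.6, 4.1, 3.0, 4.7]

r = pearson(shared_attention, perceived_collab)
print(f"r = {r:.2f}")
```

In practice one would also report a p-value (e.g., via `scipy.stats.pearsonr`) and correct for multiple comparisons when testing several indicators.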

Yasra Chandio, Diana Romero, Salma Elmalaki, Fatima Anwar

Subjects: Computing and Computer Technology; Telecontrol Technology

Yasra Chandio, Diana Romero, Salma Elmalaki, Fatima Anwar. What Sensors See, What People Feel: Exploring Subjective Collaboration Perception in Mixed Reality [EB/OL]. (2025-04-22) [2025-05-25]. https://arxiv.org/abs/2504.16373.
