
Measuring interaction proxemics with wearable light tags

Montanari, Alessandro; Tian, Zhao; Francu, Elena; Lucas, Benjamin; Jones, Brian; Zhou, Xia; Mascolo, Cecilia



Authors

Alessandro Montanari

Zhao Tian

Elena Francu


Benjamin Lucas Benjamin.Lucas@nottingham.ac.uk
Research & Knowledge Exchange Development Manager

Brian Jones

Xia Zhou

Cecilia Mascolo



Abstract

The proxemics of social interactions (e.g., body distance, relative orientation) influences many aspects of our everyday life: from patients’ reactions to interaction with physicians, successes in job interviews, to effective teamwork. Traditionally, interaction proxemics has been studied via questionnaires and participant observations, imposing high burden on users, low scalability and precision, and often biases. In this paper we present Protractor, a novel wearable technology for measuring interaction proxemics as part of non-verbal behavior cues with fine granularity. Protractor employs near-infrared light to monitor both the distance and relative body orientation of interacting users. We leverage the characteristics of near-infrared light (i.e., line-of-sight propagation) to accurately and reliably identify interactions; a pair of collocated photodiodes aid the inference of relative interaction angle and distance. We achieve robustness against temporary blockage of the light channel (e.g., by the user’s hand or clothes) by designing sensor fusion algorithms that exploit inertial sensors to obviate the absence of light tracking results. We fabricated Protractor tags and conducted real-world experiments. Results show its accuracy in tracking body distances and relative angles. The framework achieves less than 6° error 95% of the time for measuring relative body orientation and 2.3-cm – 4.9-cm mean error in estimating interaction distance. We deployed Protractor tags to track users’ non-verbal behaviors when conducting collaborative group tasks. Results with 64 participants show that distance and angle data from Protractor tags can help assess an individual’s task role with 84.9% accuracy, and identify task timeline with 93.2% accuracy.
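
To make the dual-photodiode idea concrete, the sketch below shows one simple way a pair of collocated, differently tilted photodiodes could yield both relative angle and distance: under a Lambertian (cosine) response model, the ratio of the two readings cancels out distance and emitter power and leaves only the incidence angle, after which distance follows from either reading and a calibration constant. This is an illustrative model only, with assumed names and constants; it is not the algorithm described in the paper.

```python
import math

# Illustrative sketch (assumed model, not the paper's algorithm):
# two co-located photodiodes tilted by +/- delta from the tag's normal,
# each with a Lambertian response I_k = C * cos(theta -/+ delta) / d**2.

def estimate_angle_and_distance(i1, i2, delta_rad, calib_c):
    """Estimate incidence angle (rad) and distance from two photodiode readings."""
    r = i1 / i2
    # The ratio cancels distance and emitter power:
    #   r = cos(theta - delta) / cos(theta + delta)
    # which rearranges to tan(theta) = (r - 1) / (r + 1) / tan(delta).
    theta = math.atan((r - 1.0) / (r + 1.0) / math.tan(delta_rad))
    # With theta known, recover distance from either diode.
    d = math.sqrt(calib_c * math.cos(theta - delta_rad) / i1)
    return theta, d

# Example with arbitrary readings, diodes tilted +/-15 degrees, unit calibration.
theta, d = estimate_angle_and_distance(i1=0.82, i2=0.61,
                                       delta_rad=math.radians(15),
                                       calib_c=1.0)
print(f"angle ~ {math.degrees(theta):.1f} deg, distance ~ {d:.2f} (calibration units)")
```

In practice such an estimate would be fused with inertial data to bridge moments when the light channel is blocked, which is the role the abstract attributes to the sensor fusion algorithms.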

Citation

Montanari, A., Tian, Z., Francu, E., Lucas, B., Jones, B., Zhou, X., & Mascolo, C. (2018). Measuring interaction proxemics with wearable light tags. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2(1). https://doi.org/10.1145/3191757

Journal Article Type Article
Acceptance Date Jan 1, 2018
Publication Date Mar 31, 2018
Deposit Date Mar 28, 2018
Publicly Available Date Mar 31, 2018
Journal Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Electronic ISSN 2474-9567
Publisher Association for Computing Machinery (ACM)
Peer Reviewed Peer Reviewed
Volume 2
Issue 1
DOI https://doi.org/10.1145/3191757
Keywords Human-centered computing→Ubiquitous and mobile computing systems and tools; Computer systems organization→Embedded systems; Face-to-face interactions, non-verbal behaviors, light sensing
Public URL https://nottingham-repository.worktribe.com/output/923140
Publisher URL https://dl.acm.org/citation.cfm?doid=3200905.3191757
Additional Information © ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (Volume 2, Issue 1, March 2018) (http://doi.acm.org/10.1145/3191757).
