EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices
Author: Future Interfaces Group
Uploaded: 2021-10-20
Views: 6894
Description:
As smartphone screens have grown in size, single-handed use has
become more cumbersome. Interactive targets that are easily seen
can be hard to reach, particularly notifications and upper menu bar
items. Users must either adjust their grip to reach distant targets, or
use their other hand. In this research, we show how gaze estimation
using a phone’s user-facing camera can be paired with IMU-tracked
motion gestures to enable a new, intuitive, and rapid interaction
technique on handheld phones. We describe our proof-of-concept
implementation and gesture set, built on state-of-the-art techniques
and capable of self-contained execution on a smartphone. In our
user study, we found a mean Euclidean gaze error of 1.7 cm and a
seven-class motion gesture classification accuracy of 97.3%.
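The interaction described above can be sketched as a simple fusion loop: the gaze estimate selects the on-screen target, and the IMU-classified motion gesture triggers an action on it. The sketch below is illustrative only; the target names, gesture labels, and action mapping are hypothetical and not the authors' implementation, and the 1.7 cm threshold simply reuses the mean gaze error reported in the study.

```python
from dataclasses import dataclass

@dataclass
class Target:
    """An on-screen interactive element, with position in cm."""
    name: str
    x: float
    y: float

# Hypothetical mapping from IMU-classified gestures to actions.
GESTURE_ACTIONS = {
    "flick_left": "dismiss",
    "flick_right": "save",
    "pull_toward": "open",
}

def nearest_target(gaze_xy, targets, max_err_cm=1.7):
    """Pick the target closest to the estimated gaze point, accepting it
    only if it lies within the study's mean gaze error (~1.7 cm)."""
    gx, gy = gaze_xy
    best = min(targets, key=lambda t: ((t.x - gx) ** 2 + (t.y - gy) ** 2) ** 0.5)
    dist = ((best.x - gx) ** 2 + (best.y - gy) ** 2) ** 0.5
    return best if dist <= max_err_cm else None

def handle(gaze_xy, gesture, targets):
    """Combine the gaze-selected target with an IMU gesture class;
    return (action, target_name) or None if nothing is selected."""
    target = nearest_target(gaze_xy, targets)
    if target is None or gesture not in GESTURE_ACTIONS:
        return None
    return (GESTURE_ACTIONS[gesture], target.name)

targets = [Target("notification", 1.0, 0.5), Target("menu_item", 5.0, 0.2)]
print(handle((1.2, 0.4), "flick_left", targets))  # ('dismiss', 'notification')
```

In the real system, `gaze_xy` would come from a camera-based gaze estimator and `gesture` from an IMU gesture classifier running on-device; this sketch only shows how the two signals combine into a single interaction.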
Citation:
Andy Kong, Karan Ahuja, Mayank Goel, and Chris Harrison. 2021. EyeMU Interactions: Gaze + IMU Gestures on Mobile Devices. In Proceedings of the 2021 International Conference on Multimodal Interaction (ICMI '21). Association for Computing Machinery, New York, NY, USA, 577–585. DOI: https://doi.org/10.1145/3462244.3479938