AmplitudeArrow: On-the-Go AR Menu Selection Using Consecutive Simple Head Gestures and Amplitude Visualization
Yang Tian,
Youpeng Zhang,
Yukang Yan,
Shengdong Zhao,
Xiaojuan Ma,
Yuanchun Shi.
Published at TVCG, 2025

Abstract
Heads-up computing aims to provide synergistic digital assistance that minimally interferes with users' on-the-go daily activities. Currently, the input modalities of heads-up computing are mainly voice and finger gestures. In this work, we propose and evaluate the AmplitudeArrow (AA) technique for on-the-go AR menu selection, demonstrating that consecutive simple head gestures can also serve as an effective input modality for heads-up computing. Specifically, AA arranges menu icons into one or two rows. To select a target icon, the user first yaws their head to pre-select the target icon or the column containing it, and then pitches their head to expand the arrow inside the target icon until it covers the icon completely, i.e., until the pitch amplitude surpasses the selection confirmation threshold. User studies indicated that AA was robust to head perturbation caused by walking and to external factors such as nearby people and obstacles, delivering high accuracy (error rate < 5%) and fast speed (< 1.5 s per selection) when no more than six icon columns (twelve icons) were distributed horizontally and evenly across a menu area spanning a horizontal visual angle of 43°.
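The two-stage selection described above (yaw to pre-select a column, then accumulate pitch amplitude until it crosses a confirmation threshold) can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation; the function names, the six-column layout, and the threshold value are assumptions chosen to match the parameters reported in the abstract.

```python
# Hypothetical sketch of AmplitudeArrow-style selection logic.
# MENU_SPAN_DEG and NUM_COLUMNS follow the abstract (43°, six columns);
# PITCH_THRESHOLD_DEG is an assumed value, not taken from the paper.

MENU_SPAN_DEG = 43.0        # horizontal visual angle of the menu area
NUM_COLUMNS = 6             # icon columns, evenly distributed
PITCH_THRESHOLD_DEG = 10.0  # assumed selection confirmation threshold

def preselect_column(yaw_deg: float):
    """Map a head yaw angle (0 = left edge of the menu) to a column index,
    or None if the head points outside the menu area."""
    if not 0.0 <= yaw_deg <= MENU_SPAN_DEG:
        return None
    col = int(yaw_deg / (MENU_SPAN_DEG / NUM_COLUMNS))
    return min(col, NUM_COLUMNS - 1)  # clamp the right edge into the last column

def confirm_selection(pitch_amplitude_deg: float) -> bool:
    """Selection is confirmed once the accumulated pitch amplitude exceeds
    the threshold, i.e., the arrow has fully covered the target icon."""
    return pitch_amplitude_deg >= PITCH_THRESHOLD_DEG
```

A selection would then be a pre-selection followed by a confirmation check, e.g. `preselect_column(10.0)` picks the second column and `confirm_selection(12.0)` confirms it.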