VirtualGrasp: Leveraging Experience of Interacting with Physical Objects to Facilitate Digital Object Retrieval
Yukang Yan, Chun Yu, Xiaojuan Ma, Xin Yi, Ke Sun, Yuanchun Shi
Published at ACM CHI 2018
Abstract
We propose VirtualGrasp, a novel gestural approach for retrieving virtual objects in virtual reality. Using VirtualGrasp, a user retrieves an object by performing a barehanded gesture as if grasping its physical counterpart. The object-gesture mapping under this metaphor is highly intuitive, which enables users to easily discover and remember the gestures for retrieving objects. We conducted three user studies to demonstrate the feasibility and effectiveness of the approach. Progressively, we investigated the consensus of the object-gesture mapping across users, the expressivity of grasping gestures, and the learnability and performance of the approach. Results showed that users achieved high agreement on the mapping, with an average agreement score [35] of 0.68 (SD=0.27). Without prior exposure to the gestures, users successfully retrieved 76% of the objects with VirtualGrasp. A week after learning the mapping, they could recall the gestures for 93% of the objects.
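The agreement score cited in the abstract (reference [35] of the paper) is presumably the standard guessability-study metric, in which the gesture proposals elicited for each referent are grouped by identity and the squared proportions of the group sizes are summed. A minimal sketch under that assumption, using hypothetical elicitation data:

```python
from collections import Counter

def agreement_score(proposals):
    # Group identical gesture proposals for one referent and sum the
    # squared proportions of each group (Wobbrock et al.'s A_r metric).
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Hypothetical data for one object: 7 of 10 participants propose a
# power grip, 3 propose a pinch.
print(agreement_score(["power-grip"] * 7 + ["pinch"] * 3))  # 0.58
```

A score of 1.0 means every participant proposed the same gesture for the object; scores near 1/n indicate no consensus, so the reported average of 0.68 reflects substantial agreement.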
Materials
Bibtex
@inproceedings{yukang20182,
  author    = {Yan, Yukang and Yu, Chun and Ma, Xiaojuan and Yi, Xin and Sun, Ke and Shi, Yuanchun},
  title     = {VirtualGrasp: Leveraging Experience of Interacting with Physical Objects to Facilitate Digital Object Retrieval},
  year      = {2018},
  isbn      = {9781450356206},
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  url       = {https://doi.org/10.1145/3173574.3173652},
  doi       = {10.1145/3173574.3173652},
  booktitle = {Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems},
  pages     = {1--13},
  numpages  = {13},
  keywords  = {gesture, mapping, object selection},
  location  = {Montreal QC, Canada},
  series    = {CHI '18}
}