Understanding the Feasibility of Auditory Hand-Steering Guidance for Blind and Low-Vision People

Authors

Yuki Abe, Rose Xin Lin, Kotaro Hara, Daisuke Sakamoto

Publication

CHI '26: Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems

Article No.: 467, Pages 1-16

https://doi.org/10.1145/3772318.3790782

Published: 13 April 2026

Abstract

Everyday tasks like hand-washing and tea-making require people to steer their hands to use tools, navigating their hands to reach targets while avoiding hazards. Hand-steering becomes challenging when one cannot visually recognize whether their hand is approaching a target and staying away from hazards. Currently, no practical technological solutions support blind and low-vision (BLV) individuals’ hand-steering. We designed and developed two auditory hand-steering guidance methods: VERBAL and Follow-Your-Finger (FYF). VERBAL uses spoken directional instructions, while FYF uses sonification to guide hand-steering. We conducted a user study with 12 BLV participants to evaluate the feasibility of the methods in supporting hand-steering. VERBAL lacked precision, with a 24.6% error rate for one of the easiest conditions, but FYF showed promise, achieving a 4.17% error rate for the same condition. Among the six participants who preferred FYF, the error rate was 1.39%. The results demonstrate the feasibility of auditory hand-steering guidance for BLV individuals.
