Authors
Yuki Abe, Keisuke Matsushima, Kotaro Hara, Daisuke Sakamoto, Tetsuo Ono
Publication
TBD
Abstract
Dark environments make it challenging for low-vision (LV) individuals to engage in running by following a sighted guide (Caller-style guided running): insufficient illumination prevents them from using their residual vision to follow the guide and to stay aware of their environment. We design, develop, and evaluate RunSight, an augmented reality (AR)-based assistive tool that supports LV individuals running at night. RunSight combines a see-through HMD and image processing to enhance the runner's visual awareness of the surrounding environment (e.g., potential hazards) and to visualize the guide's position with AR-based visualization. To demonstrate RunSight's efficacy, we conducted a user study with 8 LV runners. The results showed that all participants could run at least 1 km (mean = 3.44 km) using RunSight, while none could engage in Caller-style guided running without it. Our participants could run safely because they effectively synthesized RunSight-provided cues and information gained from runner-guide communication.
Paper
TBD
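
The image-processing pipeline named in the abstract can be pictured with a minimal sketch. The following is not the authors' implementation; it only illustrates, under assumed details (OpenCV, a fixed gamma value, and a hypothetical pre-detected guide position), the two steps the abstract describes: brightening a low-light camera frame and overlaying an AR-style marker at the guide's position.

# Minimal sketch, not the authors' implementation. OpenCV, the gamma
# value, and the pre-detected guide coordinates are all assumptions.
import cv2
import numpy as np

def enhance_low_light(frame, gamma=2.2):
    # Gamma correction: lifts dark pixels so residual vision has more to work with.
    table = ((np.arange(256) / 255.0) ** (1.0 / gamma) * 255).astype(np.uint8)
    return cv2.LUT(frame, table)

def overlay_guide_marker(frame, guide_xy):
    # Draw a high-contrast ring where the guide was (hypothetically) detected.
    cv2.circle(frame, guide_xy, radius=30, color=(0, 255, 0), thickness=3)
    return frame

if __name__ == "__main__":
    dark = np.full((480, 640, 3), 20, dtype=np.uint8)  # stand-in for a night camera frame
    out = overlay_guide_marker(enhance_low_light(dark), (320, 200))
    cv2.imwrite("runsight_sketch.png", out)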