Enhancing Running Accessibility
RunSight enables people with low vision to run at night using augmented reality. The system uses a see-through head-mounted display to enhance the runner's awareness by highlighting their guide's position and potential obstacles. In our study, participants with visual impairments ran at least 1 km with RunSight, whereas none of them could take part in guided running in dark conditions without it. This research shows how AR technology can make previously inaccessible activities possible for people with visual impairments.
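To make the highlighting idea concrete, here is a minimal sketch of the kind of per-frame overlay logic such a system might use. The detection format, colors, shapes, and distance threshold below are hypothetical illustrations, not RunSight's actual implementation.

```python
# A minimal sketch of AR overlay selection for a see-through display,
# assuming a hypothetical per-frame list of detections. Names, colors,
# and the alert threshold are illustrative, not RunSight's real code.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # e.g., "guide" or "obstacle" (hypothetical labels)
    x: float           # screen-space center, pixels
    y: float
    distance_m: float  # estimated distance from the runner


def build_overlays(detections, alert_distance_m=5.0):
    """Map detections to high-contrast highlights: always mark the guide,
    and mark obstacles only when they are close enough to matter."""
    overlays = []
    for d in detections:
        if d.label == "guide":
            overlays.append({"at": (d.x, d.y), "color": "green", "shape": "ring"})
        elif d.label == "obstacle" and d.distance_m <= alert_distance_m:
            overlays.append({"at": (d.x, d.y), "color": "red", "shape": "box"})
    return overlays


if __name__ == "__main__":
    frame = [
        Detection("guide", 320, 240, 3.0),
        Detection("obstacle", 100, 300, 4.2),
        Detection("obstacle", 500, 260, 12.0),  # too far away to alert
    ]
    print(build_overlays(frame))
```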
Conversational Interaction in Indoor Environments
We introduce conversational localization, a sensorless approach that determines a user's indoor position through natural-language dialogue. The system engages users in conversation about their surroundings, extracting spatial cues, such as visible landmarks, signage, and architectural features, from their descriptions to estimate their location on existing floor maps. Our modular architecture combines entity extraction from floor plans, dynamic utterance processing, an intelligent conversational agent that strategically elicits locational information, and visibility-based spatial reasoning algorithms. Field studies across 10 indoor locations demonstrated sub-10-meter accuracy at 80% of the test sites, with the system successfully processing 800 natural-language descriptions of unfamiliar spaces. This approach eliminates the need for dedicated positioning infrastructure, enabling location-based services in complex indoor environments where traditional, infrastructure-dependent methods are impractical.
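As a rough illustration of the visibility-based reasoning step, the sketch below intersects the sets of map positions from which each mentioned landmark is visible and returns the centroid of the surviving region. The landmark coordinates, grid, and within-radius visibility model are hypothetical placeholders, not the system's actual data structures or algorithms.

```python
# A minimal sketch of visibility-based position estimation, assuming a
# hypothetical floor map where each named landmark has known coordinates
# and a simple within-radius line-of-sight model. All names and numbers
# are illustrative, not taken from the conversational localization system.
import math

# Hypothetical landmark coordinates extracted from a floor plan (meters).
LANDMARKS = {
    "elevator": (5.0, 12.0),
    "cafe sign": (20.0, 8.0),
    "stairwell": (14.0, 25.0),
}

VISIBILITY_RADIUS = 15.0  # assume a landmark is visible within 15 m


def visible_cells(landmark_xy, grid_step=1.0, width=30, height=30):
    """Return the set of grid cells from which the landmark is visible
    under the simple within-radius visibility model."""
    lx, ly = landmark_xy
    cells = set()
    for i in range(int(width / grid_step)):
        for j in range(int(height / grid_step)):
            x, y = i * grid_step, j * grid_step
            if math.hypot(x - lx, y - ly) <= VISIBILITY_RADIUS:
                cells.add((x, y))
    return cells


def estimate_position(mentioned_landmarks):
    """Intersect the visibility regions of every landmark the user
    mentions, then return the centroid of the surviving cells."""
    candidates = None
    for name in mentioned_landmarks:
        region = visible_cells(LANDMARKS[name])
        candidates = region if candidates is None else candidates & region
    if not candidates:
        return None  # contradictory description; elicit more information
    xs, ys = zip(*candidates)
    return (sum(xs) / len(xs), sum(ys) / len(ys))


if __name__ == "__main__":
    # e.g., the user reports seeing the elevator and the cafe sign
    print(estimate_position(["elevator", "cafe sign"]))
```

In the full system, the conversational agent would also choose its next question to shrink the candidate region as quickly as possible; the intersection step above only captures the core of the spatial reasoning.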
Related Publications:
- Sheshadri, S. and Hara, K. (2024) Conversational Localization: Indoor Human Localization through Intelligent Conversation
- Sheshadri, S., Cheng, L., and Hara, K. (2022) Feasibility Studies in Indoor Localization through Intelligent Conversation
Other Work
See our past and ongoing work on our publications page: https://kotarohara.com/publications