Accessible Navigation


For people with disabilities, navigation in public—to and from school, the workplace, the grocery store, and government offices—is a hard-won legal right, yet one still replete with challenges. An airport may indicate the direction to terminals through visual-only signage, making it inaccessible to blind and low-vision travelers. Pathways that avoid stairs may be difficult to find, especially when elevators or ramps are out of service for maintenance. To address such barriers, accessibility researchers have studied navigation technologies for people with disabilities for decades, focusing primarily on narrow use cases (think: navigating to a door) within a single disability population (think: people who are blind). Yet we know little about holistic wayfinding needs, and how they manifest across disabilities. INsite’s approach to accessible navigation has therefore been to understand end-to-end navigation in conversation with people with a variety of disability identities.

Across several interview studies, design workshops, and co-design sessions, we have found that navigation route preferences are often shared across disabilities. For example, a blind white cane user may want to avoid carpeted areas to improve acoustics for navigation, while power and manual wheelchair users may want the same to avoid running down their battery or tiring out their arms. Of course, navigation needs across disabilities sometimes diverge or come into tension. Someone with a hearing disability may prefer to avoid routes with traffic noise, which masks their residual hearing, while well-trafficked streets provide utility to people with vision disabilities, who use that same sound to walk a straight line. We also found that successful technical navigation solutions must account for social comfort and safety while being affordable and inconspicuous; in other words, they need to be socially accessible.

One direction we have taken this work is to adapt voice assistant technologies to navigation problems. People with varying disability identities found navigating large, unfamiliar indoor environments particularly challenging, and shared that human assistance is hard to find. A voice assistant, optionally accessed through a wearable device that can be mounted on the wrist, a wheelchair arm, a stroller, or a backpack strap, may be useful for on-demand navigation information. As one example, the assistant could pair with a cellphone to provide real-time gate and flight departure information. Leveraging multimodal interaction, it might use earcons (short, distinctive digital sounds) and tactons (short, distinctive vibrations) to signal when the traveler is passing important landmarks, like restrooms or elevators. The resulting system design is a voice assistant named Jamie, which we demo above.
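To make the earcon-and-tacton idea concrete, here is a minimal sketch, not Jamie’s actual implementation, of how an assistant might map landmark types to multimodal cues. All names, frequencies, and vibration patterns below are illustrative assumptions: an earcon is reduced to a tone frequency in hertz, and a tacton to a vibrate/pause pattern in milliseconds.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class Cue:
    """A multimodal landmark cue: a short tone plus a vibration pattern."""
    earcon_hz: int              # pitch of the brief notification tone
    tacton_ms: Tuple[int, ...]  # alternating vibrate/pause durations, in ms

# Hypothetical cue table: each landmark class gets a distinct, learnable cue,
# so a traveler can recognize "elevator ahead" without stopping to listen
# to speech output.
LANDMARK_CUES = {
    "restroom": Cue(earcon_hz=440, tacton_ms=(100, 50, 100)),
    "elevator": Cue(earcon_hz=660, tacton_ms=(200,)),
    "stairs":   Cue(earcon_hz=880, tacton_ms=(100, 50, 100, 50, 100)),
}

def cue_for(landmark: str) -> Optional[Cue]:
    """Return the cue for a landmark type, or None if it has no mapping."""
    return LANDMARK_CUES.get(landmark)
```

In a real system the cue would be dispatched to audio and haptic hardware; the point of the table is that cues stay short and consistent, so they can be learned once and then recognized without interrupting the primary navigation task.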

Our current work in this area is exploring how pervasive technologies, like Google Maps, can be augmented to better serve people with vision disabilities. We look forward to updating you as we make progress.


