“Who’s There?”: Designing Sensor-Aided Wearable Assistive Technology for the Visually-Impaired
Approximately 4% of the world’s population is visually impaired; 65% of them are over age 50, and an estimated 90% live in low- and middle-income countries. Beyond these numbers, a larger challenge is the reduced confidence and life satisfaction that visually impaired people experience as a result of lost independence and diminished sociability. This project aims to mitigate the problem of dependence using low-cost, intuitive, sensor-aided assistive technology. By interacting with visually impaired users, we identified three specific problems: difficulty recognizing people in their surroundings; inability to proactively greet persons entering their social space; and not knowing whether the person they are interacting with remains within hearing range as they move around. While previous research in assistive technology for the blind has largely focused on enabling smoother navigation, less attention has been paid to improving social interactions. We employ a user-centered design approach and a think-aloud protocol to gain insight into users’ cognitive processes, comfort levels, and feelings while they interact with the device and perform structured social-interaction tasks. Furthermore, we use standard psychological instruments to measure changes in sociability, independence, and technological comfort as users adopt the device. Through several iterations of the design-thinking process, we developed two distinct prototypes. The first relied on a smartphone to notify the user. Although it performed the tasks successfully, the phone’s many notifications made it cognitively overwhelming, frustrating, and exhausting for a blind user, and thus an ineffective way of augmenting their perception. Our second prototype, and current solution, is threefold: building a smart environment; designing a single-purpose wearable bracelet with sonification and vibro-tactile communication; and creating a novel audio-haptic user interface.
We evaluated this device and chose it as our current solution because it is a low-cost, low-energy, easy-to-use, intuitive device that potential users successfully employed during usability testing and feedback sessions to identify and locate nearby interactors. This project presents a foundation for designing more intuitive audio-haptic interfaces and devices, not only for the visually impaired but also for aging populations and sighted individuals. It proposes future research avenues for overcoming current limitations, exploring the assistive device’s long-term effects on users’ well-being, and enabling customization to individual users’ needs.