Audio Sword

Screenless navigation for the blind

THE PROBLEM

Can we simplify smartphone navigation for visually impaired users?

According to the World Health Organization (WHO), 285 million people are blind or visually impaired (BVI). The smartphone era has empowered people with information, but mobile applications haven't reached their potential in the BVI community due to complex navigation patterns. Though voice commands reduce the complexity of navigation to an extent, they raise social and privacy concerns. Audio Sword explores a new mobile navigation experience for the BVI community using hand gestures and aural cues.

JUNE, 2015 - PRESENT

My Role

I lead UX Research and Experience Design for Audio Sword. Along with Dr. Davide Bolchini, I have done extensive user research to study the systems that blind and visually impaired (BVI) users currently rely on and the problems they face. After scoping the problem, I designed a new navigation system for BVI users using hand gestures and aural cues.

THE JOURNEY

What are the current mobile navigation problems for BVI users?

I used user interviews, a literature search, and extensive desk research to answer the following questions:

  • What do BVI users use the smartphone for?
  • What tools do they use to augment navigation?
  • What are their current problems?
  • What are some of the user scenarios?

Interviewing Imran from BOSMA Enterprises. Blind from birth, Imran taught himself computer skills.

Research Insights

It is difficult for BVI users to hold a phone in one hand while navigating themselves to another physical location.
How might we decouple the mechanical interaction from the device so that BVI users can benefit from information on the go?

The mobile app navigation patterns are too complex for BVI users.
How might we simplify mobile app navigation so that BVI users can use their apps more efficiently?

BVI users cannot use voice input or output frequently due to social and privacy concerns.
How might we create a quiet, hands-free form of input for BVI users so that their interactions with the phone are more discreet?

THE SOLUTION

Introducing Audio Sword

Audio Sword combines hand gestures and aural cues to help users navigate mobile applications without taking the phone out of their pocket. Users can now search for a restaurant or call an Uber using hand gestures.

Video explaining the screenless access concept for BVI users.

Making the prototype

The MYO armband uses electromyography to detect hand gestures. The detected gestures are transmitted to the smartphone over Bluetooth Low Energy and mapped to actions on the navigation tree. Users hear feedback on their actions through bone conduction earphones.

INTERACTION DESIGN

Hand gesture vocabulary and earcons

The user can use 5 hand gestures to navigate apps on the phone. These 5 gestures are easy to learn and perform. Each gesture is associated with an earcon to give instant feedback to the user.

Wave right

Moves the cursor to the next item in the menu.

Earcon

Wave Left

Moves the cursor to the previous item in the menu.

Earcon

Double Tap

Selects an item in the menu.

Earcon

Finger Spread

Acts as the back button of the phone.

Earcon

Fist

Provides contextual help to the user at any point in the navigation experience.

Earcon
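The 5 gestures above can be sketched as a simple dispatch table that maps each gesture to one navigation action and one earcon. This is an illustrative sketch only; the MenuCursor class, method names, and earcon filenames are assumptions, not the actual Audio Sword implementation.

```python
# Hypothetical sketch: mapping the five gestures to navigation actions.
# Class, method, and earcon names are illustrative assumptions.

class MenuCursor:
    """Cursor over a flat menu, with a history stack for the back gesture."""
    def __init__(self, items):
        self.items = items
        self.index = 0
        self.history = []

    def next_item(self):
        self.index = (self.index + 1) % len(self.items)

    def previous_item(self):
        self.index = (self.index - 1) % len(self.items)

    def select(self):
        self.history.append(self.index)

    def back(self):
        if self.history:
            self.index = self.history.pop()

    def help(self):
        pass  # announce current context (see "Contextual help")

# Each gesture maps to one cursor action and one earcon, so every
# gesture gives the user instant, consistent aural feedback.
GESTURES = {
    "wave_right":    (MenuCursor.next_item,     "earcon_next.wav"),
    "wave_left":     (MenuCursor.previous_item, "earcon_previous.wav"),
    "double_tap":    (MenuCursor.select,        "earcon_select.wav"),
    "finger_spread": (MenuCursor.back,          "earcon_back.wav"),
    "fist":          (MenuCursor.help,          "earcon_help.wav"),
}

def handle_gesture(cursor, gesture):
    """Apply a recognized gesture and return the earcon to play."""
    action, earcon = GESTURES[gesture]
    action(cursor)
    return earcon
```

Keeping the vocabulary to one flat table is what makes the 5 gestures easy to learn: every gesture does the same thing everywhere in the navigation tree.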

How do we improve the user experience further?

Audio Sword introduces the following 3 new concepts to help BVI users navigate mobile apps more efficiently.

Binary aural navigation

The primary menu of applications is divided into 2 nodes to improve the efficiency of inner menu navigation. For example, the primary menu of the Yelp app is divided into 2 parts and placed on either side of the cursor.
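The split described above can be sketched in a few lines. The helper name and the Yelp-like menu items are illustrative assumptions, not the app's actual menu.

```python
# Illustrative sketch of binary aural navigation: the primary menu is
# divided into two halves placed on either side of the cursor.

def split_menu(items):
    """Divide a primary menu into 2 nodes around the cursor position."""
    mid = len(items) // 2
    return items[:mid], items[mid:]

left, right = split_menu(
    ["Search", "Nearby", "Bookmarks", "Check-ins", "Activity", "About Me"]
)
# left  -> ["Search", "Nearby", "Bookmarks"]
# right -> ["Check-ins", "Activity", "About Me"]
```

Halving the menu halves the worst-case number of wave gestures needed to reach any primary-menu item.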

Aural free flow

When the user selects a menu, the items in the list are read out automatically, separated by earcons. At any point, the user can intervene and perform finer navigation through hand gestures. This reduces gesture fatigue, as fewer gestures are needed overall.
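Aural free flow can be sketched as a read-out loop that stops as soon as the user gestures. The function name, the `<earcon>` separator marker, and the interruption callback are assumptions for illustration.

```python
# Sketch of aural free flow: read items in order, separated by an earcon,
# until the user intervenes with a gesture. Names are illustrative.

def aural_free_flow(items, user_gestured):
    """Return the sequence spoken and the index where reading stopped.

    user_gestured(i) reports whether the user performed a gesture
    before item i was read.
    """
    spoken = []
    for i, item in enumerate(items):
        if user_gestured(i):
            return spoken, i  # hand fine-grained control back to the user
        spoken.append(item)
        spoken.append("<earcon>")  # separator earcon between items
    return spoken, len(items)
```

In the default case the user performs zero gestures and simply listens; gestures are spent only when the user wants to jump in.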

Contextual help

At any given point in time, the user can perform the "fist" gesture to get contextual help. This answers 2 questions: "Where are you?" and "What can you do from here?" Since the system is aural rather than visual, helping users regain context is essential.
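A minimal sketch of the contextual-help response could compose an answer to both questions from the user's position in the navigation tree. The function name and spoken phrasing are assumptions, not the actual system output.

```python
# Sketch of the "fist" gesture's contextual help: announce where the user
# is in the navigation tree and what they can do from the current node.

def contextual_help(path, options):
    """Build the aural help message for the current navigation node."""
    where = "You are in " + " > ".join(path) + "."
    what = "From here you can select: " + ", ".join(options) + "."
    return where + " " + what
```

For example, a user parked on the Search node of a Yelp-like app would hear something like "You are in Yelp > Search. From here you can select: Restaurants, Bars."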

Twitter user flow

This is a sample user flow for a BVI user navigating Twitter using all the components of Audio Sword.


Next Steps

I demonstrated the concept at BOSMA Enterprises. They are helping us recruit participants for the user study we are running, which aims to evaluate the system's conceptual fit with BVI users.

The screenless access concept is efficient for browsing content. We are also working on a model to help BVI users input text using hand gestures.