
MyWay

As part of the first Design Jam London Hackathon, we were asked to design an app to help visually impaired people reorient themselves when making their way around a town or city. The brief called for a focus on wearable technology.


Overview

Process: Research, Sketching, Prototyping, Testing
Tools: Pen & Paper, Sketch, Keynote
Partners: Team of five
Duration: 1 day

Research

Reorient:
verb
to find one's position again in relation to one's surroundings.

Key Findings from Our Subject-Matter Expert (SME)

  • Audio plays a huge part in assisting blind people, but be careful not to overwhelm the user; a solution that requires headphones risks disconnecting them from their environment.
  • Haptic feedback is great but can easily be missed.
  • “Enhance and enable the experience, don’t be the experience.” People should still feel independent.
  • Support the journey.
  • Avoid busy UI.

Ideation

With our research in mind, we set out to rapidly generate solutions. We ran a quick design studio: each of us sketched eight ideas in ten minutes, presented back to the group, then voted on favourites. The favourites were then iterated on and mapped to a prioritisation chart to identify the most feasible solution.
The solution we felt would best meet user needs was an Apple Watch app that would speak aloud what surrounded the user, based on their location.

We really wanted to assist only when necessary, rather than build a step-by-step navigation system. For this we assumed that people using our app would be embarking on journeys they had taken before, but would occasionally need a little support along the way.

We took some time to brainstorm features and landed on three key pieces of information that would help our user reorient:

  • What’s straight ahead?
  • What is around me?
  • Where have I been? Catering for the scenario where the user feels lost and wants to retrace their steps.

We chose to make this information the focus of what we would present to the user.
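As a rough sketch of how these three queries might be modelled in the watch app, here is a minimal Swift enum; the type and case names are our own illustrative assumptions, not code from the jam.

    // The three reorientation queries, modelled as a simple enum.
    // All names here are illustrative assumptions.
    enum ReorientationQuery: CaseIterable {
        case straightAhead  // "What's straight ahead?"
        case aroundMe       // "What is around me?"
        case whereHaveIBeen // "Where have I been?" (retrace steps when lost)

        // The label the app would display and speak for each query.
        var spokenLabel: String {
            switch self {
            case .straightAhead:  return "Straight ahead"
            case .aroundMe:       return "Around me"
            case .whereHaveIBeen: return "Where have I been"
            }
        }
    }

    // e.g. ReorientationQuery.aroundMe.spokenLabel == "Around me"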

“We aim to help visually-impaired people find their way on their own by informing them where they are and where they’ve been.”

Speech as UI


We soon realised that the buttons were only signifiers for the information we would then serve. We turned our focus to how best to construct the spoken information.


Prototyping & Testing

We replicated real-world use by placing someone in an environment they knew, blindfolding them, and spinning them around. We then asked them to use the app to make their way to a predetermined location.

Paper prototype of the watch app

Iterations

  • The Straight Ahead button was interpreted as an instruction, so we renamed it In Front of Me.
  • Distance was more effective when given in steps.
  • The flow of information in Around Me was reshuffled: the landmark is spoken first so that the user can better identify the relevant information that follows.
  • Across the Road is added when relevant.
  • The final order: Landmark > (Across Road) > Orientation > Number of Steps, sketched below.
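To make that order concrete, here is a minimal Swift sketch of how an Around Me announcement might be assembled and spoken. The types, names, and sample values are our own illustrative assumptions, not code from the jam; orientation is read as a clock-face position, which Around Me used in the final design.

    import AVFoundation

    // One nearby item as announced by Around Me.
    // All names and values here are illustrative assumptions.
    struct NearbyLandmark {
        let name: String         // e.g. "Post box"
        let acrossRoad: Bool     // spoken only when relevant
        let clockDirection: Int  // orientation as a clock-face hour (1 to 12)
        let steps: Int           // distance given in steps (see Iterations)
    }

    // Assemble the phrase in the final order:
    // Landmark > (Across Road) > Orientation > Number of Steps.
    func aroundMePhrase(for landmark: NearbyLandmark) -> String {
        var parts = [landmark.name]
        if landmark.acrossRoad {
            parts.append("across the road")
        }
        parts.append("at \(landmark.clockDirection) o'clock")
        parts.append("\(landmark.steps) steps away")
        return parts.joined(separator: ", ")
    }

    let synthesizer = AVSpeechSynthesizer()
    let postBox = NearbyLandmark(name: "Post box", acrossRoad: true, clockDirection: 2, steps: 40)
    synthesizer.speak(AVSpeechUtterance(string: aroundMePhrase(for: postBox)))
    // Speaks: "Post box, across the road, at 2 o'clock, 40 steps away"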

The Solution


Future Considerations

  • Smart updates of the clock-face directions in Around Me: if the user turns whilst items are being spoken, the directions should adjust to compensate.
  • Ability to repeat information on the playing screen.
  • Adapt the app to work indoors in places such as malls, using magnetic-field-based location technology.
  • Explore adapting the app to grow the user base to include cyclists and people with Alzheimer’s.