NYC MTA - Navigate


OVERVIEW

Navigate is a mobile wayfinding app that helps blind and visually impaired subway riders orient themselves within a NYC subway station. It combines Bluetooth Low Energy (BLE) beacon and Wi-Fi technologies. The BLE beacons communicate with a rider's device and deliver audio messages about their surroundings, such as which platform they are on, where the nearest exits are, and general directions from point A to point B inside the station. The Wi-Fi connection provides up-to-the-minute train arrival times and service change advisories. The app is designed to work in conjunction with Apple's built-in VoiceOver accessibility features. Navigate began as a personal passion project of mine and went on to be a winner of the MTA's App Quest Hackathon.
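To give a sense of the core idea, here is a minimal sketch (illustrative only) of how a detected beacon could map to a spoken wayfinding message. The identifiers and message text are placeholders, not actual station data.

```swift
import Foundation

// Hypothetical mapping from a beacon's identity to a wayfinding message.
// Major/minor values and message text are illustrative placeholders.
struct BeaconKey: Hashable {
    let major: UInt16   // e.g. a station identifier
    let minor: UInt16   // e.g. a platform, mezzanine, or exit zone
}

let wayfindingMessages: [BeaconKey: String] = [
    BeaconKey(major: 101, minor: 1):
        "You are on the downtown A platform. The nearest exit is fifty feet behind you.",
    BeaconKey(major: 101, minor: 2):
        "You are on the mezzanine. The transfer to the F line is ahead on your left."
]

// Look up the message for whichever beacon the device is currently closest to.
func message(for key: BeaconKey) -> String? {
    wayfindingMessages[key]
}
```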

PROJECT INCEPTION

This project began one day when I was on the subway and noticed a blind man who seemed lost. I approached him and asked if he needed help. He replied that he wasn't sure which platform we were on. I told him we were on the A line. He needed the F, so I offered to guide him there. As we walked, I asked him how he typically approached navigating subway stations. Did he use braille? He laughed and said that the braille signs were not terribly helpful: "Nobody wants to feel around on a dirty wall searching for braille." He typically relied on strangers or his own internal sense of direction to navigate NYC's complex subway system. This struck me as wrong. Why hadn't the MTA found a better way for people with visual impairments to find their way through the system? I decided to do a little research.

DISCOVERY PHASE

I looked online for existing systems that addressed this problem and discovered that a talk on new wayfinding technologies for people with disabilities was being held at a nearby college. I attended the following week. A number of presenters showcased various wayfinding systems at the event; however, these systems required costly sensor technology and/or wearable devices that were cumbersome to use and cosmetically unattractive. A few smartphone apps were presented, but they only worked in outdoor settings. While at the event I met a tech-savvy man, Mike, who had lost his sight to glaucoma. We shared a desire to find a more accessible solution for indoor navigation (particularly within the subway) and agreed that developing a smartphone app using sensor technology seemed like the most logical approach. We decided to team up and design an app together.

To better understand how Mike typically navigates the subway system and where the pain points were, I accompanied him on a subway ride and filmed him for later reference. Among the pain points we uncovered were knowing which side of the platform served local versus express trains, and how to get from one train line to another within a station. Mike also expressed frustration with service change notifications that might impact his commute, since these were typically posted on paper that a blind individual couldn't read, or announced over poor-quality public address systems that were hard to hear.

In the next phase, I enlisted a few design and development friends to join our team. We met with Mike in a subway station to play out a few of the user scenarios and examine the pain points we had identified during the discovery phase. Based on this exercise we began sketching out user journeys. We also decided that Bluetooth Low Energy beacons were likely the best solution for our system, considering that most smartphones already had built-in BLE support.
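That built-in support is what made BLE appealing: on iOS, beacon detection comes with the CoreLocation framework rather than extra hardware. The sketch below shows basic iBeacon ranging; the UUID and setup details are placeholders, not our production configuration.

```swift
import CoreLocation

// Minimal sketch of iBeacon ranging with CoreLocation.
// The UUID below is a placeholder, not a real deployment value.
final class StationBeaconScanner: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    override init() {
        super.init()
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    // Called roughly once per second with the beacons currently in range,
    // ordered from closest to farthest.
    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        guard let nearest = beacons.first else { return }
        // The nearest beacon tells us which part of the station the rider is in.
        print("Nearest beacon: major \(nearest.major), minor \(nearest.minor)")
    }
}
```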

DESIGN PHASE

Wireframing and Prototyping

Next we began working with Mike to understand how he interacted with apps on his smartphone and which ones he felt were the most user-friendly for visually impaired users. He demonstrated how he used apps in conjunction with his device's VoiceOver gesture-based screen reader. Based on these learnings, we started sketching out user flows, being mindful of how we could best optimize the experience for VoiceOver accessibility.
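For reference, the snippet below sketches the kind of VoiceOver support we designed around: descriptive labels and hints on controls, plus proactive spoken announcements when the rider's location changes. The labels and message text are illustrative.

```swift
import UIKit

// Give a control a clear spoken identity for the VoiceOver screen reader.
// Label and hint text here are illustrative examples.
func configureAccessibility(for arrivalsButton: UIButton) {
    arrivalsButton.isAccessibilityElement = true
    arrivalsButton.accessibilityLabel = "Next train arrivals"
    arrivalsButton.accessibilityHint = "Reads the upcoming arrival times for this platform"
}

// Proactively speak a location update through VoiceOver,
// e.g. when the rider enters a new beacon zone.
func announce(_ text: String) {
    UIAccessibility.post(notification: .announcement, argument: text)
}
```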

Mike wanted the app to be easy to use, so that he didn't have to fumble with his phone too much while navigating a busy subway station. With this in mind, we streamlined the app to keep the focus on one task at a time, so the user could retrieve the information they needed in just a few taps or with a voice command. To address Mike's issue with service change notifications, we decided to incorporate the MTA's real-time train arrival and service change data feeds into the app. We went through a few different design approaches and layouts before settling on one that Mike approved of. Next we refined the UI for this design and worked with our developer to create a rough but functional prototype for user testing.
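As a rough sketch of how arrival data could reach the app: the endpoint and JSON shape below are hypothetical placeholders, and the MTA's production feeds are GTFS-Realtime protobuf, so a real build would need a decoding layer or a small proxy service in front of them.

```swift
import Foundation

// Illustrative arrival record; field names are assumptions, not the MTA feed schema.
struct Arrival: Codable {
    let route: String        // e.g. "A"
    let direction: String    // e.g. "Downtown"
    let minutesAway: Int
}

// Fetch upcoming arrivals for a station from a hypothetical JSON endpoint.
func fetchArrivals(stationID: String) async throws -> [Arrival] {
    // Placeholder URL for illustration only.
    let url = URL(string: "https://example.com/arrivals?station=\(stationID)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode([Arrival].self, from: data)
}
```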

[Image: Navigate wireframes]

User Testing

Once our prototype was complete, we invited Mike and a few of his visually impaired friends to test and validate our designs. From this test we learned which aspects of our design were working and uncovered gaps where improvements were needed. One technical challenge that surfaced was that the Bluetooth beacons' signal needed to be stronger in certain spots, and that, depending on where beacons were placed in the station, readings occasionally suffered interference from train signals and other equipment in the environment. It became clear that we needed more resources and technical assistance, so we decided to approach the MTA with our project for guidance.
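One common mitigation for this kind of noise, sketched below purely as an illustration rather than something we shipped, is to smooth each beacon's signal strength over a short window and discard very weak readings before deciding which zone the rider is in. The window size and threshold are illustrative values, not tuned numbers from the project.

```swift
import CoreLocation

// Smooth RSSI readings per beacon and pick the strongest stable one.
final class BeaconSmoother {
    private var history: [String: [Int]] = [:]   // keyed by "major-minor"
    private let windowSize = 5                    // illustrative window length
    private let minimumRSSI = -90                 // drop very weak, unreliable readings

    func nearestStableBeacon(from beacons: [CLBeacon]) -> CLBeacon? {
        var best: (beacon: CLBeacon, average: Double)?
        // An RSSI of 0 means the value could not be determined, so skip it.
        for beacon in beacons where beacon.rssi != 0 && beacon.rssi >= minimumRSSI {
            let key = "\(beacon.major)-\(beacon.minor)"
            var samples = history[key, default: []]
            samples.append(beacon.rssi)
            if samples.count > windowSize { samples.removeFirst() }
            history[key] = samples

            let average = Double(samples.reduce(0, +)) / Double(samples.count)
            // Higher (less negative) average RSSI means a stronger, closer beacon.
            if best == nil || average > best!.average {
                best = (beacon, average)
            }
        }
        return best?.beacon
    }
}
```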

Joining the MTA & AT&T Hackathon

We met with a few MTA executives who encouraged us to join the upcoming MTA and AT&T App Quest Hackathon. They felt that gaining greater access to their network of stations, along with additional technical resources and guidance, could help accelerate our progress. At the hackathon we teamed up with a company that had developed a Bluetooth beacon and indoor positioning platform the MTA was already using for train location data. Their beacons were far more robust than the BLE beacons we had been using, and we were able to integrate our system with their SDK with little effort. Once we got the system up and running, we did another round of user tests and went through a few design iterations based on the feedback. We then presented our final designs at the hackathon and were named one of the winning teams.
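One way such a swap stays straightforward is to keep the beacon layer behind a small abstraction so the rest of the app never depends on any particular vendor. The sketch below is my own illustration; the protocol and type names are not the partner's actual SDK.

```swift
import Foundation

// Illustrative zone type; the identifier scheme is an assumption.
struct StationZone {
    let identifier: String   // e.g. "downtown-A-platform"
}

// The app depends only on this small protocol; each beacon vendor
// (our original BLE beacons or the hackathon partner's platform)
// sits behind its own adapter conforming to it.
protocol BeaconProvider {
    func startUpdating(onZoneChange: @escaping (StationZone) -> Void)
    func stopUpdating()
}

// Wayfinding logic and VoiceOver announcements only see BeaconProvider,
// so replacing one adapter with another requires no UI changes.
final class WayfindingEngine {
    private let provider: BeaconProvider

    init(provider: BeaconProvider) {
        self.provider = provider
    }

    func start(announce: @escaping (String) -> Void) {
        provider.startUpdating { zone in
            announce("Entered zone \(zone.identifier)")
        }
    }
}
```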

[Image: Winning the MTA App Quest Hackathon]