Improving accessibility on the UCSD campus.
A group of engineers and I partnered with the UCSD Office for Students with Disabilities (OSD) to develop a tactile map for the visually impaired.
Scope: User Research • Project Management • Ethnography • Complex Systems
Timeline: 7 weeks, then handed off to another engineering team.
The Original Request
The request was for a 3D tactile map to be created and displayed in the visitor center and the OSD office, serving a diverse group of visually impaired visitors as well as the whole UCSD community.
The Client and Point Of Contact
A disability can completely alter a student's college career, impeding them from completing day-to-day activities. This is why the Office for Students with Disabilities (OSD) is so essential to the community: it works with students to review medical documentation and determine reasonable academic and non-academic accommodations whenever they are needed. The OSD reached out to our team and asked us to begin developing a solution for the request above.
Jimmy Cong, pictured above, is a recent UCSD graduate. He is an ADA Access Specialist who evaluates campuses for physical access and reviews products, websites, and course content for electronic accessibility. He also served as our point of contact and as one of our primary users for research and user testing.
User Research Methods
To better understand the goals, needs, and in-context actions of blind and visually impaired individuals, we shadowed Jimmy as he performed assigned tasks on campus.
We had Jimmy traverse a path he was familiar with to gauge his efficiency, locational memory, and sensory abilities. We then had him traverse very different, entirely unfamiliar terrain so that we could compare the two.
01. Satellite Mapping Software Unreliable
One of the biggest concerns is that satellite mapping software, such as Google Maps, is not always reliable. Even when it is, it only brings the user to the general vicinity; it cannot help visually impaired users find the front door or entrance of their destination.
02. Hard To Find Final Destination
Once visually impaired users confirm that they are in the proximity of their destination, it can often take up to 5 minutes for them to find where they actually want to be.
03. Internal Representation Concerns
The auditory descriptions and instructions from available smartphone apps do little to help users who cannot see the app interface build a mental representation of new environments.
Modified Problem Statements
Following our user research and needfinding, we made an alteration to the original request from the OSD. This modification divided the project into two focuses: the first 10% of the user journey and the last 10%.
Visually impaired visitors and students are in need of a simple, but intuitive medium to internalize the geographical layout of the UCSD campus.
Visually impaired visitors and students lack a robust method of locating the entrance of buildings that they wish to enter.
After multiple user interviews over the phone and in-person testing with Jimmy, we began brainstorming concepts and narrowing down ideas via the traditional Human-Centered Design process.
We presented our updated problem statements and various ideas from our brainstorming session to the OSD office, Jimmy, and a couple more visually impaired users. Their feedback helped shape the direction we ultimately took, which can be seen in the image below.
Prototyping The First 10%
We made multiple iterations during a prototyping session, which ultimately led to the model below: it uses textures to distinguish trails, braille stickers to name buildings, and distinct indicators, such as a uniform star, to mark each building's main entrance.
We used SketchUp to design the map below, modeling roughly a quarter of the main UCSD campus as a proof of concept.
Prototyping The Last 10%
To inform our design for a solution to the struggles of locating building entrances, we researched the two most popular software solutions: Microsoft Soundscape and AIRA. We also looked into a USF project team attempting to solve a problem similar to ours.
Soundscape places an audio beacon on one's destination so that the user can hear it in 3D, building a richer awareness of their surroundings.
It lets users form a personalized mental map of the space and also cues them as they get close to the destination.
Free on the iOS App Store.
The disadvantage is that it does not give very detailed information during navigation; for example, it does not tell you when it is safe to cross a road, so users still need a guide dog.
AIRA is a wearable device through which a visually impaired user can call a live agent to receive assistance with navigation.
The disadvantages of this service are that users feel dependent on the agent and that it requires good cellular coverage.
With the research above and the insights from earlier user testing and feedback sessions, we designed a remote speaker to solve the last 10%.
The remote speaker would pair with a complementary app that lets users ping their destination via Wi-Fi, cellular, or Bluetooth, making the solution as robust as possible. Below are rough drafts of the complementary app.
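The multi-connection design can be thought of as a fallback chain: the app tries each available link in turn until one ping succeeds. This is a minimal sketch of that idea; the function and transport names are illustrative assumptions, not part of the actual UCSD app.

```python
# Sketch of a connectivity fallback chain for pinging a remote speaker.
# All names here are hypothetical; the real app was only a design prototype.
from typing import Callable, List, Tuple


def ping_with_fallback(
    transports: List[Tuple[str, Callable[[str], bool]]],
    speaker_id: str,
) -> str:
    """Try each transport in order; return the name of the one that worked."""
    for name, send in transports:
        try:
            if send(speaker_id):
                return name
        except OSError:
            continue  # link unavailable; fall back to the next one
    raise ConnectionError("no transport could reach speaker " + speaker_id)


# Usage with stubbed transports (Wi-Fi down, cellular up):
transports = [
    ("wifi", lambda sid: False),
    ("cellular", lambda sid: True),
    ("bluetooth", lambda sid: True),
]
print(ping_with_fallback(transports, "speaker-01"))  # cellular
```

Ordering the transports by preference (Wi-Fi first, Bluetooth last) is one reasonable policy; the point is simply that no single dead link should strand the user.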
Complementary App Prototype
Research and testing revealed how visually impaired users navigate smartphone interfaces: they almost always use some variation of a screen reader, which reads out the elements of a page as they feel around with their fingers. This let us arrange an information hierarchy that was well tuned for this specific task.
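Since a screen reader linearizes a page, the design priority is the order in which elements are announced rather than their visual layout. The sketch below illustrates that idea with hypothetical element names; it is not the actual app code.

```python
# Sketch: a screen reader flattens a page into a spoken sequence, so we
# order elements by task importance, not visual position. Names are
# illustrative assumptions.
from dataclasses import dataclass
from typing import List


@dataclass
class Element:
    label: str     # text the screen reader speaks
    priority: int  # lower = announced earlier in the swipe order


def reading_order(elements: List[Element]) -> List[str]:
    """Return labels in the order a screen reader would announce them."""
    return [e.label for e in sorted(elements, key=lambda e: e.priority)]


screen = [
    Element("Campus map image", priority=3),
    Element("Ping nearby speaker", priority=1),
    Element("Search for a building", priority=2),
]
print(reading_order(screen))
# ['Ping nearby speaker', 'Search for a building', 'Campus map image']
```

Putting the primary action first means a user reaches it on the first swipe instead of wading through decorative content.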
Feature Unlock Screen
The remote speaker pinging system would be an add-on feature to the existing UCSD app and would need to be enabled by an OSD staff member so that it could not be spammed by every UCSD student.
Remote Speaker Pinging Screen
Once enabled, the location pinging screen becomes accessible to visually impaired users. The screen prioritizes information hierarchy over visual appearance: users can quickly have nearby locations read out to them or request that a specific location be pinged. Once a location is pinged, they hear the remote speaker's assigned sound.
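The two interactions on this screen can be sketched as a pair of functions: one that lists nearby speakers for the screen reader to announce, and one that pings a chosen speaker and tells the user which sound to listen for. The location data and sound descriptions below are invented for illustration.

```python
# Sketch of the pinging screen's two behaviors. The speaker registry and
# its contents are hypothetical examples, not real campus data.
from typing import Dict, List

SPEAKERS: Dict[str, Dict] = {
    "Geisel Library": {"distance_m": 40, "sound": "two short chimes"},
    "Price Center": {"distance_m": 120, "sound": "low triple beep"},
}


def nearby(max_distance_m: int) -> List[str]:
    """Location names for the screen reader to announce, nearest first."""
    close = [
        (info["distance_m"], name)
        for name, info in SPEAKERS.items()
        if info["distance_m"] <= max_distance_m
    ]
    return [name for _, name in sorted(close)]


def ping(name: str) -> str:
    """Trigger the chosen speaker and return the sound the user should hear."""
    return SPEAKERS[name]["sound"]


print(nearby(150))             # ['Geisel Library', 'Price Center']
print(ping("Geisel Library"))  # two short chimes
```

Returning the sound description alongside the ping matters for this audience: the app can speak "listen for two short chimes" so the user knows exactly what to orient toward.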
At the end of 7 weeks, our team concluded the app design with the iterations above. Our final deliverable also included all of the work previously shown on this page.