🤔 APP – DC – The concept application for individuals who are blind
TIMELINE: March 2020 – Ongoing
According to the WHO, at least 2.2 billion people globally have a vision impairment or blindness, of whom at least 1 billion have an impairment that could have been prevented or has yet to be addressed. Relative to this total, the number of individuals who are blind and use assistive technology is small. These technologies include mobile applications, smart walking sticks, voice assistants, and so on. Yet most of these applications serve a single function, and I was curious to understand whether one mobile application could encompass several of the major functionalities in one place, drawing inspiration from the products and prototypes on the market today.
As part of my Master’s research, I am looking at the different ways in which participatory design can be combined with accessibility and design for people who are blind, using data collected by investigating the navigational habits of people in low- and medium-income countries. This is done with a view to eventually connecting them to technology via the user-centered design process. Hence I decided to combine my research with my passion for accessibility and design to come up with a mobile application that can be useful in situations where users normally require increased assistance. That process, coupled with passion, is what led to the prototype as it stands today.
ROLES: Interaction Designer, Accessibility Designer
The Team: Solo Project
Given that individuals who are blind are so accustomed to the white cane, can one mobile application really solve some of the overarching issues that persist in their lives instead of them accessing one application per function?
I knew I was in for a big challenge in finding out what was important for such a project and what was not. I wanted to know what kinds of applications individuals who are blind use for different purposes. Thankfully, for an earlier project (a prototype for runners who are blind), we had done an extensive review of the various applications that exist for this community, and I only needed to analyze that material for further insights. Before that, I needed a process in place to carry me through to the finish line, and after much thought I narrowed in on the one I wanted to follow.
I decided to use the double diamond method to come up with solutions for the prototypes. Prior knowledge of agile methodologies helped me develop the prototypes iteratively. Additionally, I began to delve deeper into the domain to understand the users, generate empathy, and ensure that the prototype would be worth the effort if and when it eventually hits the market.
First things First… Research.
I was curious about the types of users who might use the application, the applications already on the market, the functionalities each product offers, and ultimately the KPIs associated with the application under construction. To get a firm grasp on these points, I decided to explore.
WELL, Who are the users? – Let us assess who the prospective users of the system are. Individuals who are blind and connected to technology are the people expected to use this system. IS IT ONLY FOR THEM? Of course not; sighted people who are willing to help people who are blind also form an integral part of the system.
Before hitting the solution button, understanding the users’ mental model is of paramount importance. The user, being the biggest asset in UX, needs to be at the forefront of the design to get the most out of the product.
I knew right from the beginning that developing anything would require a lot of patience, a mature understanding of what the community as a whole goes through, and giving them the confidence of “Yes, I will be the one to connect you to technology”. To understand the users, I reached out to individuals who are blind. Before that, I had some exploration to do, revisiting an earlier literature survey and product analysis.
Existing Product Analysis
I ran through numerous websites and reached out to people who have used such devices or applications. This information gave me insight into the pros and cons of each application, what each application provides in detail, where the gaps are, and how my prototype could address those gaps.
THE USER INTERVIEWS – semi-structured
I have always believed that I can make people feel comfortable enough to talk in any given circumstance. Before conducting the interviews, I ran through a list of questions compiled after days of rigorous discussion. I chose semi-structured interviews because they give participants more freedom while communicating, yielding richer and more reliable qualitative data. Ten people were interviewed, and notes were taken in tandem.
I do not want to use the earphones at all. I have to concentrate on the road, and wearing earphones may deny me the chance to hear something important. The application should be such that there is no pressure to use earphones.
I would like the mobile application to tell me about the position of the bus stops, the traffic in the area, and so on.
I use Google Maps a lot. I also use Lazarillo to help me within the walls of the city. I like applications where the functions are easily visible.
I used qualitative codes to obtain high-level themes and draw effective conclusions from the interview data. The overarching theme was the difficulty of using technology in conjunction with the white cane, especially in places where environmental noise drowns out the sound from the phone; this is a situation the participants want to avoid.
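The coding step above can be sketched programmatically. This is a minimal, illustrative roll-up of coded excerpts into themes; the participant labels, quotes, and code names here are hypothetical stand-ins, not the actual interview data.

```python
from collections import defaultdict

# Hypothetical coded excerpts: (participant, quote, assigned codes).
excerpts = [
    ("P1", "I do not want to use earphones at all", ["audio-occlusion"]),
    ("P2", "traffic noise drowns out the phone", ["audio-occlusion", "environment"]),
    ("P3", "tell me where the bus stops are", ["navigation"]),
    ("P4", "functions should be easily visible", ["discoverability"]),
]

def roll_up(coded_excerpts):
    """Group coded excerpts into high-level themes, most-cited first."""
    themes = defaultdict(list)
    for participant, quote, codes in coded_excerpts:
        for code in codes:
            themes[code].append((participant, quote))
    # Themes supported by the most excerpts surface at the top.
    return sorted(themes.items(), key=lambda kv: -len(kv[1]))

for code, quotes in roll_up(excerpts):
    print(code, len(quotes))
```

With this toy data, "audio-occlusion" surfaces as the dominant theme, mirroring how the earphone/noise concern dominated the real interviews.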
As this project is purely for individuals who are blind, I decided to read through the different accessibility guidelines (Apple, Android, and the WCAG). This let me gather all my thoughts on the possible prototype sketches alongside the relevant accessibility pointers.
https://developer.apple.com/design/human-interface-guidelines/accessibility/overview/user-interaction gave me direction in structuring my design, as I focused on building an iOS application prototype.
❗ Importance of progressive disclosure
I was extremely mindful to employ progressive disclosure. Given that we are dealing with individuals who are blind, we really cannot push too much information onto a single screen; the content has to be easy to identify and understand. Progressive disclosure allowed me to move information that was not immediately necessary onto secondary screens that appear only when needed, based on the functionalities presented to the user.
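The idea can be captured in a small screen model: only primary actions are exposed up front, and secondary options surface on an explicit request. This is a conceptual sketch, not the app's actual implementation; the action names are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Screen:
    """A screen that hides secondary actions until explicitly revealed."""
    primary: list
    secondary: list = field(default_factory=list)
    expanded: bool = False

    def visible_actions(self):
        # Progressive disclosure: secondary actions stay hidden
        # until the user asks for more.
        return self.primary + (self.secondary if self.expanded else [])

    def reveal_more(self):
        self.expanded = True

home = Screen(
    primary=["Ask for Assistance", "Google Maps"],
    secondary=["Notification settings", "Camera settings"],
)
print(home.visible_actions())  # only the two primary actions at first
home.reveal_more()
print(home.visible_actions())  # all four actions after disclosure
```

Keeping the initial action set short means a screen reader announces fewer items, so the user reaches the key functions faster.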
Among other things, the participants want applications that indicate the presence of sidewalks and the position of intersections, provide an emergency assistance button, and scan handwritten text. Additionally, several application-specific pointers were noted down and eventually fed into a User Persona to get an idea of how the application would be shaped in the future.
In all of this, it was evident that such users required extra attention to detail. I also created an empathy map for the persona above. This enabled me to think along the lines of the target users and aided me in the design process. However, coming up with an empathy map for this user group required a lot of time and effort because details needed to be carefully crafted into it.
The affinity diagram helped me map ideas and thoughts. I went back over the interviews, picked out one participant, and created an affinity diagram for that user. It helped me structure my thinking and organize the content.
Storyboards changed my perception of the functionalities that can be implemented in my project. I realized that my application cannot have all the functionalities that I desire because of constraints indicated by the storyboards. Having a story and a script made me realize that a number of things can be added/deleted with respect to specific scenarios. The storyboards helped me synthesize the content and match my thinking with the constraints.
Not everything has to be pretty.
📜 The point where the focus is on what can make an impact 💥 and what doesn’t…
After analyzing the data collected, drawing out themes, and settling on the do’s and don’ts after brainstorming, I was confident enough to move on to the designs. The design should be accessible, impactful, and meaningful, and needs to draw on the user’s energy so that they form habits and come back to the application time and again. This brought me to the point in the process where I framed the “How Might We” questions.
THE “HMW” QUESTION.
Having done all of the background research and synthesized the data, I stopped briefly to frame the “HMW”, or “How Might We”, questions to help direct the design phase of the process.
How might we make an application that encompasses functionalities like video calls, navigation through maps, and connecting to other devices in the most accessible way possible?
How might we integrate potential sighted users with individuals who are blind to provide one-to-one communication?
How might we ensure that the prospective users feel safe while using the application?
With the HMW questions firmly in mind, I set about making some basic sketches on index cards, with each card becoming a screen.
ACCESSIBILITY 💡- revisited
Before I started the design phase of this project, I wanted to make use of the accessibility guidelines as much as possible. I wanted larger fonts, bigger buttons, and reduced distance between buttons to improve the speed of movement. Another key factor was putting less text on the screen so that scrolling and screen reading are easier. I wanted to ensure that, through my design, the functions are visible and there is a match between the system and the real world. It may only be a prototype, but the question of how much this application could help individuals who are blind was always on my mind.
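One way to make the "bigger buttons" rule concrete is a quick lint over the layout. Apple's Human Interface Guidelines recommend a minimum touch target of roughly 44×44 points; the button names and dimensions below are illustrative, not taken from the actual prototype.

```python
# Minimum touch-target size in points, per Apple's HIG recommendation.
MIN_TARGET_PT = 44

# Hypothetical layout data for the sketch.
buttons = [
    {"name": "Ask for Assistance", "width": 320, "height": 80},
    {"name": "Settings", "width": 40, "height": 40},
]

def undersized(targets, minimum=MIN_TARGET_PT):
    """Return names of targets smaller than the minimum on either axis."""
    return [t["name"] for t in targets
            if t["width"] < minimum or t["height"] < minimum]

print(undersized(buttons))  # → ['Settings']
```

A check like this is easy to run on every screen of a design system, catching undersized targets before they reach users who depend on large, forgiving hit areas.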
Low Fidelity Sketches
I love to put my thoughts on paper so I decided to go ahead with the pen-paper route. Any solution that comes to mind has to hit the white sheet first.
I used Balsamiq Mockups to generate wireframes of the skeletal structure of the application. Balsamiq Mockups is a great tool with a gentle learning curve and relative ease of use.
I took inspiration from the application “Be My Eyes”, which offers a different UI for sighted users and users who are blind.
Obeying Fitts’s Law is of paramount importance here, as people who use screen readers require ease of access when scrolling and pressing buttons.
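Fitts's Law predicts that the time to acquire a target grows with distance and shrinks with target size. Below is the standard Shannon formulation; the constants a and b are device-specific placeholders, not measured values from this project.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation: ID = log2(D/W + 1)."""
    return math.log2(distance / width + 1)

def movement_time(distance, width, a=0.2, b=0.1):
    """Predicted acquisition time in seconds; a and b are assumed constants."""
    return a + b * index_of_difficulty(distance, width)

# Doubling a button's width lowers its index of difficulty, which is
# why the design favors large, well-spaced buttons.
print(index_of_difficulty(300, 44))  # harder: a 44 pt target 300 pt away
print(index_of_difficulty(300, 88))  # easier: the same reach, doubled width
```

In practice this is the quantitative argument behind the larger-buttons decision: for a fixed reach, every doubling of target width removes close to one bit from the index of difficulty.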
This page is the key component in the application and has 4 pivotal functions:
- Ask for Assistance
- Google Maps
- Video Assistance
- Connect to devices (Explicitly)
For the maps function, I used a simple design combining the existing voice assistant with a search bar to aid users who are comfortable typing.
Connecting to nearby devices (audio output mainly)
Some users of the application may be drawn to a Bluetooth headset to aid them while navigating. This, however, can play against them because it cancels the noise of the outside environment. Connecting to nearby devices is therefore an added functionality that can be enabled at any time, should the user need it.
APP-dC is designed so that users are at the forefront of everything, with progressive disclosure taking paramount importance. Users have their own section where they can choose the various tools at their disposal, including editing their information, configuring notifications, connecting to Amazon Alexa, editing their preferences, adjusting camera settings, and accessing the general settings.
High Fidelity Mockups
From the structural designs in the medium-fidelity mockups, I moved on to the high-fidelity screens, keeping in mind some of the guidelines specified in the WCAG. This was particularly important, as the whole project stems from accessibility and building an accessible application for people who are blind.
I had to explore tools that could give me voice output as part of the working prototype. I went with Adobe XD, as it is a good tool for voice prototyping. Still, a lot of questions ran through my mind:
- How would voice playback behave? I was unfamiliar with it.
- Would confirmation playback work as expected?
- How accurate would the voice option be?
- Would the prototype play in a web browser?
A note: please use the swiping mechanism to proceed to further screens upon hearing the voice note. However, voice triggers are not supported within embedded prototypes, so the embedded version below will not respond; kindly use the mobile XD application for best results. https://xd.adobe.com/view/003e47c3-2a70-47ce-a40c-7989f1c48a88-556d/
From trying to empathize with the blind community to putting all my might behind a prototype that is accessible to them, I have come a long way. The project has made me mentally stronger. I was skeptical at the beginning, but as time progressed, the belief the community instilled in me made me realize how gifted we are to have everything at our disposal and to use it with such ease.
Along the way, I faced innumerable challenges. Everyone I spoke to had an opinion and a design idea. I went with my instincts and tried to put myself in the shoes of a person who is blind, empathizing with them. This helped me develop a novel idea for the application: connecting people on the app much as a rideshare service connects riders and drivers. I hope that one day, even if my prototype never hits the market, it stands as an inspiration for many designs to come.