📲 Just reach out. Use APP-dC!

Application Screens

TIMELINE: March 2020 – Ongoing

ROLES: Interaction Designer, Accessibility Designer

According to the WHO, at least 2.2 billion people globally have a vision impairment or blindness, of whom at least 1 billion have an impairment that could have been prevented or has yet to be addressed. Relatively few of these individuals use assistive technology, such as mobile applications, smart walking sticks, or voice assistants. Moreover, most of these applications serve a single function, and I was curious to understand whether one mobile application could encompass several of the major functionalities, drawing inspiration from the products and prototypes on the market today.


Given that individuals who are blind are so accustomed to the white cane, can one mobile application really solve some of the overarching issues that persist in their lives, instead of requiring one application per function?


I decided to use the double diamond method to arrive at solutions for the prototypes, and my prior knowledge of agile methodologies helped me develop the prototypes iteratively. Additionally, I delved deeper into the domain to understand the users, build empathy, and ensure the prototype would be worthwhile if and when it eventually hits the market.



First things First… Research.

I was curious about the types of users who might use the application, the applications already on the market, the functionalities each product offers, and ultimately, the KPIs associated with the application under construction. To get a firm hold on these points, I decided to explore.


WELL, who are the users? – Let us assess who the prospective users of the system are. Individuals who are blind and connected to technology are the primary perceived users. IS IT ONLY FOR THEM? Of course not; sighted people who are willing to help individuals who are blind also form an integral part of the system.

Before hitting the solution button, understanding the users’ mental model is of paramount importance. The user, being the biggest asset in UX, needs to be at the forefront of the design to get the most out of the product.

I knew right from the beginning that developing anything would require a lot of patience, a mature understanding of what the community goes through, and the confidence to say, “Yes, I will be the one to connect you to technology.” To understand the users, I went back to my hometown and reached out to a school for the blind. They were extremely generous and allowed me to conduct multiple interviews on the school premises. The participants? The students.


I ran through numerous websites and reached out to people who have used such devices or applications. This gave me insight into the pros and cons of each application, what each provides in detail, what gaps exist, and how my prototype could address them.

THE USER INTERVIEWS – semi-structured

I have always believed that I can make people feel comfortable talking in any circumstance. After days of rigorous discussion, I finalized the list of questions before the interviews were conducted. I chose semi-structured interviews because they give participants more freedom while communicating, which yields more reliable qualitative data.

“I do not want to use the earphones at all. I have to concentrate on the road and wearing earphones may deny me the chance to listen to something important. The application should be such that there is no pressure to use the earphone.”

“I would like the mobile application to tell me about the position of the bus stops, traffic in the area and so on.”

“I use Google Maps a lot. I also use Lazarillo to help me within the walls of the city. I like applications where the functions are easily visible.”

I used qualitative codes to obtain high-level themes and draw effective conclusions from the interview data. The overarching theme was the difficulty of using technology in conjunction with the white cane, particularly in places where environmental noise overshadows the sound made by the phone. This is a situation they want to avoid.
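The coding step above can be sketched in a few lines: tag each quote with one or more codes, then count how often each code recurs across participants to surface candidate themes. The quotes and code names below are illustrative stand-ins, not the actual interview data from this study.

```python
from collections import Counter

# Illustrative only: stand-in quotes mapped to qualitative codes.
coded_quotes = {
    "I have to concentrate on the road, so no earphones": ["audio-conflict", "safety"],
    "Tell me where the bus stops and the traffic are": ["navigation", "environment-info"],
    "I like applications where the functions are easily visible": ["discoverability"],
    "Street noise drowns out my phone": ["audio-conflict", "environment-noise"],
}

# Count how often each code appears, then treat codes shared by
# more than one quote as candidate high-level themes.
code_counts = Counter(code for codes in coded_quotes.values() for code in codes)
themes = [code for code, n in code_counts.most_common() if n > 1]
print(themes)  # ['audio-conflict']
```

In the real analysis the grouping was done by hand on the affinity wall; this sketch only shows the counting logic that makes a recurring code stand out as a theme.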


As this project is purely for individuals who are blind, I decided to read through the various accessibility guidelines (Apple, Android, and the WCAG). This allowed me to consolidate my thoughts on the sketches of the possible prototype and add pointers related to accessibility.

https://developer.apple.com/design/human-interface-guidelines/accessibility/overview/user-interaction gave me a direction to structure my design.
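One of the few accessibility rules in those guidelines that can be checked mechanically is the WCAG minimum contrast ratio (4.5:1 for normal text at level AA). A minimal sketch of that check, using the relative-luminance formula from WCAG 2.x:

```python
def _linearize(c8: int) -> float:
    # sRGB channel (0-255) to linear light, per the WCAG 2.x definition.
    c = c8 / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb) -> float:
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg) -> float:
    # Lighter luminance goes on top; 0.05 offsets model ambient flare.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white background hits the maximum ratio of 21:1,
# comfortably above the 4.5:1 minimum for WCAG AA normal text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Running a check like this over the palette of each screen is a cheap way to catch low-contrast text before it ever reaches a user with low vision.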



Among other things, they want applications that indicate the presence of sidewalks and the positions of intersections, provide an emergency assistance button, and scan handwritten text. Additionally, several application-specific pointers were noted down and eventually fed into a user persona to get an idea of how the application would be shaped in the future.

Persona of Natasha

The affinity diagram helped me map ideas and thoughts. I went over the interviews, picked out one of the participants, and created an affinity diagram for that user, which helped me structure my thoughts and organize the content. I used the steps mentioned on this website to create them. I then created similar affinity diagrams for the other participants and grouped the similar ones into a common pool.

Affinity diagram for a participant.

Storyboards changed my perception of which functionalities could be implemented in my project. I realized that my application cannot have every functionality I desire because of the constraints the storyboards revealed. Having a story and a script made me realize that a number of things could be added or removed for specific scenarios.

Not everything has to be pretty.

Storyboard 1
Storyboard 2

📜 The point where the focus is on what can make an impact 💥 and what doesn’t…

After analyzing the collected data, drawing out themes, and brainstorming the do’s and don’ts, I was confident enough to build on with the designs. The design should be accessible and impactful, and it needs to draw on the user’s energy so that they form habits and come back to the application again and again. This led me to the point in the process where I asked the all-important “HMW” question.


Having done all of the background research and synthesized the data, I stopped to briefly think about framing the “HMW”, or “How Might We”, question. I had a series of questions, which were converted into HMW questions to help direct the design phase, and I chose the one that fit the narrative best.

How might we make an application that encompasses functionalities like video calling, map navigation, and connecting to other devices in the most accessible way possible?


With the HMW question firmly on my mind, I set about making some basic sketches on index cards, with each card becoming a screen.


I love to put my thoughts on paper so I decided to go ahead with the pen-paper route. Any solution that comes to mind has to hit the white sheet first.

Scaled images of the application under consideration.

I used Balsamiq to generate wireframes of the application’s skeletal structure. Balsamiq Mockups is a great tool with a gentle learning curve and relative ease of use.

First pages.

I took inspiration from the application “Be My Eyes”, which offers different UIs for sighted users and users who are blind.

Obeying Fitts’s Law is of paramount importance here, as people who use screen readers require ease of access when scrolling and pressing buttons.
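Fitts’s Law quantifies this: in the Shannon formulation, the time to reach a target is T = a + b · log2(D/W + 1), where D is the distance to the target and W its width. A small sketch makes the design consequence concrete; the constants a and b are illustrative placeholders, not measured values.

```python
import math

def movement_time_ms(distance: float, width: float,
                     a: float = 50.0, b: float = 150.0) -> float:
    """Fitts's Law, Shannon formulation: T = a + b * log2(D/W + 1).
    a (intercept) and b (slope) are device- and user-specific constants;
    the defaults here are illustrative, not measured."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# Doubling a button's width lowers the index of difficulty,
# so the predicted time to acquire it drops.
small = movement_time_ms(distance=400, width=40)  # ID = log2(11)
large = movement_time_ms(distance=400, width=80)  # ID = log2(6)
print(small > large)  # True
```

This is why the key screens use a few large, full-width targets rather than many small ones: each halving of target size adds to the index of difficulty, which matters even more when the press is guided by a screen reader rather than by sight.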


This page is the key component of the application and has four pivotal functions:

  • Ask for Assistance
  • Google Maps
  • Video Assistance
  • Connect to devices (Explicitly)

Function Screens
normal assistance

For the maps, I used a simple design combining the existing voice-assistant features with a search bar to aid users who are comfortable typing.

video assistance
Connecting to nearby devices (audio output mainly)

Some users may be drawn to using a Bluetooth headset to aid them while navigating. This, however, plays against them because it cancels the noise of the outside environment. Connecting to nearby devices is therefore an optional added functionality that can be leveraged at any time, should the user need it.


APP-dC is an application designed with its users at the forefront of everything. They have their own section where they can choose among the various tools at their disposal: editing their information, configuring how they want their notifications, connecting to Amazon Alexa, editing their preferences, adjusting their camera settings, and accessing the general settings.



I had to explore tools that provide voice output as part of a working prototype. I went ahead with Adobe XD, as it is a good tool for voice prototyping. Still, a lot of questions ran through my mind:

  • Will users be unfamiliar with voice playback?
  • How will playback be confirmed?
  • How accurate is the voice option?
  • Will the prototype play in a web browser?

A note – please use the swiping mechanism to proceed to the next screen upon hearing the voice note. Voice triggers are not supported within embedded prototypes, so the embed below will not play them; kindly use the XD mobile application for the best results. https://xd.adobe.com/view/0eddc7ae-8dc8-432b-698f-fc93b5b88bb9-78ec/


From trying to empathize with the blind community to putting all my might behind making a prototype that is accessible to them, I have come a long way. I feel that the project has made me a stronger person mentally. I was skeptical at the beginning of the project, but as time progressed, the belief instilled in me by the community made me realize how we are gifted to have everything at our disposal and use it with so much ease.

Along the way, I faced innumerable challenges. Everyone I spoke to had an opinion and a design idea. I went with my instincts and tried to put myself in the shoes of a blind person, which helped me understand the problem and develop a novel idea for the application. Connecting people in the app is similar to what happens in rideshare, and I hope that even if my prototype never hits the market, it stands as an inspiration for many designs to come.