Just reach out. Use APP-dC!

The walking stick is now the inspiration for a digital twist.

TIMELINE: September 2019 – May 2020

It all started with small talk between a visually impaired person and myself. He made me see things I had never seen before… It left a mark, an indelible mark, and I decided that I would do something for the community: I would build a walking stick. What started out as a thought became a reality, one that announces obstacles through beeps. With the hardware out of the way, I decided to add a little digital twist and come up with an interactive prototype.

A precursor…

The Hardware model

Often, the feedback to the user takes the form of auditory output, vibrations, or both. This walking stick uses auditory output – a buzzer. Since walking involves bumping into obstacles, a threshold distance for sensing them was set. A Raspberry Pi 3 and a proximity sensor detect obstacles in the person's path. The code is written in Python. Once mounted on the walking stick, the system is powered by a power bank.
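The detection logic described above can be sketched in Python. This is a minimal sketch, not the project's actual code: the ultrasonic timing formula and the 50 cm trigger distance are my assumptions, since the original sensor model and threshold aren't specified.

```python
# Sketch of the walking stick's obstacle-detection logic (assumptions:
# an ultrasonic-style proximity sensor that reports echo round-trip
# time, and a 50 cm trigger distance -- the original values aren't given).

SPEED_OF_SOUND_CM_PER_S = 34300  # speed of sound in air at ~20 °C

def pulse_to_distance_cm(echo_seconds: float) -> float:
    """Convert an echo round-trip time into a one-way distance in cm."""
    return echo_seconds * SPEED_OF_SOUND_CM_PER_S / 2

def should_buzz(distance_cm: float, threshold_cm: float = 50.0) -> bool:
    """Sound the buzzer when an obstacle is within the set distance."""
    return distance_cm <= threshold_cm

# On the Raspberry Pi itself, a loop would read the sensor over GPIO,
# feed the echo time through pulse_to_distance_cm(), and drive the
# buzzer pin whenever should_buzz() returns True.
```

On the actual stick, GPIO read/write calls (for example via the RPi.GPIO library) would wrap these two functions in a polling loop.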

I knew exactly what the walking stick does, so I started thinking about tying the upcoming digital application prototype to the working model.

As the project was focused on the basic idea of obstacle detection, developing a working model was the last stage of a 5-month project. Although the walking stick proved useful in environments with minimal obstruction, the buzzer could not be heard in noisy places.

A paper on the walking stick was accepted at the International Conference on Industrial Electronics and Electrical Engineering (ICIEEE) held in Raipur, India, in October 2016. Now, in 2020, four years later, the prototype combined with a thorough UX process is pushing this project to go digital.

ROLE: Interaction Designer


Given that blind people are so accustomed to the white cane, how can technology, in the form of a mobile application, alleviate their problems without hampering the use of the white cane, keeping existing applications in mind?


I decided to use the double-diamond method to come up with solutions for the prototypes. Prior knowledge of agile methodologies helped the prototypes develop in an iterative manner. Additionally, I began to delve deep into the users' minds and understand them, to ensure that the prototype would be worthwhile when it eventually hits the market.


Understanding the users’ mental models

Before hitting the solution button, understanding the users' mental model is of paramount importance. The user, being the biggest asset in UX, needs to be at the forefront of the design to get the most out of the product.

I knew right from the start that developing anything would require a lot of patience, a mature understanding of what the community as a whole goes through, and giving them the confidence of "Yes, I will be the one to connect you to technology". To understand the users, I went back to my home town and reached out to a blind school. They were extremely generous and allowed me to conduct interviews on the school premises. The participants? The students. Prior to that, I had gathered information on some of the existing products available in the market, so I had a fair idea of what the market space looked like.


I ran through numerous websites and reached out to people who had used such devices or applications. Listed below are the products analyzed for their strengths and weaknesses.

Left side: Applications.
Right side: Hardware devices.


I have always believed that I can make people feel comfortable talking in any given circumstance. Before the interviews were conducted, my professor and I spent days of rigorous discussion running through the list of questions to be asked, because we did not want to hurt the participants in any way during the course of the interview.

Should we focus on just navigation outside the place of stay or should we consider indoors as well?

Do we ask about the technologies they are familiar with? Should mobile applications be considered too?

Will the questions be intrusive? … We should not hurt them in any way.

I used qualitative codes to obtain high-level themes and draw effective conclusions from the interview data. The overarching theme was the difficulty of using technology in conjunction with the white cane; in a country like India, environmental noise overshadows the sound made by the phone. This is a situation they want to avoid.


Among other things, they want applications that indicate the presence of sidewalks and the positioning of intersections, offer an emergency assistance button, and provide a scanner for handwritten text. Additionally, several application-specific pointers were noted down and eventually fed into a user persona to get an idea of how the application would be shaped in the future.

Persona of Natasha


The affinity diagram helped me map ideas and thoughts. I went back over the interviews, picked one of the participants, and created an affinity diagram for that user, which helped me structure my thoughts and organize the content. I used the steps mentioned on this website to create it, then created similar affinity diagrams for the other participants and grouped all the similar ones into a common pool.

Affinity diagram for a participant.


Storyboards changed my perception of which functionalities could be implemented in my project. I realized that my application cannot have every functionality I desire because of the constraints the storyboards revealed. Having a story and a script made me realize that features could be added or removed for specific scenarios.

Not everything has to be pretty.


THE LOW-FIDELITY SKETCHES

Cards? Well, not really. More like my mind's version of the mobile app, on a more or less mobile-screen-sized lookalike.

I love to put my thoughts on paper so I decided to go ahead with the pen-paper route. Any solution that comes to mind has to hit the white sheet first.

Scaled images of the application under consideration.


This process exists in all design solutions, doesn't it? I used Balsamiq Mockups to generate wireframes of the application's skeletal structure. It is a great tool with a low learning curve and relative ease of use.

First pages.

I took inspiration from the application "Be My Eyes", which offers different UIs for sighted and blind users.

The need to obey Fitts's Law is of paramount importance here, as people who use screen readers require ease of access when scrolling and pressing buttons.


This page is the key component in the application and has 4 pivotal functions:

  • Ask for Assistance
  • Google Maps
  • Video Assistance
  • Connect to devices (Explicitly)

Function Screens
Normal assistance

For maps, I used a simple design that combines the existing voice-assistant feature with a search bar, to aid users who are proficient at typing.

Video assistance
Connecting to nearby devices (audio output mainly)

Some users of the application may be drawn to a Bluetooth headset to aid them while navigating. This, however, plays against them because it cancels out the noise of the outside environment. So this is an added functionality that can be leveraged at any time, should the user need it.


APP-dC is an application designed so that the users are at the forefront of everything. They have their own section where they can choose from the various tools at their disposal: editing information, configuring notifications, connecting to Amazon Alexa, editing preferences, adjusting camera settings, and accessing the general settings.


I had to explore tools that provide voice output as part of the working prototype. I went with Adobe XD, as it is a good tool for voice prototyping. A lot of questions ran through my mind:

  • How does voice playback work? I was unfamiliar with it.
  • Is confirmation playback supported?
  • How accurate is the voice option?
  • Will the prototype play in a web browser?

A note – please use the swiping mechanism to proceed to further screens upon hearing the voice note. However, voice triggers are not supported within embedded prototypes, which makes playing it below ineffective. Kindly use the XD mobile application for the best results. https://xd.adobe.com/view/0eddc7ae-8dc8-432b-698f-fc93b5b88bb9-78ec/


From trying to empathize with the blind community to putting all my might behind making a prototype that is accessible to them, I have come a long way. I feel that the project has made me mentally stronger. I was skeptical about the project itself at the beginning, but as time progressed, I kept going with belief – the belief instilled in me by the community.

Along the way, I faced innumerable challenges. Everyone I spoke to had an opinion and a design idea. I went with my instincts and tried to put myself in the shoes of a blind person. This helped me understand the community and develop a novel idea for the application. Connecting people on the app is similar to what happens in rideshare, and I hope that one day, even if my prototype does not hit the market, it stands as an inspiration for many designs to come.