For those of you who don’t know, VTalk is a mobile app that I developed to facilitate communication between Deaf-Mute people and speaking-hearing people. This article is the story of how VTalk was developed, the challenges I faced, and the super awesome journey so far. Before that, let me just explain how the app works.
The Deaf person (we use the word Deaf for people who are Deaf-Mute, since the absence of hearing means that they are not very comfortable speaking either) holds his/her phone with VTalk open, and when the speaking-hearing person says something, the Deaf person sees it as a message on his/her screen. To respond, the Deaf person types his/her response, and when he/she hits send, the device speaks those words out loud for him/her.
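That two-way flow can be sketched as a tiny state model. This is only an illustration, not VTalk's actual code: the recognizer and synthesizer are abstracted away (in React Native they would come from libraries such as a speech-recognition binding and react-native-tts), and all the names below are invented for the example.

```typescript
// Sketch of VTalk's core loop: recognized speech becomes an incoming
// message on screen; a typed reply is both shown and spoken aloud.
type Message = { from: "hearing" | "deaf"; text: string };

class Conversation {
  messages: Message[] = [];

  // `speak` stands in for a text-to-speech call (e.g. Tts.speak).
  constructor(private speak: (text: string) => void) {}

  // The speech recognizer would call this with its transcript,
  // which appears as a message for the Deaf person to read.
  onSpeechResult(transcript: string): void {
    this.messages.push({ from: "hearing", text: transcript });
  }

  // Called when the Deaf person hits send: store the reply and voice it.
  sendReply(text: string): void {
    this.messages.push({ from: "deaf", text });
    this.speak(text);
  }
}
```

In the real app the UI would render `messages` as chat bubbles, which is why the screen ends up looking like a messaging app.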
As far as the User Interface (UI) is concerned, it looks exactly like a messaging app, but the purpose here is completely different. The two people involved in the conversation are standing face-to-face. This is how it looks:
Originally, I did not set out to create any such app. I was just playing around with the speech recognition and synthesis libraries in React Native to see what I could do with them. One day, my parents told me that they came across a Deaf person working at a cash counter in a store. I listened to them tell me how he asked for their phone number by handing them a calculator and gesturing the word “phone” with his hand.
That was when I thought I could do something for the Deaf by making a mobile app using these libraries.
I set out to develop this, and my first version did exactly what I’ve mentioned till now, plus it allowed a user to save conversations. For those of you interested in the technical side of things, I used an SQLite database on the mobile itself to store these conversations.
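For the curious, on-device SQLite storage could look roughly like this. The table layout and column names are my invention for illustration; a React Native app would typically run these statements through a binding such as react-native-sqlite-storage's `executeSql(sql, params)`.

```typescript
// Illustrative schema for saving conversations locally in SQLite.
const CREATE_MESSAGES_TABLE = `
  CREATE TABLE IF NOT EXISTS messages (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    conversation_id INTEGER NOT NULL,
    sender TEXT NOT NULL,        -- 'hearing' or 'deaf'
    body TEXT NOT NULL,
    created_at TEXT NOT NULL
  );`;

// Build a parameterized INSERT so user-typed text is never spliced
// directly into the SQL string.
function insertMessage(conversationId: number, sender: string, body: string) {
  return {
    sql: "INSERT INTO messages (conversation_id, sender, body, created_at) VALUES (?, ?, ?, ?)",
    params: [conversationId, sender, body, new Date().toISOString()],
  };
}
```

Keeping the database on the phone means saved conversations work fully offline, which matters for a face-to-face app like this.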
Before I went ahead with more development, I wanted to see how this worked in practice. I went to the store that my parents had mentioned and tried to get in touch with one of the Deaf people there. However, my first attempt was not very successful, since none of them were on duty at the time. Another place I visited in an attempt to test my app was a Deaf school near my house, where I requested the teachers to let me interact with the students.
The teachers were very helpful, and were happy to facilitate my trials. We decided on a time, and a teacher there called some students for my demo. I had asked him to selectively take students who were accustomed to using smartphones, and he did the same. However, we later realized that not all students there were pursuing their studies in English, and that was the only language my app supported!
Luckily, I was able to get a student who knew English and was also tech savvy. After the teacher introduced the concept of the app to him using sign language, I was able to have an amazing conversation with him: we talked about topics ranging from his school and studies to cricket.
My takeaway from that test was that VTalk needed to support more languages. Especially if I wanted people in India to try it out, I needed to support Hindi and as many regional languages as possible. And that’s one of the features that I developed immediately after that. VTalk now supports every language for which the device has a voice installed.
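Supporting "every language the device has a voice for" mostly comes down to matching the user's chosen language against the installed voice list (a TTS binding like react-native-tts can enumerate voices). Here is a hedged sketch; the `Voice` shape and function name are assumptions, not VTalk's actual code:

```typescript
// A device voice as a TTS library might report it (shape is illustrative).
type Voice = { id: string; language: string }; // e.g. { id: "hi-in-x", language: "hi-IN" }

// Pick a voice for a BCP 47 language tag: prefer an exact match
// ("hi-IN"), otherwise fall back to any voice with the same base
// language ("hi-*"). Returns undefined if the device has no match.
function voiceFor(voices: Voice[], languageTag: string): Voice | undefined {
  const lang = languageTag.toLowerCase();
  const base = lang.split("-")[0] + "-";
  return (
    voices.find((v) => v.language.toLowerCase() === lang) ??
    voices.find((v) => v.language.toLowerCase().startsWith(base))
  );
}
```

With that in place, adding a language costs the app nothing: if the user installs a voice for it, it just works.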
Another thing I realized I needed was a visual indicator for when the app was listening to speech from the speaking-hearing person. Although there was a beep indicating that it had started listening, there had to be a visual cue for the Deaf person holding the phone.
After this, I was able to get a meeting with members of Ability Foundation, Chennai, an organization that works with Deaf people. They were impressed with the speed and accuracy of the app, and how smoothly it enabled a dialogue. The meeting itself was in July 2019, and they told me to attend an event called EmployAbility in December of the same year for some trial runs.
Super excited about these trials, I added several features to the app to make it more suited to an interview setting. Users could now enter a code given to them by Ability Foundation, which took them to a special chat screen in the app. These interview chats were now saved on a server, and Ability Foundation could use an admin login to view interview conversations. This would let them find out how the app performed, and also give both them and me a record of the interviews conducted through the app.
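An interview upload might be packaged something like this. To be clear, everything here is an assumption for illustration: the article only says that coded interview chats were saved on a server and reviewable via an admin login, so the field names and JSON shape below are invented.

```typescript
// Hypothetical shape of one interview message.
type InterviewMessage = {
  from: "interviewer" | "candidate";
  text: string;
  at: string; // ISO timestamp
};

// Bundle a finished interview chat for upload, keyed by the code
// Ability Foundation handed to the user. The server side (and the
// admin view that reads this) is out of scope for this sketch.
function buildInterviewUpload(code: string, messages: InterviewMessage[]) {
  return {
    interviewCode: code,
    messageCount: messages.length,
    messages,
  };
}
```

Keying uploads by the handed-out code is what lets the special chat screen stay anonymous for the user while still being findable in the admin view.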
With all of this ready, I was demonstrating the app to a friend, and after he had taken a look, he immediately pointed out, “The phone’s mic is at the bottom, which means that it is pointing towards the person holding the phone! If the person standing in front is the speaker, why not point the mic towards him?”
This was an interesting point, and I set it right by giving the app a reverse-portrait orientation, so that the mic on the bottom edge ends up at the top, pointing toward the speaker. This means that whenever you’re using the app, you hold your phone upside down.
With an improved app, I ran several trials at the employment fair in Chennai in December 2019. I was able to have some good conversations through the app and gain valuable feedback. A highlight of my visit was that one of the people I interacted with was so impressed that he handed me his phone and requested that I install the app for him! My app wasn’t store-ready at the time, but I gave him a pre-release, and he was delighted.
Immediately after my trial runs in Chennai, I had a presentation to give in Nagpur. I had gotten in touch with some ENT surgeons, since they frequently deal with patients with hearing impairments and difficulty speaking. One of them told me that I should let all ENT surgeons know that such an app existed, so that they might be able to recommend it to their patients. He was happy to give me a presentation slot at AOICON, the national conference of ENT surgeons.
I also wanted to put the app on the Google Play Store before my presentation, so that I could tell the doctors that their patients could download it from there. So I spent my winter break preparing the presentation as well as making the app store-ready.
One key change to make was the user interface, the looks of the app. Luckily, my sister, who works as a UX designer in Michigan, was in Nagpur that December. With her help, I refined the looks of the app and got it ready for the store in time for my presentation.
My presentation went fantastically too! Although I’ve always had stage fright, I was able to deliver it well, and I was the only non-doctor to get the honor of speaking at the conference!
One of the speakers who presented right before me later said to my parents, “The kid stole the spotlight!” That was one of the most delightful things I’d ever heard! Another running joke from the conference was that every time the anchor called out my name, he gave me the title of doctor! 😛
Not just that, but even my certificate from the conference has “Dr Shreyas Nisal” written on it!
At the conference, one of the ENT surgeons told me about surgeries that affect people’s voice boxes and result in a loss of voice for the patients! He told me it would be pretty cool if I could have a patient record his voice before surgery, and then have VTalk synthesize speech in that very voice. That’s one of the toughest tasks on my list, but I really hope I’m able to do it someday!
Right now, there are several features that I plan on adding. Some of them involve using natural language processing to make the app intelligent. Other than making improvements to the app, I also want to learn sign language, so that I can communicate better with Deaf people, explain the app to them, and encourage them to use it!
It’s been a beautiful journey so far, and it could not have been possible without so many people helping me out. So a big, big thank you to everyone who played a part in getting this app to where it is now. And I hope I am able to improve this further, and develop an app that truly makes a huge difference in the lives of Deaf people around the globe!
You can also take a look at the app on the Google Play Store if you have an Android device. Do drop a like if you had fun reading about my journey, and share the article with your friends and the app with people who you think might benefit from it!