mySignInterpreter: Giving a Voice to the Deaf and Mute
Today in the UK, 1 in every 1,000 children is born with severe hearing loss, and approximately 50,000 citizens have British Sign Language (BSL) as their first or preferred language. Wouldn't it be amazing if a LabVIEW program could make life easier for deaf and mute people communicating with people who don't understand sign language? Using the new Analytics and Machine Learning Toolkit and a low-cost infrared sensor, it can!
Starting with a simple neural network that could distinguish between an open and a closed hand, my machine learning algorithm can now identify around 20 hand signs. My LabVIEW program displays the user's hands on screen in real time and matches the current hand sign to a letter at the press of a button. I have been able to confidently interpret all the stationary letters of the BSL alphabet - that is, every letter except "H" and "J", which involve hand movement.
Of course, a project like this is not very relevant unless it's developed with input from the intended users! With this in mind, I put an article in the local newspaper asking for volunteers to test my prototype. I have been contacted by a bunch of lovely people who were willing to help out, and based on their feedback I am now working on improvements that will make my program easier to use. The article also caught the attention of the BBC, who broadcast a news report on my project. It aired on BBC One during the lunchtime and evening news - you can see a compilation of the clips here:
Thanks to my colleague Chris Hyde for the video editing and for creating the banner above.
Being able to work on a project I am passionate about has been incredibly rewarding, and I hope that the media coverage will increase awareness of the deaf and mute members of our society.
The Hardware
LEAP Motion Sensor: It tracks the user's hands using infrared light together with an anatomical model of the human hand. The LEAP connects to a computer or tablet using a USB cable and can be bought here.
The Software
My sign language interpreter would never have been possible without the newly released Analytics and Machine Learning Toolkit. A classical algorithm could only hope to interpret a hand sign if every point of the hand were in exactly the same position every time - not a realistic expectation to have of a human user. A machine learning algorithm has given my code the flexibility to disregard irrelevant data, so a hand sign is interpreted just the same even if it is shifted or rotated with respect to the sensor. The AML Toolkit is clearly a very powerful tool that enables the development of groundbreaking LabVIEW programs. What's more, it's surprisingly quick to get started with!
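To illustrate what "disregarding irrelevant data" means here (this is only a conceptual sketch, not the toolkit's internals or my LabVIEW code), a hand pose can be normalized before classification so that shifting, scaling or rotating the hand relative to the sensor produces exactly the same feature vector. In Python-like terms, assuming the sensor delivers an array of 3-D joint positions:

```python
import numpy as np

def normalize_hand(points):
    """Make a set of 3-D hand points invariant to translation, scale and rotation.

    points: (N, 3) array of joint positions (illustrative layout, not the
    actual LEAP data format). Returns a flat feature vector that is the same
    for shifted, scaled or rotated copies of one hand pose.
    """
    pts = np.asarray(points, dtype=float)
    pts = pts - pts.mean(axis=0)            # remove translation
    scale = np.linalg.norm(pts)
    if scale > 0:
        pts = pts / scale                   # remove overall scale
    # Project onto the principal axes of the point cloud to remove rotation.
    _, _, vt = np.linalg.svd(pts, full_matrices=False)
    coords = pts @ vt.T
    # SVD leaves each axis sign-ambiguous; fix the sign from the data itself
    # so the result is deterministic.
    for i in range(coords.shape[1]):
        j = np.argmax(np.abs(coords[:, i]))
        if coords[j, i] < 0:
            coords[:, i] = -coords[:, i]
    return coords.flatten()
```

A classifier fed these normalized vectors sees the same input however the user holds their hand in front of the sensor, which is the kind of flexibility a rigid point-by-point comparison could never provide.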
How to use the code
Attached to this post is the current version of my program, which can learn and interpret stationary hand signs. Below you can see a screenshot of the code in action. The user's hands (maximum of two) are displayed in real time as a line animation. The program matches the current hand sign against known sign language letters, and the interpreted message is written to a text box at the bottom left. The Hand Count indicator shows that one hand is currently detected.
Below is an example of how data is read from the LEAP sensor. The LEAP delivers data in Frames, with each frame containing a certain number of hands. Each hand has fingers, which have individual bones, and so on. In my LabVIEW program I pick out the information I need and save it to one big cluster.
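The Frame → Hand → Finger → Bone hierarchy described above can be sketched in text form like this (a Python sketch with illustrative names - the real LEAP API and my LabVIEW cluster layout will differ in detail):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# A 3-D position as (x, y, z).
Vector = Tuple[float, float, float]

@dataclass
class Bone:
    start: Vector
    end: Vector

@dataclass
class Finger:
    name: str
    bones: List[Bone] = field(default_factory=list)

@dataclass
class Hand:
    is_left: bool
    palm_position: Vector
    fingers: List[Finger] = field(default_factory=list)

@dataclass
class Frame:
    hands: List[Hand] = field(default_factory=list)

def flatten_frame(frame: Frame) -> List[float]:
    """Collect just the coordinates we care about into one flat list -
    analogous to bundling the chosen values into one big LabVIEW cluster."""
    values: List[float] = []
    for hand in frame.hands:
        values.extend(hand.palm_position)
        for finger in hand.fingers:
            for bone in finger.bones:
                values.extend(bone.start)
                values.extend(bone.end)
    return values
```

Walking the hierarchy once per frame and flattening it into a single structure is what makes the data easy to hand to the classifier in one piece.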
How to use the sign language interpretation functionality:
How to train the algorithm:
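The training and interpretation steps themselves are done with the AML Toolkit's VIs in LabVIEW. As a language-agnostic illustration of the train-then-interpret workflow only (the class and method names below are my own, and the toolkit's actual classifiers are more sophisticated), the core idea can be sketched as: store labelled feature vectors during training, then label a new hand pose by its closest stored example.

```python
import numpy as np

class SignClassifier:
    """Toy nearest-neighbour matcher illustrating the train/interpret
    workflow. Not the AML Toolkit's algorithm - just the general idea."""

    def __init__(self):
        self.examples = []   # list of (feature_vector, letter) pairs

    def train(self, features, letter):
        """Record one labelled example of a hand sign."""
        self.examples.append((np.asarray(features, dtype=float), letter))

    def interpret(self, features):
        """Return the letter whose stored example is closest to this pose."""
        x = np.asarray(features, dtype=float)
        distances = [np.linalg.norm(x - f) for f, _ in self.examples]
        return self.examples[int(np.argmin(distances))][1]
```

In the real program the "features" would be the hand data gathered from the LEAP cluster, and training means showing the program each sign a number of times while telling it which letter is meant.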
Future Improvements
About the Developer
My name is Sara, and at the time of making this project I was spending a year in industry with National Instruments in the UK. I am studying for a Master's in Engineering Physics at Chalmers University of Technology in Gothenburg, Sweden. Feel free to get in touch if you want to know more about my project or if you have any suggestions! You can reach me through my LinkedIn.
My Master's has a focus on Machine Learning, and I'll continue improving this code as I learn more. The most recent version of my code will be available on my GitHub.