

mySignInterpreter: Giving a Voice to Sign Language Users


Today in the UK, 1 in every 1,000 children is born with severe hearing loss, and approximately 50,000 people have British Sign Language (BSL) as their first or preferred language. Wouldn't it be amazing if a LabVIEW program could make it easier for deaf and mute people to communicate with those who don't understand sign language? Using the new Analytics and Machine Learning Toolkit and a low-cost infrared sensor, it can!

 

[Image: FinalBanner.jpg]

 

Starting from a simple neural network that could distinguish between an open and a closed hand, my machine learning algorithm can now identify around 20 hand signs. My LabVIEW program displays the user's hands in real time on the screen and matches a hand sign to a letter at the press of a button. I can now confidently interpret the stationary letters in the BSL alphabet - all letters except "H" and "J", for which the hands are moving.

 

Of course, a project like this is not very relevant unless it's developed with input from the intended users! With this in mind, I put an article in the local newspaper asking for volunteers to test my prototype, and I have been contacted by a number of lovely people willing to help out. Based on their feedback I am now working on improvements that will make my program easier to use. The article also caught the attention of the BBC, who broadcast a news report on my project. It was aired on BBC One during the lunchtime and evening news - you can see a compilation of the videos here:

 

Thanks to my colleague Chris Hyde for the video editing and for creating the banner above. 

 

Being able to work on a project I am passionate about has been incredibly rewarding, and I hope that the media coverage will increase awareness of the deaf and mute members of our society.

The Hardware

LEAP Motion Sensor: The sensor tracks hands using infrared light together with an anatomical model of the human hand. It connects to a computer or tablet over a USB cable and can be bought here.

The Software

My sign language interpreter would never have been possible without the newly released Analytics and Machine Learning Toolkit. A classical algorithm could only hope to interpret a hand sign if every point of the hand were in exactly the same position every time - not a realistic expectation to have of a human user. A machine learning algorithm gives my code the flexibility to disregard irrelevant data, so a hand sign is interpreted just the same even if it is shifted or rotated with respect to the sensor. The AML Toolkit is clearly a very powerful tool that enables the development of groundbreaking LabVIEW programs. What's more, it's surprisingly quick to get started with!
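To give a feel for what "disregarding irrelevant data" means, here is a minimal Python sketch - not part of my LabVIEW code, and using made-up joint positions - of how hand features could be normalised so that moving the hand around above the sensor does not change them (rotation could be handled in a similar spirit, for example by aligning the features with the palm direction):

```python
# Illustration only (not the AML Toolkit): normalise hand joint positions so the
# same sign gives similar features wherever the hand sits above the sensor.
import numpy as np

def normalise_hand(points: np.ndarray) -> np.ndarray:
    """points: (N, 3) array of joint positions in sensor coordinates."""
    centred = points - points.mean(axis=0)           # remove the hand's position offset
    scale = np.linalg.norm(centred, axis=1).max()    # normalise for hand size
    return centred / (scale + 1e-9)

# Two copies of the "same" hand, one shifted 100 mm sideways, give identical features.
hand = np.random.rand(21, 3) * 50
shifted = hand + np.array([100.0, 0.0, 0.0])
print(np.allclose(normalise_hand(hand), normalise_hand(shifted)))   # prints True
```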

How to use the code

Attached to this post is the current version of my program, which can learn and interpret stationary hand signs. Below is a screenshot of the code in action. The user's hands (a maximum of two) are displayed in real time as a line animation. The program matches the current hand sign against known sign language letters, and the interpreted message is written to a text indicator at the bottom left. The Hand Count indicator shows that one hand is currently detected.

 

[Image: MainFP.PNG]

 

Below is an example of how data is read from the LEAP sensor. The LEAP delivers data in Frames, with each Frame containing a certain number of Hands. Each Hand has Fingers, which have individual Bones, and so on. In my LabVIEW program I pick out the information I need and save it to one big cluster.
[Image: ReadHand.PNG]
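For readers who don't use LabVIEW, here is a rough Python sketch of the same idea: the Frame > Hand > Finger > Bone hierarchy is flattened into one record holding only the values the classifier needs (the equivalent of my big cluster). All field names below are my own, not the LEAP SDK's.

```python
# Hypothetical Python analogue of the LabVIEW cluster; field names are my own,
# not the LEAP SDK's, and only the values the classifier needs are kept.
from dataclasses import dataclass
from typing import List, Tuple

Vector3 = Tuple[float, float, float]

@dataclass
class BoneData:
    start: Vector3              # joint position at the proximal end
    end: Vector3                # joint position at the distal end

@dataclass
class FingerData:
    bones: List[BoneData]       # four bones per finger

@dataclass
class HandData:
    palm_position: Vector3
    palm_normal: Vector3
    fingers: List[FingerData]   # five fingers per hand

@dataclass
class FrameData:
    hand_count: int
    hands: List[HandData]       # at most two hands are displayed
```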

How to use the sign language interpretation functionality:

  1. Install the LEAP motion drivers linked above.
  2. Download and extract the mySignInterpreter.zip folder attached to this post.
  3. The program loads with a machine learning model that can distinguish between "A", "B", "C" and "D" in American Sign Language. Note that this is for your ease of use - ASL letters are one-handed signs, which leaves one hand free for clicking! The same neural network can also be configured for two-handed signs.
  4. Make sure the LEAP sensor is plugged in and detected.
  5. Double click the .lvproj file, then open and run the Main.vi.
  6. Your hand will appear as a line animation on the screen (hold your hand at least 15 cm above the sensor).
  7. Display your right hand to the sensor, palm down, in one of these signs: [Image: ABCD.PNG]
  8. Click Interpret when ready, and Interpreted Text will show the match with the closest available hand sign (a rough sketch of this matching step follows below).
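Inside LabVIEW the matching is done by the AML Toolkit's neural network, but the idea of matching the current hand to the closest available sign can be sketched in a few lines of Python with made-up feature vectors:

```python
# Illustration only: pick the stored sign whose feature vector is nearest to the
# current hand. The real program uses the AML Toolkit's neural network instead.
from typing import Dict
import numpy as np

def closest_sign(features: np.ndarray, templates: Dict[str, np.ndarray]) -> str:
    """Return the letter whose template is closest in Euclidean distance."""
    return min(templates, key=lambda letter: np.linalg.norm(features - templates[letter]))

# Made-up data: one 60-value feature vector per letter plus one "measurement".
templates = {letter: np.random.rand(60) for letter in "ABCD"}
current = templates["C"] + 0.01 * np.random.rand(60)
print(closest_sign(current, templates))   # prints "C"
```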

How to train the algorithm:

  1. Follow steps 1-5 as described above.
  2. Open the Train Machine Learning Model tab.
  3. Click Write Header to automatically generate a header for the training data file.
  4. Choose a hand sign to train using the Hand Sign Enum.
  5. Display your hand above the sensor, and check Hand Count to make sure it is detected correctly.
  6. When ready, click Start Logging. The program will wait for a few seconds, then automatically record several snapshots of the hand sign. When Points Left to Log reaches 0, you can move your hand again.
  7. Repeat steps 4-6 for as many hand signs as you like. The program requires you to train at least 2 different hand signs.
  8. When done, click Train Model. A new machine learning model will be created; save it with the file extension .json (a Python sketch of this step follows after this list).
  9. The parameters from your logged training data will be displayed in the Feature Trend graph.
  10. Navigate to the Test Program tab and choose the new .json file. 
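The training itself happens inside the AML Toolkit, so the following is only a loose Python analogy of what Train Model does with the logged data - scikit-learn stands in for the toolkit, and random numbers stand in for real logged snapshots:

```python
# Loose analogy of the Train Model step: scikit-learn stands in for the AML
# Toolkit, and random numbers stand in for the real logged snapshots.
import joblib
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.random((200, 60))                 # one flattened hand snapshot per row
y = rng.choice(list("ABCD"), size=200)    # the hand sign label logged for each row

model = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X, y)

joblib.dump(model, "sign_model.joblib")   # the AML Toolkit saves a .json model instead
print(model.predict(X[:1]))               # interpret a single snapshot
```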

Future Improvements

  • Moving hand signs: My current challenge is to enable interpretation of gestures. To interpret "H", "J" and most words, my program will need to continuously track the hands and match their movements against known gestures.
  • Once the program can identify gestures, I will let it run the interpretation algorithm continuously, without the need for the "Interpret" button.

About the Developer

My name is Sara, and at the time of making this project I was spending a year in industry with National Instruments in the UK. I am studying for a Master's in Engineering Physics at Chalmers University of Technology in Gothenburg, Sweden. Feel free to get in touch if you want to know more about my project or if you have any suggestions! You can reach me through my LinkedIn.

 

My Master's has a focus on machine learning, and I'll continue improving this code as I learn more. The most recent version of my code will be available on my GitHub.