
ASL Tutor

Featured on the Leap Motion blog

Idea

Deaf people have a great deal to offer society, yet they are often left out of critical conversations because of communication barriers.

ASL Tutor is a Rosetta Stone for sign language; it helps people learn American Sign Language in a fun, easy way. We show users an image of a particular sign, then use skeletal tracking combined with machine learning to detect the position and orientation of their hand and determine which sign they are making.
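In code, the core loop is simple: flatten the tracked hand into a feature vector, ask a trained classifier for the most likely letter, and compare it to the prompt. Here's a minimal sketch, assuming a trained scikit-learn model saved to disk (the file name and helper name are hypothetical, not from the actual repo):

```python
# Minimal sketch of the recognition check. The model file name is
# hypothetical; the feature vector would come from the Leap Motion layer.
import joblib

clf = joblib.load("asl_classifier.pkl")  # a trained scikit-learn classifier

def check_sign(feature_vector, prompted_letter):
    """Predict a letter from the hand features and compare to the prompt."""
    predicted = clf.predict([feature_vector])[0]
    return predicted == prompted_letter
```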

Matt Tinsley and I built this in under 24 hours at TAMUHack 2015. Source code is available on GitHub.

Tech used

  1. Leap Motion -- used for skeletal tracking. The SDK gave us sixty parameters of data about the human hand per frame, which was extremely useful (see the feature-extraction sketch after this list).
  2. scikit-learn -- used for the machine learning algorithms. We evaluated roughly ten different classification algorithms before settling on a support vector classifier with parameters tuned to our data set (a tuning sketch follows the list).
  3. Python/Flask -- used for the front end. We're comfortable with web technologies, so we used them to build the application's interface.
  4. SQLAlchemy/Redis -- used to store the training data for the machine learning algorithms and to keep the scoreboard (see the Flask/Redis sketch below).
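For a sense of where those hand parameters come from, here's one plausible way to flatten a Leap Motion frame into a fixed-length feature vector with the v2 Python SDK. This is an illustration, not the exact sixty features our model used:

```python
# Sketch: flattening one Leap Motion frame into a feature vector using the
# v2 Python SDK. Illustrative only; not the exact feature set we trained on.
import Leap

def hand_features(frame):
    """Return a fixed-length feature list for the first tracked hand."""
    if frame.hands.is_empty:
        return None
    hand = frame.hands[0]
    palm = hand.palm_position
    # Hand orientation angles
    features = [hand.direction.pitch, hand.direction.yaw, hand.palm_normal.roll]
    for finger in hand.fingers:
        tip = finger.tip_position
        # Fingertip position relative to the palm (translation-invariant)
        features.extend([tip.x - palm.x, tip.y - palm.y, tip.z - palm.z])
        # Direction the finger is pointing
        features.extend([finger.direction.x, finger.direction.y, finger.direction.z])
    return features

controller = Leap.Controller()
# Once the controller has connected, poll the latest frame:
# features = hand_features(controller.frame())
```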
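Tuning the support vector classifier amounts to a grid search over its parameters. A sketch with stand-in data (the parameter grid here is illustrative, not the values we shipped):

```python
# Sketch: tuning a support vector classifier over labeled feature vectors.
# X/y are stand-in data; the parameter grid is illustrative.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 60))           # stand-in for hand-feature vectors
y = rng.choice(list("abcde"), size=200)  # stand-in for letter labels

param_grid = {
    "C": [0.1, 1, 10, 100],
    "gamma": ["scale", 0.01, 0.001],
    "kernel": ["rbf", "linear"],
}
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
clf = search.best_estimator_  # the classifier the game would load
```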
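The scoreboard maps naturally onto a Redis sorted set behind a couple of Flask routes. A sketch with hypothetical route names and keys:

```python
# Sketch: a Flask endpoint that records a correct answer in a Redis sorted
# set and returns the top scores. Route names and keys are hypothetical.
from flask import Flask, jsonify, request
import redis

app = Flask(__name__)
r = redis.Redis()

@app.route("/score", methods=["POST"])
def add_score():
    user = request.form["user"]
    r.zincrby("scoreboard", 1, user)  # one point per correct sign
    return jsonify(ok=True)

@app.route("/scoreboard")
def scoreboard():
    top = r.zrevrange("scoreboard", 0, 9, withscores=True)
    return jsonify([(name.decode(), int(score)) for name, score in top])
```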

I wrote more about the process of building it in my writeup of TAMUHack 2015.

Future improvements

Currently, ASL Tutor only recognizes the alphabet, but we'd like to extend it to recognize any sign.

The "game" aspect of it was mostly devised as a fun way to demo our sign language recognition at a hackathon. Turns out that it's legitimately fun, but I think there are other applications of this technology as well. For example, it would be cool to build an on-the-fly translator for ASL.