Written on May 7, 2024

A fourth-year computer science student specializing in data science at the Vellore Institute of Technology went viral on LinkedIn after using AI to translate American Sign Language (ASL).

Priyanjali Gupta got the idea from her mom, who pushed her to put her engineering degree to good use.

She built the model with the TensorFlow Object Detection API, using transfer learning from a pre-trained model created by Nicholas Renotte to translate a few ASL signs into English.

Gupta credits Renotte’s YouTube tutorial for encouraging her to use the model on ASL.

“To build a deep learning model solely for sign detection is a really hard problem but not impossible,” Gupta wrote in a LinkedIn comment.
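At inference time, an object-detection model of this kind outputs a class ID and a confidence score for each video frame, which then gets mapped to an English word. The sketch below illustrates that final translation step only; the label map, threshold, and stub detector are illustrative assumptions, not Gupta's actual code, and a real pipeline would run a trained TensorFlow detector in place of the stub.

```python
# Illustrative label map: a few signs such a model might recognize
# (assumed for this sketch; not Gupta's actual class list).
LABEL_MAP = {1: "Hello", 2: "Thanks", 3: "I Love You", 4: "Yes", 5: "No"}

def detect_sign_stub(frame):
    """Stand-in for a trained detector: returns (class_id, confidence).

    A real implementation would feed the frame through a TensorFlow
    object-detection model and return its top prediction.
    """
    return 2, 0.91  # pretend the detector saw the "Thanks" sign

def translate(frame, threshold=0.8):
    """Map a detected sign to an English word, or None if uncertain."""
    class_id, score = detect_sign_stub(frame)
    if score < threshold:
        return None  # confidence too low to emit a word
    return LABEL_MAP.get(class_id)

print(translate(frame=None))  # → Thanks
```

Because the detector emits one label per frame, this approach handles isolated signs; stringing detections into grammatical sentences is a separate, harder problem, as the article notes below.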

According to the World Federation of the Deaf, more than 300 sign languages are spoken by more than 70 million deaf people worldwide. In the US, ASL is the third most commonly used language after English and Spanish.

Researchers in Ireland estimate that 70% of sign language comes from facial expressions, not to mention body movements. Capturing this aspect of sign language would require computer vision. Translation software also needs to handle complete sentences, not just individual signs.

While AI researchers tackle these challenges, Deaf people continue to face discrimination with regard to sign language use.

As Gupta shares, “I think the first step would be to normalize sign languages and other modes of communication with the specially-abled and work on bridging the communication gap.”

