Indian Sign Language Interpreter
- yogeshpawar1828
- Feb 22, 2020
- 1 min read
As part of my PGP in Data Science, Business Analytics and Big Data from Aegis School of Data Science, I submitted a capstone project: a tool that helps a deaf and mute person communicate with a hearing person.
This project is about recognizing Indian Sign Language (ISL) gestures from a video camera and converting them to text and speech. The tool enables a hearing person to interact effectively with a deaf and mute person: gesture classification sits at the core of the system, providing a natural way for the two parties to interact with a video camera as the only added hardware.
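As a rough illustration of how such a video-to-text-to-speech loop might be wired together, here is a minimal sketch. The model path, label list, clip length, and frame size are all hypothetical placeholders (none of these names come from the project itself); it assumes a trained clip classifier saved in Keras format and uses OpenCV for capture and pyttsx3 for speech output.

```python
import cv2
import numpy as np
import pyttsx3
from tensorflow.keras.models import load_model

# Hypothetical artifacts: a trained gesture classifier and its label list.
MODEL_PATH = "isl_gesture_model.h5"
GESTURE_LABELS = ["hello", "thank_you", "yes", "no"]  # placeholder subset of the 23 classes
SEQ_LEN = 16           # frames per gesture clip (assumed)
FRAME_SIZE = (64, 64)  # model input resolution (assumed)

def frames_to_clip(frames):
    """Stack preprocessed frames into a (1, SEQ_LEN, H, W, 3) batch."""
    clip = np.stack(frames).astype("float32") / 255.0
    return clip[np.newaxis, ...]

def main():
    model = load_model(MODEL_PATH)
    tts = pyttsx3.init()
    cap = cv2.VideoCapture(0)  # default webcam
    buffer = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        buffer.append(cv2.resize(frame, FRAME_SIZE))
        # Once a full clip is buffered, classify it and speak the label.
        if len(buffer) == SEQ_LEN:
            probs = model.predict(frames_to_clip(buffer), verbose=0)[0]
            label = GESTURE_LABELS[int(np.argmax(probs))]
            print("Recognized:", label)
            tts.say(label)  # text-to-speech output
            tts.runAndWait()
            buffer.clear()
        cv2.imshow("ISL Interpreter", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```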
Currently, someone who is deaf can only access basic text chat to communicate online. This interface is limiting, and further, in the group-call scenario, someone who can only communicate using sign language may feel left out of the group conversation. Providing first-class support for sign language in video calls would empower every individual to have a more natural and meaningful video chat experience.
The ideal solution is an application that translates any natural sign language to text in real time over video.
Companies like MotionSavvy have done impressive work in the space, but in general, solutions to this problem are in their infancy.
The ISL gesture dataset was received from IIT Allahabad and is composed of sequences of RGB frames for 23 isolated ISL gestures.
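To make the data format concrete, here is a sketch of how such frame sequences could be loaded for training. The directory layout shown is an assumption, not the dataset's actual structure: one folder per gesture class, with one subfolder of RGB frames per recorded clip.

```python
import os
from glob import glob

import cv2
import numpy as np

# Assumed layout (not specified in the post):
# data/
#   hello/clip_001/frame_000.png, frame_001.png, ...
#   thank_you/clip_001/frame_000.png, ...

def load_dataset(root="data", seq_len=16, frame_size=(64, 64)):
    clips, labels = [], []
    classes = sorted(os.listdir(root))  # should yield the 23 gesture names
    for idx, name in enumerate(classes):
        for clip_dir in sorted(glob(os.path.join(root, name, "*"))):
            frames = sorted(glob(os.path.join(clip_dir, "*.png")))[:seq_len]
            if len(frames) < seq_len:
                continue  # skip clips shorter than the window
            clip = [cv2.resize(cv2.imread(f), frame_size) for f in frames]
            clips.append(np.stack(clip).astype("float32") / 255.0)
            labels.append(idx)
    return np.stack(clips), np.array(labels), classes

X, y, class_names = load_dataset()
print(X.shape)  # (num_clips, seq_len, height, width, 3)
```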
The attached video demonstrates four gestures that I enacted; the model successfully interpreted each of them.