Recognition of hand gestures in 3D space using a single low-resolution camera, converting American Sign Language into any spoken language.
The `requirements.txt` file lists all Python libraries the notebooks depend on; install them with:

`pip install -r requirements.txt`
To run the web application:

`python app.py -i 0.0.0.0 -o 8080`

The app will be served at http://localhost:8080/
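The `-i` and `-o` flags in the run command above suggest a host and a port option. A minimal sketch of how `app.py` might parse them, assuming standard `argparse` handling (the flag names come from the command above; everything else here is illustrative):

```python
import argparse

def parse_args(argv=None):
    """Parse the host/port flags used in the run command above."""
    parser = argparse.ArgumentParser(description="ASL recognition web app")
    parser.add_argument("-i", "--ip", default="0.0.0.0",
                        help="interface to bind, e.g. 0.0.0.0")
    parser.add_argument("-o", "--port", type=int, default=8080,
                        help="port to listen on, e.g. 8080")
    return parser.parse_args(argv)

if __name__ == "__main__":
    args = parse_args()
    # The actual server (e.g. a Flask app) would be started here.
    print(f"Serving on http://{args.ip}:{args.port}/")
```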
There are only 250 certified sign language interpreters in India, translating for a deaf population of between 1.8 million and 7 million.
We need a solution that includes deaf and mute people in everyday conversations. Our application lets any user point the camera at a mute person (with consent, of course) and understand what they are trying to say.
American Sign Language (ASL) is a visual language. With signing, the brain processes linguistic information through the eyes. The shape, placement, and movement of the hands, as well as facial expressions and body movements, all play important parts in conveying information.
Sign Language MNIST (https://www.kaggle.com/datamunge/sign-language-mnist): each training and test case carries a label (0-25) mapping one-to-one to the letters A-Z, with no cases for 9 (J) or 25 (Z) because those signs involve motion.
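The labeling scheme above can be sketched as a direct index-to-letter mapping, with the two motion-based letters excluded (the helper name is an assumption, not part of the dataset):

```python
import string

def label_to_letter(label: int) -> str:
    """Map a Sign Language MNIST label (0-25) to its letter A-Z.

    Labels 9 (J) and 25 (Z) never occur in the dataset because those
    signs involve motion, which a single static image cannot capture.
    """
    if not 0 <= label <= 25:
        raise ValueError(f"label out of range: {label}")
    if label in (9, 25):
        raise ValueError("J and Z involve motion and are absent from the dataset")
    return string.ascii_uppercase[label]
```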
Autocompletion and word suggestion simplify and accelerate information transmission. The user can select one of the top 4 suggestions or keep making gestures until the desired word is obtained.
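The top-4 suggestion step could be sketched as a prefix lookup over a frequency-ranked vocabulary. Everything here is illustrative: the function name, the ranking rule, and the sample word frequencies are assumptions, not the application's actual implementation:

```python
def suggest(prefix: str, vocab: dict, k: int = 4) -> list:
    """Return the k most frequent vocabulary words starting with prefix."""
    matches = [w for w in vocab if w.startswith(prefix.lower())]
    # Rank by descending frequency, breaking ties alphabetically.
    matches.sort(key=lambda w: (-vocab[w], w))
    return matches[:k]

# Illustrative word frequencies (not a real corpus).
vocab = {"hello": 50, "help": 40, "held": 10, "helmet": 5, "hero": 20}
print(suggest("hel", vocab))  # ['hello', 'help', 'held', 'helmet']
```

As the user signs more letters, the prefix grows and the suggestion list narrows toward the intended word.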
- Deaf students can share a classroom with hearing students, asking their questions and doubts without hesitation
- Inclusion of this community in mainstream schools
- Tourist guides can communicate better using sign language