AI-powered Sign Language Application - Signify


Chun Ting Justin Lo

09/10/2023

Supervised by Fernando Alva Manchego; Moderated by Steven Schockaert

Proposed App Name: Signify

Application Aims:
- The sign language application allows users to easily create and manage their own custom sign language translation models using AI. Most existing sign language applications, which are mainly intended for translation and educational purposes, only support widely recognized sign languages such as ASL, leaving many highly localized and specialized sign language terms unsupported, for example terms used within a local community or in the electrical engineering industry.
- The app enables users to record, store, and train their own sign language models from video recordings, which they can also share with others. With these user-trained models, the app supports both translation and instruction of their sign language, empowering users to communicate more effectively within minority sign language communities while reducing communication barriers between members of these communities and those outside of them.

Major Features:
1. Create personalized sign language translation models within the app.
2. Train models by recording 10-20 videos for each sign, together with its corresponding meaning text.
3. Use real-time webcam motion detection to recognize signs from user-defined or shared models (see the capture sketch after this list).
4. Construct sentences from translated sign language terms.
5. Add new sign language terms to a model with 10-20 recordings and tags.
6. Remove trained sign language terms from a model.
7. Contribute recordings to the collaborative training of other users' sign models.
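As a minimal sketch of what the real-time capture behind feature 3 could look like, the snippet below records a fixed-length sequence of body and hand keypoints from the webcam, in the spirit of the referenced LSTM tutorial. It assumes MediaPipe Holistic and OpenCV; the 30-frame window, the keypoint layout, and the function names are illustrative assumptions rather than committed design decisions.

```python
# Sketch only: webcam keypoint capture for real-time sign recognition,
# assuming MediaPipe Holistic + OpenCV. Window length and layout are assumptions.
import cv2
import mediapipe as mp
import numpy as np

mp_holistic = mp.solutions.holistic


def extract_keypoints(results):
    """Flatten pose and hand landmarks of one frame into a single feature vector."""
    pose = (np.array([[lm.x, lm.y, lm.z] for lm in results.pose_landmarks.landmark]).flatten()
            if results.pose_landmarks else np.zeros(33 * 3))
    left = (np.array([[lm.x, lm.y, lm.z] for lm in results.left_hand_landmarks.landmark]).flatten()
            if results.left_hand_landmarks else np.zeros(21 * 3))
    right = (np.array([[lm.x, lm.y, lm.z] for lm in results.right_hand_landmarks.landmark]).flatten()
             if results.right_hand_landmarks else np.zeros(21 * 3))
    return np.concatenate([pose, left, right])  # 225 values per frame


def capture_sequence(num_frames=30):
    """Record one fixed-length keypoint sequence from the default webcam."""
    cap = cv2.VideoCapture(0)
    frames = []
    with mp_holistic.Holistic(min_detection_confidence=0.5,
                              min_tracking_confidence=0.5) as holistic:
        while len(frames) < num_frames:
            ok, frame = cap.read()
            if not ok:
                break
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            frames.append(extract_keypoints(results))
    cap.release()
    return np.array(frames)  # shape: (num_frames, 225)
```

The same capture routine could serve both training (features 2 and 5, where each of the 10-20 recordings becomes one keypoint sequence) and live translation (feature 3), keeping one consistent input format for the model.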

Machine Learning Techniques:
- Convolutional Neural Networks (CNN):
  - Extract hand gestures and human skeletons (keypoints) from video frames.
- Recurrent Neural Networks (RNN):
  - Long Short-Term Memory (LSTM) network to handle the sequential nature of the extracted hand gesture data (see the sketch after this list).
- Transfer/Incremental Learning:
  - Append a new sign language term with its videos and meaning tag.
  - Transfer learning for a sign that is related to an existing sign.
  - Incremental learning for a sign that is not directly related to an existing sign (fine-tuning?).
- Natural Language Processing:
  - Sentence formation from translated sign language terms (n-gram language modeling?).
  - Language modeling to predict the next word in a sequence.
  - Sentiment detection to suggest term categorization.
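To make the LSTM and transfer-learning ideas above concrete, here is a rough Keras sketch, assuming fixed-length keypoint sequences (30 frames of 225 values, matching the capture sketch above) as input. Layer sizes, the sequence length, and the head-swapping strategy for appending new signs are assumptions for illustration, not the proposal's final architecture.

```python
# Sketch only: sequence classifier over keypoint sequences plus a simple
# transfer-learning step for appending new signs. Shapes and sizes are assumed.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_FRAMES, NUM_FEATURES = 30, 225  # assumed keypoint-sequence shape


def build_sign_classifier(num_signs):
    """Stacked LSTMs over keypoint sequences with a softmax over the sign vocabulary."""
    model = models.Sequential([
        layers.Input(shape=(NUM_FRAMES, NUM_FEATURES)),
        layers.LSTM(64, return_sequences=True),
        layers.LSTM(128),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_signs, activation="softmax", name="sign_head"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["categorical_accuracy"])
    return model


def extend_with_new_signs(base_model, total_signs):
    """Transfer-learning sketch: freeze the trained sequence layers and fit a
    new output head when the user appends signs to their model (feature 5)."""
    for layer in base_model.layers[:-1]:
        layer.trainable = False  # keep the learned motion features
    features = base_model.layers[-2].output
    new_head = layers.Dense(total_signs, activation="softmax",
                            name="sign_head_v2")(features)
    new_model = models.Model(inputs=base_model.input, outputs=new_head)
    new_model.compile(optimizer="adam",
                      loss="categorical_crossentropy",
                      metrics=["categorical_accuracy"])
    return new_model
```

For incremental learning on unrelated signs, the frozen layers could later be unfrozen and fine-tuned at a low learning rate on the combined data; the n-gram sentence-formation step would then operate on the predicted sign labels rather than on the keypoint sequences.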

Application Technologies: ReactTS, NestJS, Django, MySQL, Redis, Nginx
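The proposal does not yet specify how responsibilities are split across this stack. Purely as an illustration, a Django view such as the following could accept the 10-20 training clips for a new sign (features 2 and 5), with a Redis-backed job picking them up for training afterwards; the route, field names, and storage layout are all hypothetical.

```python
# Hypothetical sketch: Django endpoint receiving the training clips for one sign.
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.core.files.storage import default_storage


@csrf_exempt
def upload_sign_recordings(request, model_id):
    """POST /models/<model_id>/signs/ with a meaning tag and 10-20 video files."""
    if request.method != "POST":
        return JsonResponse({"error": "POST required"}, status=405)

    meaning = request.POST.get("meaning", "").strip()
    videos = request.FILES.getlist("videos")
    if not meaning or not (10 <= len(videos) <= 20):
        return JsonResponse(
            {"error": "a meaning tag and 10-20 video recordings are required"},
            status=400,
        )

    saved_paths = []
    for idx, video in enumerate(videos):
        # Store clips under the owning model so a training job can find them later.
        path = f"models/{model_id}/{meaning}/{idx}_{video.name}"
        saved_paths.append(default_storage.save(path, video))

    # A background worker (e.g. queued via Redis) would trigger incremental
    # training on the saved clips here.
    return JsonResponse({"meaning": meaning, "saved": saved_paths}, status=201)
```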

User group examples:
- IT technical sign language recording
- Different deaf schools
- Local communities

Existing Sign Language Apps:
- SLAIT (ASL): https://www.youtube.com/watch?v=PcZbajccCdU
- Hand Talk (ASL, BSL): https://www.handtalk.me/en/app/
- Google AI (Hand Talk): https://www.youtube.com/watch?v=N0Vm0LXmcU4

References:
- Isolated nature of sign language: https://www.british-sign.co.uk/what-is-british-sign-language/
- Sign Language Detection using Action Recognition with Python (LSTM deep learning): https://www.youtube.com/watch?v=doDUihpj6ro
