
Machine Learning: Translating sign language in real time.


Oliver Storey-Young

07/10/2021

Supervised by Michael Daley; Moderated by Richard Booth

As access to technology continues to grow, demand for video calls has risen sharply. Offering a more personal connection than a phone call, video calls let family, friends, and employers communicate face to face over the internet, anywhere in the world. The pandemic further increased demand for these and similar services, as people were unable to visit one another in person.

Statistics collected in 2011 suggest that around 151,000 UK residents used British Sign Language (BSL)\cite{BDA}. This project aims to create a software product that demonstrates how scenarios such as the one above can be made accessible to BSL users. The product will give users an accurate and fast translation of their signs, which could then be forwarded on within a live conversation.
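As a rough illustration of the kind of real-time pipeline this implies, the sketch below reads webcam frames, passes each one to a sign classifier, and emits the predicted label so it could be relayed into a conversation. This is a minimal sketch only: the classifier interface (`model.predict`), the label list, and the use of OpenCV for frame capture are all assumptions for illustration, not the project's actual implementation.

```python
# Minimal sketch of a real-time sign-classification loop (illustrative only).
# Assumes a pretrained classifier exposing predict(frame) -> int (hypothetical)
# and a labels list mapping class indices to sign glosses.
import cv2


def run_live_translation(model, labels, camera_index=0):
    """Capture webcam frames, classify each one, and print the predicted sign."""
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # Down-scale the frame before classification to keep latency low.
            small = cv2.resize(frame, (224, 224))
            prediction = model.predict(small)   # hypothetical classifier interface
            print(labels[prediction])           # text that could be forwarded into the call
            cv2.imshow("camera", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to stop
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```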

