
Sign2Word App

Updated: Mar 3, 2020

The seeds of our idea for an AI app for Deaf children were sown way back at our very first Young Coders MeetUp (YCM) in January 2019. Imran joined this event, our first deaf young coder, and he has been an inspiration to us all. Our meetups are youth-led and focus on diversity, inclusion and community. After a period of Agile development we decided we wanted to explore AI and Machine Learning to celebrate the first MeetUp.

In the spirit of collaboration, Imran and Femi, a founding member, buddied up for a workshop on Machine Learning for Kids using IBM Watson. As the afternoon progressed, Imran told Femi about the difficulties that deaf people have with their literacy skills, and as they worked through the session they concluded that machine learning could perhaps help other deaf students with this problem. So, making the most of the limited features, they built a quick prototype using fingerspelling signs: capturing still images through their laptops and using them to supervise the training of their model. However, they found this quite hard, because some signs require movement; they are not just static poses.
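For the technically curious, here is a rough sketch of the kind of supervised still-image classifier that first prototype amounted to. Our actual prototype used Machine Learning for Kids with IBM Watson behind the scenes; this stand-in uses scikit-learn instead, and the fingerspelling/ folder layout (one subfolder of photos per letter) is hypothetical.

```python
# Minimal sketch of a supervised still-image classifier for fingerspelling.
# Hypothetical layout: fingerspelling/A/*.png, fingerspelling/B/*.png, ...
from pathlib import Path

import numpy as np
from PIL import Image
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def load_dataset(root="fingerspelling/"):
    X, y = [], []
    for label_dir in Path(root).iterdir():
        if not label_dir.is_dir():
            continue
        for img_path in label_dir.glob("*.png"):
            # Greyscale, fixed size, flattened to one feature vector per photo.
            img = Image.open(img_path).convert("L").resize((64, 64))
            X.append(np.asarray(img, dtype=np.float32).ravel() / 255.0)
            y.append(label_dir.name)
    return np.array(X), np.array(y)

X, y = load_dataset()
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

A classifier like this sees each photo in isolation, with no notion of time, which is exactly why the moving signs defeated it.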


After meeting up a few times and talking about their ideas to other members of the YCM, they were encouraged to take the idea to the iOS community, and were invited by Skillsmatter to present it at iOS Con 2019 in March. There they got some great advice from industry professionals on how they could use Swift to create the app on the iPhone and iPad, and they made some awesome contacts in the Swift community. Tim Condon, who had worked on the BBC iPlayer app and runs Swift workshops in industry, was so impressed that he offered to help, and later ran a two-day workshop for the YCM at the Tate Modern. Although they learned a lot about the uses of Swift and why it is so awesome for machine learning, they weren't able to get much support in creating the machine learning side of the app.


It was great, though, because a team of young coders got interested in the project. Over the first six months the YCM ran workshops on user stories, building narratives, collaborative working and designing websites. This enabled the group to come together when they found out about the Longitude Explorers Prize.


Femi was lucky enough to be invited by the UAL Creative Computing Institute in July 2019 to be a guest participant in their masterclass on Machine Learning for the Creative Industry. There he learned all about neural networks and machine learning models, and the facilitator, Marco Marchesi, gave him some great advice about the project idea. He pointed Femi towards Rebecca Fiebrink and her amazing Wekinator project, and towards Dynamic Time Warping as a way to solve the problem of detecting moving signs. Femi was then able to explore Wekinator and dive deeper into machine learning throughout the summer, and he got the chance to contact Rebecca Fiebrink for help with the idea. With her Wekinator he made a very simple prototype of the app, able to tell a few signs apart. In the meantime, Imran helped to collect video clips of people signing individual words, to use as data for training a neural network.
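Dynamic Time Warping is the classic way to compare two sequences that unfold at different speeds, which is what makes it appealing for moving signs: two people never sign a word at exactly the same pace. Here is a minimal textbook sketch of the idea (not Wekinator's own implementation), assuming each clip has already been reduced to a sequence of per-frame feature vectors such as hand coordinates; the template files in the usage comment are hypothetical.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two sequences of feature
    vectors (e.g. per-frame hand positions); lengths may differ."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)  # cumulative-cost table
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            # Best way to reach (i, j): stretch a, stretch b, or step both.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Recognition by nearest template (file names hypothetical):
# templates = {"hello": np.load("hello.npy"), "thanks": np.load("thanks.npy")}
# best = min(templates, key=lambda w: dtw_distance(clip, templates[w]))
```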


In November, Femi went to a talk by Professor Richard Harvey of Gresham College, an expert on artificial intelligence, machine learning and signal processing, and asked him about the approach they should be taking. He told Femi that Dynamic Time Warping is fairly experimental and not nuanced enough, and advised that Hidden Markov Models would be a better way to go. He also said that this sort of project has been researched before, and that the expert on computer vision, machine learning and Sign Language Recognition is Professor Richard Bowden at the University of Surrey.
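A Hidden Markov Model treats a sign as a sequence of hidden "phases" (hand shapes and movements), each of which produces observable features with some probability. The standard recognition recipe, which we assume is the sort of thing Professor Harvey had in mind, is to train one HMM per word and score a new clip against each model; the core computation is the forward algorithm, sketched below for discrete observations. A real system would use continuous features and a library such as hmmlearn.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM.

    obs: observation symbols, e.g. [2, 0, 1, ...]
    pi:  initial state probabilities, shape (S,)
    A:   state transition matrix, shape (S, S)
    B:   emission probabilities, shape (S, num_symbols)
    """
    alpha = pi * B[:, obs[0]]          # forward probabilities at t = 0
    c = alpha.sum()
    log_like, alpha = np.log(c), alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, weight by emission
        c = alpha.sum()                # rescale to avoid numeric underflow
        log_like += np.log(c)
        alpha /= c
    return log_like

# Recognition: one HMM per word; pick the model that scores the clip highest.
# models = {"hello": (pi_h, A_h, B_h), "thanks": (pi_t, A_t, B_t)}  # hypothetical
# best = max(models, key=lambda w: forward_log_likelihood(clip, *models[w]))
```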


Inspired by the Nesta Longitude Explorers Prize, Femi and Imran invited other YCM coders to form a team and enter the Sign2Word AI app for the competition. What has been fantastic is that the YCM's focus on collaborative working and knowledge sharing has meant that an awesome team has joined forces, each young person bringing unique strengths, knowledge and skills. They have all been to the YCMs, so they have a shared experience and enjoy being together. Since the news came that they had got through to the semi-finals, they have been meeting regularly, and after brushing up on his machine learning and AI skills, Femi delivered a knowledge-sharing session to the rest of the group to get everyone on a level playing field. Imran is too old to enter the competition, but he still holds the central role of Product Client, representing the Deaf community. Mutsa has been amazing with her design skills, interpreting the user stories from Imran and translating them into wireframes so the Swift app can be designed to meet the needs of its users. Malaika has been collating all the information, building a website and managing the content. Nishka has been documenting the journey, supporting the learning and looking into the mathematics behind the project (statistics and linear algebra). Thomas has been exploring the options for the build, so we can incorporate the neural network into the Swift app in the most efficient way; one possible route is sketched below.
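One route of the kind Thomas has been weighing up (a sketch under assumptions, not a settled decision) is to train the neural network in Python, convert it with Apple's coremltools, and ship the resulting model inside the Swift app so recognition runs on-device. The model file name below is hypothetical, and we assume coremltools 4+ (the unified converter) with a TensorFlow/Keras model.

```python
# Sketch: convert a trained Keras network to Core ML for use from Swift.
import coremltools as ct
import tensorflow as tf

keras_model = tf.keras.models.load_model("sign2word.h5")  # hypothetical file
mlmodel = ct.convert(keras_model)   # produces a Core ML model
mlmodel.save("Sign2Word.mlmodel")   # add this file to the Xcode project;
                                    # Xcode generates a Swift class for it
```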


We are currently looking at research in the field of Computer Vision and getting to grips with the complexities of Sign Language Recognition. Having made initial contact with academia, and continuing to learn more about machine learning, our next stage is to follow up, seek mentoring and learn more about Hidden Markov Models. We are really excited about collaborating together on this journey of discovery to make our Sign2Word AI App a reality, providing the very thing that Imran found missing as a young Deaf person struggling with written language.
