
SLAIT’s real-time sign language translation promises more accessible online communication

by Devin Coldewey

Sign language is used by millions of people around the world, but unlike Spanish, Mandarin or even Latin, there's no automatic translation available for those who can't use it. SLAIT claims to offer the first such tool available for general use, one that can translate around 200 words and simple sentences to start, using nothing but an ordinary computer and webcam.

People with hearing impairments, or other conditions that make vocal speech difficult, number in the hundreds of millions and rely on the same common tech tools as the hearing population. But while emails and text chat are useful and of course very common now, they aren't a replacement for face-to-face communication, and unfortunately there's no easy way for signing to be turned into written or spoken words, so this remains a significant barrier.

We've seen attempts at automatic sign language translation (usually of American Sign Language, or ASL) for years and years. In 2012 Microsoft awarded its Imagine Cup to a student team that tracked hand movements with gloves; in 2018 I wrote about SignAll, which has been working on a sign language translation booth using multiple cameras to give 3D positioning; and in 2019 I noted that a new hand-tracking algorithm called MediaPipe, from Google's AI labs, could lead to advances in sign detection. Turns out that's more or less exactly what happened.

SLAIT is a startup built out of research done at the Aachen University of Applied Sciences in Germany, where co-founder Antonio Domenech built a small ASL recognition engine using MediaPipe and custom neural networks. Having proved the basic notion, Domenech was joined by co-founders Evgeny Fomin and William Vicars to start the company; they then moved on to building a system that could recognize first 100, and now 200, individual ASL gestures and some simple sentences. The translation occurs offline and in near real time on any relatively recent phone or computer.


Image Credits: SLAIT

They plan to make it available for educational and development work, expanding their dataset so they can improve the model before attempting any more significant consumer applications.

Of course, the development of the current model was not at all simple, though it was achieved in remarkably little time by a small team. MediaPipe offered an effective, open-source method for tracking hand and finger positions, sure, but the crucial component for any strong machine learning model is data, in this case video data (since it would be interpreting video) of ASL in use - and there simply isn't a lot of that available.
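
To make that pipeline concrete, here is a minimal sketch, using the publicly documented MediaPipe Hands API, of how per-frame hand landmarks can be pulled from a webcam and handed off to a gesture classifier. This illustrates the general approach, not SLAIT's actual code; the classify_sign call at the end is a hypothetical placeholder for a trained model.

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def landmark_frames(max_frames=60):
    # Yield per-frame hand landmark coordinates from the default webcam.
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.5) as hands:
        for _ in range(max_frames):
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                # Each detected hand yields 21 (x, y, z) landmarks.
                yield [(lm.x, lm.y, lm.z)
                       for hand in results.multi_hand_landmarks
                       for lm in hand.landmark]
    cap.release()

# The landmark sequence would then go to a trained sign classifier, e.g.:
# sign = classify_sign(list(landmark_frames()))  # hypothetical model call

As the team found, the hard part isn't the tracking itself but assembling enough labeled video of real ASL to train that classifier well.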

As they recently explained in a presentation for the DeafIT conference, the team first evaluated an older Microsoft database, but found that a newer Australian academic database had more and better-quality data, allowing for the creation of a model that is 92% accurate at identifying any of 200 signs in real time. They have augmented this with sign language videos from social media (with permission, of course) and government speeches that have sign language interpreters - but they still need more.


A GIF showing one of the prototypes in action - the consumer product won't have a wireframe, obviously. Image Credits: SLAIT

Their intention is to make the platform available to the deaf and ASL learner communities, who will hopefully not mind that their use of the system also helps improve it.

It could prove an invaluable tool even in its present state: the company's translation model, though a work in progress, is still potentially transformative for many people. With the number of video calls going on these days, and likely for the rest of eternity, accessibility is being left behind - only some platforms offer automatic captioning, transcription or summaries, and certainly none recognizes sign language. But with SLAIT's tool, people could sign normally and participate in a video call naturally rather than using the neglected chat function.

"In the short term, we've proven that 200-word models are accessible and our results are getting better every day," said SLAIT's Evgeny Fomin. "In the medium term, we plan to release a consumer-facing app to track sign language. However, there is a lot of work to do to reach a comprehensive library of all sign language gestures. We are committed to making this future state a reality. Our mission is to radically improve accessibility for the Deaf and hard-of-hearing communities."


From left, Evgeny Fomin, Antonio Domenech and Bill Vicars. Image Credits: SLAIT

He cautioned that it will never be totally complete - just as translation and transcription to or from any language is only ever an approximation. The point is to provide practical results for millions of people, and a few hundred words go a long way toward doing so. As data pours in, new words can be added to the vocabulary, as can new multigesture phrases, and performance for the core set will improve.

Right now the company is seeking initial funding to get its prototype out and grow the team beyond the founding crew. Fomin said they have received some interest but want to make sure they connect with an investor who really understands the plan and vision.

Once more data and refinement of the machine learning models have made the engine itself more reliable, the team will look into further development and integration of the app with other products and services. For now the product is more of a proof of concept, but what a proof it is - with a bit more work, SLAIT will have leapfrogged the industry and provided something that deaf and hearing people alike have been wanting for decades.
