Google has AI that reads sign language aloud

Mountain View, California-based tech giant Google has developed an artificial intelligence (AI) algorithm that allows a smartphone to interpret sign language and read its meaning out loud, helping people with hearing loss communicate more easily with the hearing community.

In a blog post written by Google research engineers Valentin Bazarevsky and Fan Zhang, the tech giant said it is releasing the algorithm to developers as open source through MediaPipe, its cross-platform framework, in the hope that “providing this hand perception functionality to the wider research and development community will result in an emergence of creative use cases, stimulating new applications and new research avenues.”

“Today we are announcing the release of a new approach to hand perception, which we previewed CVPR 2019 in June, implemented in MediaPipe—an open-source, cross-platform framework for building pipelines to process perceptual data of different modalities, such as video and audio. This approach provides high-fidelity hand and finger tracking by employing machine learning (ML) to infer 21 3D keypoints of a hand from just a single frame,” the researchers said in the blog post.
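
For readers who want to experiment, MediaPipe later shipped a Python “Hands” solution that exposes the same 21 3D keypoints per detected hand. The following is a minimal sketch, not part of Google's blog post, assuming the `mediapipe` and `opencv-python` packages are installed and that an image file named `sign.jpg` (a hypothetical example input) exists:

```python
# Minimal sketch: extract the 21 3D hand keypoints from a single image
# using MediaPipe's Hands solution (assumes `pip install mediapipe opencv-python`).
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# static_image_mode=True runs full palm detection on every image
# instead of relying on tracking from previous frames.
with mp_hands.Hands(static_image_mode=True,
                    max_num_hands=2,
                    min_detection_confidence=0.5) as hands:
    image = cv2.imread("sign.jpg")  # hypothetical input file
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            # Each detected hand yields 21 landmarks with normalized x, y
            # coordinates and a relative depth value z.
            for idx, lm in enumerate(hand.landmark):
                print(f"keypoint {idx:2d}: x={lm.x:.3f} y={lm.y:.3f} z={lm.z:.3f}")
```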

While Google researchers have released the algorithm as open source, which means anyone can use the code in their own projects, they will not build their own application on top of it. Instead, they hope that developers will continue the research and build services using the algorithm to help deaf and hard-of-hearing people.

“The ability to perceive the shape and motion of hands can be a vital component in improving the user experience across a variety of technological domains and platforms. For example, it can form the basis for sign language understanding and hand gesture control, and can also enable the overlay of digital content and information on top of the physical world in augmented reality,” the researchers explained.
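
To illustrate how such keypoints could feed simple gesture logic, the sketch below applies a naive rule to the landmarks returned in the earlier example: a finger counts as “raised” when its tip sits above its middle (PIP) joint in image coordinates. This assumes an upright hand facing the camera and is only a toy heuristic, not the kind of model a real sign-language recognizer would use.

```python
# Toy heuristic: count raised fingers (thumb excluded) from MediaPipe's
# 21 hand landmarks, assuming an upright hand facing the camera.
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertip indices
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP (middle) joints

def count_raised_fingers(hand_landmarks) -> int:
    """Return how many non-thumb fingers have their tip above the PIP joint.

    Normalized image coordinates grow downward, so a smaller y means "higher".
    """
    lm = hand_landmarks.landmark
    return sum(1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
               if lm[tip].y < lm[pip].y)

# Example use with a `hand` object from the previous sketch:
# raised = count_raised_fingers(hand)
# gesture = "open palm" if raised == 4 else "fist" if raised == 0 else "other"
```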

Google, however, was candid in saying that this is only a first step, as the algorithm still has significant limitations in its current form.

“We’re excited to see what people come up with. For our part, we will continue our research to make the technology more robust and to stabilize tracking, increasing the number of gestures we can reliably detect,” a spokeswoman for Google said in an interview.

Given these limitations, inclusivity advocates, while praising Google for the initiative, agreed that more capabilities need to be incorporated into the published algorithm. Because it relies solely on hand movements when interpreting sign language, it can miss facial expressions, which can change the meaning of a sign. The current algorithm also assumes a single standard sign language and does not account for the nuances of different regional and cultural variants.

Jesal Vishnuram, Action on Hearing Loss's technology manager, praised Google for the initiative but also suggested that other capabilities need to be added for it to be fully inclusive.

“From a deaf person’s perspective, it’d be more beneficial for software to be developed which could provide automated translations of text or audio into British Sign Language (BSL) to help everyday conversations and reduce isolation in the hearing world,” he said.

Google is not the only tech firm attempting to give computers the ability to read sign language and hand gestures aloud. Last year, Microsoft partnered with the National Technical Institute for the Deaf to develop AI that helps students with hearing loss follow classroom presentations through their computers.

Home-grown technologies have also been developed in an attempt to bridge the gap between hearing and non-hearing communities. For example, a 25-year-old Kenyan built haptic gloves for his deaf niece that translate sign language into text on an Android application, which then reads the text aloud.

Nonetheless, Google says it will keep working toward a more capable and inclusive algorithm for the deaf community. In the blog post, the researchers said: “We plan to extend this technology with more robust and stable tracking, enlarge the amount of gestures we can reliably detect, and support dynamic gestures unfolding in time. We believe that publishing this technology can give an impulse to new creative ideas and applications by the members of the research and developer community at large. We are excited to see what you can build with it!”
