Google's New Glasses Can Translate Speech in Real Time

By Vivek Hansdah - May 22, 2022

The latest offering from Google’s parent company Alphabet comes 10 years after its first eyewear product

Google has unveiled a new product at its I/O developer conference this month: a level up from the Google Glass smart eyewear it launched a decade ago. Instead of acting like a wearable computer, Alphabet Inc’s new augmented reality (AR) glasses can automatically translate speech in real time. 

Harnessing Google Translate, the glasses can listen to speech in different languages, or detect American Sign Language, and project a translation in front of a user’s eyes – like real-time subtitling. Not only could this be applied to people wishing to speak to someone from another country, but it could also prove a valuable tool to deaf or hard-of-hearing users in cases where they do not or cannot wear hearing aids. 

While the glasses remain in the prototype phase, and the company hasn’t said when they may be ready for commercialization, they are nonetheless expected to provide an indication of where the AR device market is heading. 

Alphabet’s first foray into smart glasses encountered some backlash, with the integrated camera raising privacy concerns and the high price causing some consumers to balk. Details on the new translation glasses remain sparse, and it is unclear how the company has changed its approach since that first product, though in design alone they mark a departure, shifting from a sci-fi look to traditional glasses.

The new device was one of many products unveiled by Alphabet at the conference, all of which reportedly aim to connect Google services more holistically with real-world activities using AI. 

Source: IoT World Today
