Google’s New Dining and Translation Utilities


This week, Google is rolling out a new set of utilities for its Google Lens feature, aimed primarily at improving the dining experience and adding a visual translation filter.

Last year, Google impressed its audience at Google I/O with Duplex, an AI-powered assistant with a human-sounding voice that can make calls on your behalf to schedule appointments, check a business’s hours of operation, or book restaurant reservations, then report back on the call’s outcome. The voice is human-like to the point that the person on the other end of the line might not notice they’re talking to a robot.

For now, Duplex remains an impressive preview, as Google has yet to announce when it will release the utility.

Google Lens Menu
Photo From: GoogleLens.com

At this year’s Google I/O, Google teased how it plans to push AI technology further into utilities for people’s everyday lives. Among them are upcoming tools built on its Google Lens technology.

Google Lens is an app that helps you pull information from a restaurant menu or a piece of paper just by taking a picture. It’s comparable to Google Assistant, but Lens is the visual counterpart, delivering its services through machine-learning technology.

Moreover, Slash Gear says it should resemble the discontinued Google Goggles, which let you see the world through Google’s eyes: every real-world object becomes a point of interest, thanks to a mix of machine learning, computer vision, and Google’s Knowledge Graph.

Say you’ve decided to eat at a new restaurant. Point your camera at the menu (with the restaurant’s name visible), and the new feature will scan the items and cross-reference them with data collected in Google Maps to show you actual photos of each dish. Tap a photo and it pulls up reviews of that meal, giving you the full picture before you order.

There’s also an in-camera calculator function that comes in handy when paying the bill. Point Google Lens at the receipt and it can help you split the bill and add a tip.
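Google hasn’t detailed how the calculator works under the hood, but the arithmetic it performs is straightforward. Here’s a rough sketch of that math; the function name and the 18% default tip are assumptions made purely for illustration.

```python
def split_bill(subtotal, people, tip_rate=0.18):
    """Add a tip to a scanned receipt total and split it evenly.

    subtotal: the pre-tip amount read from the receipt
    people:   how many diners are splitting the bill
    tip_rate: tip as a fraction of the subtotal (18% is just an assumed default)
    """
    tip = subtotal * tip_rate
    total = subtotal + tip
    per_person = total / people
    return round(tip, 2), round(total, 2), round(per_person, 2)

# Example: an $84.50 receipt split four ways with an 18% tip
tip, total, each = split_bill(84.50, 4)
print(f"Tip: ${tip:.2f}  Total: ${total:.2f}  Each: ${each:.2f}")
```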

The Google Lens app can also bring printed images to life. Hover the app over Google-partnered magazines like Bon Appétit and images will start to animate, showing you recipes and the like.

Google Lens Translate
Photo: GoogleLens.com

The second “dining utility” from Google Lens is the new Lens translation feature, which is integrated with Google Translate. Take a photo of the text on a menu or a piece of paper and Google Lens will translate it. There’s also a ‘Listen’ button that reads the translated text aloud, highlighting each word on screen as it’s spoken.
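Google hasn’t said how Lens implements this on-device, but the pipeline it describes (recognize the text, translate it, then read it aloud) can be sketched with off-the-shelf libraries. The sketch below uses pytesseract, the Google Cloud Translation client, and gTTS purely as stand-ins; they are not what Lens actually uses.

```python
# A rough sketch of a photo-to-spoken-translation pipeline.
# pytesseract, google-cloud-translate, and gTTS are stand-ins here,
# not the technology Lens actually ships with.
from PIL import Image
import pytesseract                                   # OCR: pull text out of the photo
from google.cloud import translate_v2 as translate   # translation (needs Cloud credentials)
from gtts import gTTS                                 # text-to-speech for the 'Listen' step

def translate_menu_photo(image_path, target_lang="en"):
    # 1. Recognize the text in the photo
    text = pytesseract.image_to_string(Image.open(image_path))

    # 2. Translate it into the target language
    client = translate.Client()
    result = client.translate(text, target_language=target_lang)
    translated = result["translatedText"]

    # 3. Read the translation aloud, saved as an audio file
    gTTS(translated, lang=target_lang).save("translation.mp3")
    return translated

print(translate_menu_photo("menu.jpg"))
```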

The Google Translate app has long offered translation of signs that matches the style and typeface of the text being parsed; now a more lightweight version of that functionality will be available directly in Lens.

The Lens team has been working with early testers in India to make the technology lightweight enough to run on less powerful phones. Google said the tech stack is just 100 kilobytes, reports TechCrunch.

Google Lens is available on Android via Google Assistant and the Google Photos app; iOS users can access it through the Google app and the Google Photos app. The new filters are rolling out to all users, but Android phone owners should note that they only work on ARCore-compatible phones.

The new Google Lens “dining” features will probably prove most useful for people who travel often or eat at unfamiliar restaurants. They likely aren’t utilities most users will reach for every day, but both are intuitive tools that should come in handy when the circumstances arise.

Meanwhile, there are more AI-powered utilities from this year’s Google I/O to look forward to, such as Driving Mode, which is similar to Waze, and updates that promise a better, smarter Google Assistant.
