At Google I/O, a number of interesting innovations were announced for Google’s artificial intelligence technology, all meant to make it smarter and more practical in day-to-day use.
As Rishi Chandra, vice president of product at Google, said in a report, “Every decade, there’s a big computing shift. Twenty years ago, it was the transition to the web. Ten years ago or 12 years ago, it was the transition to mobile. And now, we think we’re in the third stage, which is AI or ambient computing.”
Certainly, the influx of AI-driven technology is starting to take over the tech industry, and everyone is scrambling to develop their own. We’re here to break it down and tell you the highlights as Google heads into the future of AI.
In recent years, there have been promises of voice-driven AI tech meant to make life easier with just a few trigger words. During an on-stage demo at Google I/O, saying “Hey, Google” became noticeably more useful, as Google builds on the foundation of what voice assistants can do.
As announced, the Google Assistant will eventually be able to perform tasks well beyond the superficial. In simple terms, it will help you complete tasks without needing to be directed at every step. In the coming months, it won’t just handle simple searches but will also help you compose a message to a friend, including adding a subject line, searching for a picture to attach, and sending it off.
The Google Assistant is also set to become faster through advanced AI processing. Google has shrunk the AI models used to listen to and interpret speech so they can run locally on the device, without waiting for data to be processed on remote servers.
However, the new Google Assistant won’t be available just yet. It is said to launch later in the year, starting with Pixel phones.
The Google Assistant will also come with its own ‘Driving Mode’ feature. Although the Assistant is already available in Android Auto and Maps, it will now also come to Waze.
By simply saying “Hey, Google. Let’s drive,” any Android device with the Google Assistant will surface driving-relevant activities, top contacts, and more personal recommendations.
With Google Assistant’s driving mode, you will be able to access personalized recommendations. For example, if you have a restaurant reservation or a planned trip on record, the Google Assistant will set up the fastest and safest route to get you to your location.
Duplex is another feature that involves the Google Assistant. Last year, Google introduced Duplex as a voice assistant that can make calls and transactions on your behalf based on the information it has about you, with an impressively human-like AI voice to do it.
This year, Google says that it’s expanding Duplex to web-based transactions as well. It will be able to make car rental reservations, movie ticket purchases, or restaurant reservations for you. It will pull information from your travel history and auto-fill required details like your name, address, and arrival and departure times.
Google says that after the transaction, you will receive a confirmation of how it went. It will indicate if a transaction was not successful because of unavailability, a walk-ins-only policy, and so on.
At the moment, Google says that Duplex is more of a preview of what is to come, and it has not given a concrete date for when we will actually get to see and experience the tech.
Google Lens is a Google app that helps you comprehend information from a restaurant menu or a piece of paper just by taking a picture. This year, Google says it can also search for photos of a dish on the menu, pulling up image results online based on information from Google Maps, so you have an idea of what it actually looks like before ordering.
There’s also a calculator function that should come in handy when paying the bill. By simply taking a photo of the receipt, Google Lens can help you split the bill and add a tip.
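The arithmetic behind that feature is straightforward; here is a minimal sketch of bill splitting with a tip (the function name and the 18% default tip rate are illustrative assumptions, not Google’s actual implementation):

```python
def split_bill(total, people, tip_rate=0.18):
    """Add a tip to the bill total and divide it evenly among diners."""
    tipped_total = total * (1 + tip_rate)
    return round(tipped_total / people, 2)

# An $84.00 receipt split 4 ways with an 18% tip:
# 84.00 * 1.18 = 99.12, and 99.12 / 4 = 24.78 per person
print(split_bill(84.00, 4))  # 24.78
```

Lens presumably reads the total off the receipt with text recognition before doing this kind of division.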
Moreover, the Google Lens app will help bring images to life. By hovering the app over Google-partnered magazines like Bon Appétit, images will start to animate and show you recipes and the like.
Lastly, the Google Lens translation tech is also getting an update through integration with Google Translate. Simply taking a photo of some text will prompt Google Lens to translate it. There’s also a ‘Listen’ button you can tap that will read the translated text aloud while highlighting each word as it goes.