OPINION: Google I/O saved the best for last with its Android XR Glasses demo that showed how Gemini AI is what Google Glass needed all along
Google has gone into detail on how its Gemini AI can help transform the Android XR wearable display platform it is building with Samsung and Qualcomm.
Google says Android XR will let users interact naturally while wearing smart glasses, and it offered a live demonstration on the Google I/O stage.
For the most part, it was an incredible demonstration of the forthcoming technology and the absolute highlight of the AI-heavy Google I/O presentation.
Offering by far its deepest look yet at Android XR smart glasses, Google said the devices' cameras, microphones, speakers and optional displays make them the perfect conduit for using Gemini in the real world without having to pick up your phone.
“Pairing these glasses with Gemini means they see and hear what you do, so they understand your context, remember what’s important to you and can help you throughout your day,” Google said in a blog post. And, for the first time, it totally sold me on the idea of wearing a pair of these devices.
During the demo, Google product manager Nishtha Bhatia was able to quickly send a text message via voice and then ask Gemini to silence notifications. Because a red light was lit on the glasses, fellow employees could also see that the demo was being streamed.
This resolves a real sticking point with the original Google Glass product, which arrived well before its time more than a decade ago and scared a lot of people off due to privacy concerns.
Nishtha was able to look at a wall showing photos of a band performing and ask “who are this band and how are they connected to this place?” After the explanation, she was able to ask to see a photo of them playing “here”, while Gemini then asked if she wanted to hear one of their songs. It was an expression of just how incredible Gemini has become at detecting context compared to first-generation voice assistants like Google Assistant.
She was then able to ask “what was the name of the coffee shop on the coffee I was drinking earlier?” and got an accurate response, presumably because the camera had seen her drink it. She was also able to ask for details about the shop itself – photos of the interior, how long it would take to walk there, and walking directions. Those appeared on a 3D map in her eye-line. She then asked Gemini to send a calendar invite to her friend to meet her there in an hour. All completely seamlessly.
Without lifting a finger
The company also showed the live language translation feature: when walking next to someone speaking a different language, subtitles appear within the field of view. It didn’t work as well on stage, as live demos tend not to, but it was enough to get the idea. Imagine walking along with a person and conversing that naturally in real time, all while still being able to make eye contact.
Obviously there’s a reliance on connectivity, but there was no need to take the phone out of the pocket. Not ever. It convinced me this could be a device I would want to wear – one that would reduce how much I use my phone, allow me to be more present and get stuff done faster. With no fiddling around with a handset, I’d be able to stay in the moment.
Once Gemini’s Personal Context capabilities – also discussed at Google I/O – come to this platform, it’ll be a device that knows you, your preferences and your schedule so innately that all of this will become seamless and effortless.
And because Google has got Samsung on board with making them, as well as stylish specs manufacturers like Warby Parker, they’re going to be highly wearable with prescription lenses. You’ll be able to walk around in these looking relatively normal. They don’t appear to be that cumbersome.
While most of Google I/O just felt like people applauding the loss of entire industries to artificial intelligence, this at least allowed me to exit the keynote feeling a little bit lifted. There was no augmented reality unnecessarily plastered in front of your field of view. It was all on request, all pertaining to what you needed, not what Google wanted to show you.
I think I’ve seen the future, and it’s Android XR glasses.