Tired of shouting commands at your AI assistant? Google’s next-generation universal AI assistant can solve that problem for you, and more.
AI Might Soon Control Your Android Phone
Google has demonstrated its vision for a “universal AI assistant” that can understand the context around you, come up with a solution, and carry it out on your behalf. The goal is to create an all-seeing, all-hearing assistant that can automatically figure out when it’s needed and chime in without you having to summon it manually.
This new assistant is called Project Astra, and Google showed off some impressive demonstrations of what it can do at I/O 2025. In one demo, a user is having problems with their bike's brakes and asks Astra to find the bike's user manual online.
Once Astra finds the manual, the user asks it to scroll until it reaches the section covering the bike's brakes, and it does so flawlessly. The user then asks Astra to look up a video tutorial on YouTube and to find out what parts they need. Astra is even able to call the nearest bike shop on the user's behalf and ask whether the required parts are in stock.
The Verge also reports seeing a demo where Bibo Xiu, a product manager on the Google DeepMind team, pointed her phone camera at a pair of Sony headphones and asked Astra to identify them. Astra answered, saying they're either the WH-1000XM4 or the WH-1000XM3, a mix-up I'd bet most humans would make, too.
Once the headphones were identified, Xiu asked Astra to pull up a manual and explain how to pair them with her phone, only to interrupt the AI assistant mid-answer and ask it to pair the headphones for her. As you can probably guess, Astra obliged without any issues.
From the demos, it seems that Astra simulates screen inputs to navigate the phone. The screen-recording indicators also suggest that Astra reads what's on your display to decide where to tap next, working its way through different user interfaces as it completes its task.
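Google hasn't said exactly how Astra drives the UI, but Android already has a sanctioned mechanism for software that reads the screen and taps on it: an accessibility service. As a rough, hypothetical sketch of that mechanism only (the class name is made up, and this is an assumption about how such an agent could work, not Astra's confirmed implementation):

```kotlin
// Hypothetical sketch of screen control via Android's AccessibilityService.
// This is NOT Astra's confirmed mechanism, just the standard Android API
// an agent could use to read the UI and dispatch taps and swipes.
// (A real service must also be declared in the manifest and enabled by the user.)
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.GestureDescription
import android.graphics.Path
import android.view.accessibility.AccessibilityEvent

class AssistantInputService : AccessibilityService() {

    // Dispatch a single tap at (x, y): the same primitive an agent could
    // chain into scrolls and swipes while it works through a UI.
    fun tap(x: Float, y: Float) {
        val path = Path().apply { moveTo(x, y) }
        val gesture = GestureDescription.Builder()
            .addStroke(GestureDescription.StrokeDescription(path, 0, 50))
            .build()
        dispatchGesture(gesture, null, null)
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) {
        // An agent could inspect event.source here to read the view
        // hierarchy and decide where to tap next.
    }

    override fun onInterrupt() {}
}
```

Chaining gestures like this while reading accessibility events to understand what's on screen is how existing Android automation tools operate, and it lines up with what the demos appear to show.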
A Universal AI Assistant Is on the Horizon
While impressive, these demos aren’t perfect. They still require user input, and in the case of Xiu’s demo, she had to manually enable a feature giving Astra access to her phone screen.
At the moment, Project Astra is more of a testbed for Google’s wildest AI ambitions. Features that perform well here eventually trickle down to tools like Gemini and are made available to us. Google claims that its ultimate vision is to “transform the Gemini app into a universal AI assistant that will perform everyday tasks for us.”
Google is hard at work, slowly phasing out older tools in favor of newer, AI-powered ones. AI Mode is replacing Google Search, and Gemini already has a list of impressive features you should try.
That said, even the most advanced AI systems of today require you to enter prompts at each step and feed them the necessary data and context, and you might still need to take manual action yourself. Since Astra can access the internet and Google's services, it aims to replace all of these inputs by pulling your information from different platforms and building the context it needs to take action.
This is not an easy goal to achieve, and don't even get me started on the privacy and security issues a universal AI assistant like Astra could raise down the line. Astra could be doing all the heavy lifting locally using the Gemini Nano model, but the demos show no hint of that being the case.
Building an assistant like this is going to take a fair bit of time, but with these demos, Google has shown us a glimpse of the future. It may not be coming soon, but a universal AI assistant is on the horizon, and I am eagerly awaiting its arrival.