Are you fed up with constantly barking orders at your AI assistant? Google’s latest universal AI assistant could be the answer to that dilemma—and much more.
AI Could Soon Take Control of Your Android Phone
Google has unveiled its ambitious concept for a “universal AI assistant” capable of grasping the context around you, devising solutions, and executing them autonomously. The aim is to develop an all-seeing, all-hearing assistant that knows when to step in without requiring a manual prompt.
This new assistant is named Project Astra. During I/O 2025, Google showcased some stunning demonstrations of its capabilities. In one instance, a user faced issues with their bike brakes and asked Astra to locate the bike’s online manual.
After finding the manual, Astra was instructed to scroll to the section about the brakes, which it did flawlessly. The user then asked Astra to find a video tutorial on YouTube and even to contact the local bike shop about the necessary parts, a call Astra placed on the user's behalf to check whether the components were in stock.
The Verge reported another demo in which Bibo Xiu, a product manager on the Google DeepMind team, pointed her phone camera at a pair of Sony headphones and asked Astra to identify them. Astra responded that they were either the WH-1000XM4 or the WH-1000XM3, model names similar enough that many humans would struggle to tell them apart, too.
Once the headphones were identified, Xiu asked Astra to pull up the manual and explain how to pair them with her phone. She then interrupted the assistant mid-explanation and asked it to pair the headphones for her. Unsurprisingly, Astra handled the task without a hitch.
The demos suggest that Astra navigates the device by simulating screen interactions: indicators in the screen recordings imply that it reads what is on screen and works out where to tap or scroll as it completes each task.
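Google hasn't said how Astra drives the phone under the hood, but Android already offers a public way to do this kind of screen reading and tapping: the AccessibilityService API. The sketch below is purely illustrative, with a hypothetical service name and a hard-coded "Pair" button as its target; it shows the general technique, not Astra's actual implementation.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent
import android.view.accessibility.AccessibilityNodeInfo

// Hypothetical example only: not Astra's code, just the standard Android API
// an app can use to read the screen and act on what it finds there.
class ScreenAgentService : AccessibilityService() {

    // Fires whenever the foreground screen changes; the service can then
    // inspect the current view hierarchy and decide what to do next.
    override fun onAccessibilityEvent(event: AccessibilityEvent?) {
        val root: AccessibilityNodeInfo = rootInActiveWindow ?: return

        // Toy task: find a button labeled "Pair" and tap it, which is
        // roughly what "pair the headphones for me" boils down to.
        root.findAccessibilityNodeInfosByText("Pair")
            .firstOrNull { it.isClickable }
            ?.performAction(AccessibilityNodeInfo.ACTION_CLICK)
    }

    // Required callback; invoked when the system interrupts the service.
    override fun onInterrupt() = Unit
}
```

An app like this would also need a manifest declaration and an explicit user grant in Settings before it could see the screen, which lines up with Xiu having to approve screen access during the demo.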
A Universal AI Assistant Is on the Way
While these demos are impressive, they're not flawless. User input is still necessary; in Xiu's case, she had to manually grant Astra access to her phone's screen.
Currently, Project Astra serves as a testing ground for Google’s most ambitious AI ideas. Features that perform well in these tests often trickle down to applications like Gemini, making them available to the public. Google envisions ultimately transforming the Gemini app into a universal AI assistant capable of handling everyday tasks.
Google is steadily phasing out older tools in favor of newer, AI-enhanced alternatives: AI Mode is replacing traditional Google Search, and Gemini already offers an array of impressive capabilities for users to explore.
That said, even today's most advanced AI systems require you to prompt them at each step and supply the necessary data and context. Astra aims to reduce that manual input by drawing on your information across various platforms, building the context it needs to act on its own.
Achieving this goal won't be easy, and privacy and security concerns are paramount with a universal AI assistant like Astra. While Astra could potentially handle many tasks locally using the Gemini Nano model, the demos gave no indication that it does.
Creating an assistant of this caliber will take time. However, these demonstrations give us a glimpse into the future. A universal AI assistant may not arrive tomorrow, but it’s definitely on the horizon, and I’m looking forward to its debut.