Google recently showed its XR smartglasses in public for the first time at a TED Talk. While previous showings relied on highly polished videos of Project Astra, this live demonstration gave attendees a genuine sense of how the product works in a real-world setting. The glimpse into the future is exciting, though it's worth remembering that these are still works in progress.
The bulk of the 16-minute presentation was dedicated to demonstrating the smartglasses, introduced by Shahram Izadi, Google’s VP of augmented and extended reality. He provided some context regarding this project, which is anchored by Android XR, an operating system that Google is developing in partnership with Samsung. Android XR is designed to incorporate Google Gemini into various XR hardware, including headsets and smartglasses, as well as in forms yet to be imagined.
The demonstration used a striking pair of smartglasses with polished black frames, similar in style to the Ray-Ban Meta smartglasses. The glasses are equipped with a camera, speaker, and microphone, allowing the integrated AI to perceive their surroundings, and they pair with a smartphone to handle calls. Unlike the Ray-Ban Meta, however, these glasses include a compact color display inside the lens.
Headset and Glasses

One of the most intriguing aspects presented during the demonstration was the smartglasses' ability to utilize Google Gemini's short-term memory. The AI can accurately identify the title of a book the wearer glanced at and remember the location of a hotel keycard. This innovative memory function serves various practical purposes, including acting as a memory aid and helping users confirm details or manage their schedules more effectively.
The AI's capabilities extend further: it can interpret a diagram in a book, translate text into different languages, and provide real-time translation of spoken conversation. In one segment of the demonstration, Gemini showcased its navigation functionality by displaying directions to a nearby scenic spot directly on the lens, responding swiftly to prompts and operating smoothly throughout the event.

After showcasing the smartglasses, the presentation moved on to a full headset operating on Android XR. This visual experience closely resembles Apple’s Vision Pro headset, featuring multiple windows and pinch gestures for navigation. However, the standout feature is Gemini's conversational interaction, where it proactively describes and interprets the visual information presented to the wearer.
Availability

Izadi concluded the presentation by saying: "We're entering a thrilling new era of the computing revolution. Headsets and smart glasses are just the initial steps. This all points towards a unified vision for the future—a realm where intelligent AI converges with lightweight XR technology. XR devices will become more wearable, enabling instant access to vital information. In parallel, AI will evolve to be more contextually aware, conversational, and personalized, collaborating with us seamlessly in our own language. We're shifting from augmenting reality to enhancing intelligence."
The prospects are exciting, especially for those who appreciated earlier efforts like Google Glass and are already enjoying the Ray-Ban Meta. The smartglasses, in particular, show great promise as the next evolution in functional smart eyewear. However, it's worth emphasizing that these advancements remain some way off; while the glasses appear nearly ready for a public debut, that may not be the case, as Google continues to prolong the wait for its smart eyewear.
During the TED Talk, Izadi did not provide a specific launch date for either XR device, a potentially troubling sign. When might they truly become available to consumers? According to reports, the smartglasses are part of a collaborative effort between Google and Samsung, with the device expected to be released in 2026—a timeline that extends beyond the previously suggested 2025. While this may seem far off, it's actually closer than the expected consumer release of Meta’s Orion smartglasses, anticipated for late 2027.
Will It Be Too Late?

The smartglasses presented during the TED Talk combine elements from earlier products such as Glass and Ray-Ban Meta with distinguishing features from alternatives like Halliday's, plus the capabilities of the existing Google Gemini assistant. That makes the ongoing wait for release feel unexpectedly long and somewhat frustrating.
Additionally, the growing wave of AI-driven hardware, along with a surge of Ray-Ban Meta alternatives expected to reach the market by 2026, puts Google and Samsung's products at risk of losing their novelty or appearing outdated by the time they arrive. Meanwhile, the Android XR headset, known as Project Moohan, is anticipated to debut in 2025.
While it may be a matter of impatience, seeing a finished demo of a product that looks so promising makes the desire for an earlier release inevitable.