In recent months, major tech companies have been racing to develop the first mass-market augmented reality (AR) glasses, and Google is making significant strides in this competition. At the TED2025 conference, Google revealed a new pair of XR glasses, demonstrating advanced capabilities that could reshape the landscape of augmented reality. This comes on the heels of their collaboration with Samsung, focused on creating an Android-centric pair of glasses.

During the event, Shahram Izadi, the Vice President and General Manager of XR at Google, took to the TED stage wearing what initially seemed to be a typical, albeit large, pair of glasses, in a style reminiscent of Meta's unreleased Orion smart glasses. However, these were no ordinary glasses. Izadi said the device was displaying his speech notes in real time as he spoke, showcasing its functionality as a hands-free assistant.

The XR glasses are equipped with advanced features, including a microphone, a camera, and speakers designed to gather information seamlessly. This marks Google's first foray into the realm of XR since the ill-fated Google Glass project. A highlight was the in-lens display, which Izadi briefly showcased, noting that "it's very, very small." This suggests that Google is experimenting with waveguide technology, similar to that found in devices like the latest RayNeo glasses. The glasses run on Android XR, Google's proprietary operating system tailored for extended reality devices.

At the conference, product manager Nishtha Bhatia demonstrated the glasses' camera capabilities, tapping one arm of the device to activate a feature that displayed the Gemini logo on her screen. Almost instantaneously, an AI-powered chatbot emerged, ready to engage with the audience through a humorous haiku about their glowing faces. One of the standout features of the Gemini technology is its ability to translate text seen through the camera into various languages, although Izadi cautioned that the accuracy of this feature might vary. Additionally, the camera can interpret text and graphics, transforming them into easily digestible audio snippets.

Another intriguing aspect of the presentation was the glasses' memory feature, which enables the AI to recall recent visual information captured by the camera, akin to capabilities demonstrated by Google DeepMind in Project Astra last year. As Google continues to integrate features from Astra, including photo and video recognition, into the Gemini Live chatbot, it appears that these advancements will be central to the upcoming AR glasses.

Moreover, the glasses are designed to connect easily with smartphones, offering access to all mobile applications. The integration with Google services is particularly notable; during the presentation, Bhatia asked the glasses to identify a song from singer Teddy Swims, prompting the glasses to launch YouTube Music and play the track effortlessly. This functionality echoes the appeal of the Ray-Ban Meta glasses, which provide solid audio without the need for additional earbuds. Even more impressively, the internal display can interact with Google Maps, providing a semi-holographic view of street-level navigation.

Behind the scenes, Google has been diligently working on AR glasses for several years, even after the setback of Google Glass. Interestingly, the devices shown at TED may not carry the Google brand directly. Instead, the tech giant is collaborating closely with Samsung, whose Project Moohan headset is powered by Android XR. During the TED2025 event, Google also presented a prototype of Samsung's headset, which appears to resemble Apple's Vision Pro but with enhanced Gemini features.

Samsung has indicated that it is also developing a separate pair of smart glasses, but speculation suggests that the device demonstrated at TED is not the one expected to launch later this year. Reports from the Korean publication ETNews suggest that the forthcoming device may forgo a display or buttons entirely, relying instead on voice commands and gesture controls.

Meanwhile, Meta is reportedly working on its own premium smart glasses that would feature a small screen dedicated to an app shelf on the bottom right lens. As these companies strive to differentiate their products, the true challenge lies in balancing display size, battery life, and weight. If Google can successfully navigate these challenges, we may soon see a new era of AR technology that allows users to immerse themselves in augmented experiences without the burden of cumbersome headsets.