Google Unveils Innovative AI-Powered Smart Glasses at TED 2025

Imagine being in a hotel room, only to realize you can't leave because you've misplaced your key card. Now, picture an AI assistant embedded in your smart glasses, guiding you to its last known location. This intriguing scenario is becoming a reality thanks to a new line of Android XR smart glasses showcased by Google at TED 2025.
During the conference, Shahram Izadi, Google's head of augmented and extended reality, introduced smart glasses that promise to change how we interact with technology in our daily lives. Although sketches and concepts of these glasses had been teased before, during the Google I/O event, this presentation marked the first live demonstration of their capabilities on stage, generating significant buzz among tech enthusiasts.
These cutting-edge glasses are powered by Android XR, a new iteration of the Android operating system designed to extend the boundaries of reality by combining Augmented Reality (AR) and Virtual Reality (VR). Google hopes Android XR will become as popular and ubiquitous as its Wear OS or Android Auto platforms. While some suspect these may be the Samsung HAEAN smart glasses, Google representatives did not confirm a specific product name during the presentation.
The live demonstration featured Google product manager Nishtha Bhatia asking the AI assistant, named Gemini, about the whereabouts of her hotel room key card. In an impressive display of AI capability, Gemini promptly responded that the key card was to the left of the music record on the shelf behind her. While details about how long, and how much, the glasses can record for memory recall remain unclear, the feature holds great potential both as an application of AI technology and as a groundbreaking use for smart glasses.
The possibilities arising from this virtual memory feature are striking. Consider, for example, individuals with cognitive impairments such as dementia. Smart glasses that help users remember everyday objects and tasks could significantly enhance their quality of life, providing support that was previously unimaginable.
This memory feature was initially hinted at during the Google I/O 2024 event, where an early version of a project referred to as Project Astra was demonstrated on a smartphone. For those interested in seeing the original demonstration, including the impressive memory application, it can be viewed at the specified timestamp in a video shared online.
Izadi elaborated on the technical design of the glasses, stating, "These glasses work with your phone, streaming back and forth." This suggests the glasses will carry minimal onboard processing, extending battery life while still allowing access to various phone applications. Project Astra is anticipated to launch publicly soon, likely continuing to run on the smartphone while the smart glasses capture images and information from the user's surroundings.
Furthermore, the glasses presented at TED 2025 included at least one lens equipped with a display. This capability was briefly demonstrated during a live translation session, in which translated words appeared on the lenses. In August 2023, reports surfaced that Google was developing multiple versions of smart glasses, including designs with a single-lens display and others featuring dual displays, enabling richer user experiences.
The competition in the field of next-generation smart glasses is heating up. Meta is reportedly set to unveil its own advanced Ray-Ban-branded smart glasses later this year, which will incorporate a single display and a complementary wristband for hands-free interaction with the displayed content.
While Google has not announced plans for a similar wristband accessory, the company will likely rely heavily on Gemini to handle most user interactions. I personally experienced Project Astra at Google I/O last year and was genuinely impressed by how adeptly the AI comprehended spatial relationships and interpreted visual information through the camera.
Google plans to leverage this functionality in its upcoming smart glasses. However, it remains uncertain whether the glasses unveiled at TED 2025 will be part of the 2025 Android XR lineup or whether consumers will need to wait until next year for a retail release.