AI Glasses took centre stage as Google unveiled its vision for wearable technology during a live TED Talk. Powered by Gemini, these experimental glasses mark a significant step as the company looks to extend its AI ecosystem beyond smartphones and desktops.
In the TED Talk, Shahram Izadi, Vice President and General Manager of Android XR at Google, introduced what appears to be the most sophisticated wearable prototype the company has produced so far. The new AI Glasses, which resemble standard optical eyewear, integrate camera sensors, speakers, and a subtle in-lens display. They leverage Google's Gemini AI to perceive the user's surroundings and respond to live queries, such as composing a haiku based on the expressions observed in a crowd.
A memory feature, initially unveiled with Project Astra, was also demonstrated, showcasing Gemini's ability to "remember" items and settings even after they leave view. Google claims this visual memory can persist for up to 10 minutes, enabling more contextual assistance for users.
Previously, in December 2024, Google hinted at the notion of XR (Extended Reality) glasses, developed in partnership with Samsung. The company expressed that “Created in collaboration with Samsung, Android XR combines years of investment in AI, AR and VR to bring helpful experiences to headsets and glasses.”
In an interview with 60 Minutes, Demis Hassabis, CEO of Google DeepMind, disclosed that Gemini's memory features might soon come to Gemini Live, a tool for real-time, two-way voice conversation that can already respond to live video feeds. Gemini Live does not currently support contextual memory retention, but that may soon change.
Hassabis also hinted at potential future upgrades that could enable social responsiveness, with Gemini delivering personalised greetings when activated.
While still in the prototype phase, the AI Glasses appear capable of more intricate tasks than simply answering queries. Early demonstrations suggest users might be able to complete online transactions or engage in deeper levels of AI interaction.
Although Google has yet to reveal a public release schedule, these innovations highlight its renewed commitment to the AI wearable sector, which it first entered with Google Glass more than a decade ago.