Meta has unveiled a significant software update for its Ray-Ban and Oakley AI glasses, introducing features aimed at improving speech clarity in noisy settings and what it bills as a groundbreaking “multimodal” music experience.
The v21 software update, rolling out this week, marks a shift for wearable technology from social media accessory to practical everyday assistive tool.
Highlights
Enhancing Conversations
The headline feature, ‘Conversation Focus’, addresses a problem familiar to headset and earbud users: struggling to hear someone in a busy environment. Using the glasses’ integrated microphone array and on-device AI, the feature creates a directional “audio zone”.
This system effectively identifies and isolates the voice of the individual directly in front of the wearer, amplifying it through the open-ear speakers while minimising background noise such as restaurant chatter or the rumble of public transport.
Users can customise the amplification level, adjusting it to match their surroundings. This feature is particularly beneficial in “challenging acoustic scenarios” like bustling bars, clubs, or during commutes.
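Meta has not detailed the signal processing behind Conversation Focus, but a classic way to build a directional “audio zone” from a microphone array is delay-and-sum beamforming. The sketch below, assuming a simple linear array and written in Python with NumPy, phase-aligns each channel toward a chosen direction so that sound from that direction adds up coherently while off-axis noise does not. It illustrates the principle only; it is not Meta’s implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second, in air

def delay_and_sum(mic_signals, mic_positions, sample_rate, steer_angle_deg=0.0):
    """Steer a linear microphone array toward steer_angle_deg (0 = straight ahead).

    mic_signals:   (n_mics, n_samples) array of time-aligned recordings
    mic_positions: (n_mics,) mic offsets along the array axis, in metres
    """
    angle = np.deg2rad(steer_angle_deg)
    n_mics, n_samples = mic_signals.shape

    # Work in the frequency domain so fractional-sample delays are easy to apply.
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / sample_rate)
    spectra = np.fft.rfft(mic_signals, axis=1)

    # Extra time the target wavefront needs to reach each mic, relative to the origin.
    delays = mic_positions * np.sin(angle) / SPEED_OF_SOUND  # seconds, (n_mics,)

    # Phase-align every channel: the voice from the steered direction sums
    # coherently, while background noise from other directions partially cancels.
    phase_shifts = np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
    aligned = spectra * phase_shifts
    return np.fft.irfft(aligned.mean(axis=0), n=n_samples)
```

Steering at 0° favours whoever stands directly in front of the wearer; a production system would presumably layer learned noise suppression and user-adjustable gain, as described above, on top of this kind of spatial filtering.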
‘Soundtrack Your World’ with Spotify
Alongside the audio improvements, Meta has partnered with Spotify to introduce what it describes as the first multimodal AI music experience.
By leveraging the glasses’ cameras, the AI can now interpret the user’s surroundings to recommend music. Looking at an object or scene, such as a particular album cover or a festive holiday display, the user can say “Hey Meta, play a song to match this view,” and the assistant builds a customised Spotify playlist that blends what it sees with the listener’s personal music preferences.
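Neither company has published the underlying pipeline, but conceptually the feature chains a vision model onto a playlist search: describe the frame, infer a mood, then mix in the listener’s tastes. The minimal Python sketch below illustrates that flow under those assumptions; the Scene type and describe_view function are hypothetical stand-ins, not a real Meta or Spotify API.

```python
from dataclasses import dataclass

@dataclass
class Scene:
    caption: str           # what the camera sees
    mood_tags: list[str]   # moods inferred from the scene

def describe_view(frame_jpeg: bytes) -> Scene:
    # Placeholder: a real system would run the frame through a
    # vision-language model on-device or in the cloud.
    return Scene(caption="a snowy street with holiday lights",
                 mood_tags=["festive", "cozy"])

def music_query_for_view(frame_jpeg: bytes, user_taste: list[str]) -> str:
    """Turn 'play a song to match this view' into a playlist search query,
    blending the scene's mood with the listener's known preferences."""
    scene = describe_view(frame_jpeg)
    return " ".join(scene.mood_tags + user_taste)

print(music_query_for_view(b"<jpeg bytes>", user_taste=["indie folk"]))
# -> "festive cozy indie folk"
```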
Regional Launch and Language Options
While the integration with Spotify is debuting in English across various markets, including the UK, Australia, Canada, and India, the Conversation Focus feature is initially accessible only to individuals in the United States and Canada who are part of Meta’s Early Access Programme.
Additionally, Meta is broadening its language support in India by incorporating Telugu and Kannada. This builds upon the existing capabilities in English and Hindi, enabling a larger user base to interact with the assistant, capture media, and manage calls in their local language.

A representative from Meta stated that the update highlights the company’s dedication to “sustainable educational technology” and aims to make AI “ubiquitous” through functional, everyday devices.
This update is currently being rolled out to Early Access participants and is anticipated to reach a wider audience in early 2026.
