Meta smart glasses have received a major update: a new "Conversation Focus" mode can instantly turn them into a hearing-aid-like device, and they can even play matching Spotify music while you take in the scenery.
Meta recently confirmed that it has begun rolling out a new wave of software updates for its smart glasses line, including the Ray-Ban Meta and Oakley Meta series. The update brings two headline features: "Conversation Focus" and Spotify integration powered by multimodal AI. The rollout begins with users enrolled in the Early Access program.

A Savior in Noisy Environments: "Conversation Focus"

This feature essentially gives the glasses a hearing-aid-like ability to suppress ambient noise and enhance human voices. In crowded or noisy settings such as restaurants or parties, activating it lets the glasses' microphone array use beamforming to lock onto and amplify the voice of the person directly in front of the wearer while pushing background noise down. Meta's official description says the enhanced voice sounds slightly "clearer," making speech easier to follow. Users can trigger it with the voice command "Hey Meta, start Conversation Focus" or assign a touch shortcut to a long press on the temple.

AI DJ at Your Fingertips: Spotify Plays Whatever You See

The other notable update is deep integration with Spotify. Through multimodal AI, the glasses can now act as your personal DJ. Simply say, "Hey Meta, play a song to match..."
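Meta has not published implementation details for Conversation Focus, but the beamforming idea the article mentions can be illustrated conceptually. The sketch below is a minimal delay-and-sum beamformer (the simplest form of microphone-array beamforming, not Meta's actual algorithm): each channel is shifted by a steering delay so sound from the target direction adds in phase while off-axis noise averages out. All names and the two-mic toy setup are illustrative assumptions.

```python
import numpy as np

def delay_and_sum(signals, delays_samples):
    """Align each mic channel by its steering delay, then average.

    signals: array of shape (n_mics, n_samples).
    delays_samples: integer delay per mic compensating for the extra
    travel time of sound arriving from the target direction. The target
    signal adds coherently; uncorrelated noise is attenuated by averaging.
    """
    n_mics, n_samples = signals.shape
    aligned = np.zeros(signals.shape, dtype=float)
    for m, d in enumerate(delays_samples):
        # shift channel m left by d samples to undo its arrival delay
        aligned[m, : n_samples - d] = signals[m, d:]
    return aligned.mean(axis=0)

# Toy demo: a "voice" tone reaches mic 1 one sample later than mic 0,
# and each mic picks up independent background noise.
rng = np.random.default_rng(0)
t = np.arange(200)
voice = np.sin(2 * np.pi * 0.05 * t)
mic0 = voice + 0.5 * rng.standard_normal(200)
mic1 = np.roll(voice, 1) + 0.5 * rng.standard_normal(200)
out = delay_and_sum(np.stack([mic0, mic1]), delays_samples=[0, 1])
```

After alignment, averaging the two channels roughly halves the noise power relative to a single microphone while leaving the frontal voice intact, which is the basic effect the feature aims for.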







