Meta is rolling out a substantial software update for its Ray-Ban and Oakley smart glasses, pushing the devices further away from being just a camera on your face and towards a true AI-powered lifestyle assistant.
The headline addition is a new multimodal feature called Look & Play, developed in partnership with Spotify. The feature lets users simply look around and say, “Hey Meta, play a song to match this view.”
The glasses then use their onboard camera and Meta AI to analyze the visual scene—interpreting the mood of a rainy street, a sunset, or a busy coffee shop—and automatically cue up a relevant track on Spotify. It is one of the first consumer examples of visual AI recognition being paired directly with music discovery.
Garmin integration for athletes
Fitness enthusiasts receive perhaps the most practical upgrade. Users of the Oakley Meta Vanguard (and Ray-Ban Meta) glasses can now pair their eyewear with compatible Garmin watches to control workouts via voice.
Instead of fumbling with a watch mid-run, users can issue commands like “Hey Meta, create a 1-hour run at 14 minutes per mile,” or “Hey Meta, let’s do a bike ride.” This hands-free interaction addresses a common pain point for runners and cyclists, allowing them to start and modify tracked sessions without breaking stride.
Expanding accessibility
Meta is also making the glasses more usable for European customers. Voice controls for major music services—including Spotify, Apple Music, and Amazon Music—are now supported in French, Italian, German, Spanish, and Portuguese.
Additionally, new ‘shorter shortcuts’ allow users to trigger actions with single words (like “Photo” or “Music”) without the “Hey Meta” wake word. However, this feature is currently rolling out to English-language users first.
The update began rolling out on December 16, with the features arriving gradually over the coming days and weeks.