- The Ray-Ban Meta glasses just got new Live AI in Early Access
- Shazam music recognition is here too
- All new features are only available in the US and Canada
Meta is rounding out the year with a major update to its Ray-Ban smart glasses, adding two Live features it teased at Meta Connect 2024. It’s also adding Shazam integration to help you find the names of tunes you hear while wearing your specs.
The only downside of the awesome-sounding Live features is that they’re in early access, so expect them to be less reliable than your typical AI tools. They’re also only available to Early Access Program members in the US and Canada. You can enroll at Meta’s official site.
But if you are in the Early Access Program, you can now try Live AI and Live Translation.
Live AI is like a video version of Look and Ask. Instead of taking a quick snap, your glasses will continually record your view so you can converse with the AI about what you can see – or other topics. What’s more, while in a Live AI session you won’t need to say “Hey Meta” over and over again.
Meta adds that “Eventually live AI will, at the right moment, give useful suggestions even before you ask.” So be prepared for the AI to chime in with ideas without you prompting it directly.