Meta is rolling out live AI and Shazam integration to its smart glasses

The Ray-Ban Meta smart glasses already work well as a head-mounted camera and a pair of open-ear headphones, but now Meta is updating the glasses with access to live AI without the need for a wake word, live translation between several languages, and Shazam integration for identifying music.

Meta first demonstrated most of these features at Meta Connect 2024 in September. Live AI lets you start a “live session” with Meta AI that gives the assistant access to whatever you’re looking at and lets you ask questions without having to say “Hey Meta.”

If you need to keep your hands free to cook or fix something, Live AI keeps your smart glasses useful even while you focus on the task in front of you.

Live translation lets your smart glasses translate between English and French, Italian, or Spanish. If Live Translation is enabled and someone speaks to you in one of the selected languages, you’ll hear what they’re saying in English through the smart glasses’ speakers, or see it as a typed transcript in the Meta View app.

You’ll need to download a specific model for each language pair, and Live Translation has to be enabled ahead of time before it can act as an interpreter, but it feels much more natural than holding up your phone to translate something.

With Shazam integration, your Meta smart glasses will also be able to recognize songs playing around you. A simple “Meta, what song is this” prompts the glasses to listen through their microphones and identify whatever is playing, much like using Shazam on your smartphone.

All three updates move the wearable toward Meta’s ultimate goal: a true pair of augmented reality glasses that can replace your smartphone, a vision the company’s experimental Orion hardware previews in real life. Combining AI with AR and VR seems to be an idea many tech giants are betting on as well.

Google’s newest XR platform, Android XR, is built around the idea that a generative AI like Gemini could be the glue that makes VR and AR engaging. We’re still years away from any company being ready to replace your field of view with holographic images, but smart glasses look like a moderately useful stopgap in the meantime.

All Ray-Ban Meta smart glasses owners will get Shazam integration as part of Meta’s v11 update. For live translation and live AI, you’ll need to be part of Meta’s Early Access program, which you can join right now on the company’s website.

Meta has removed its AI-generated profiles from Facebook and Instagram, the company has confirmed, after the AI characters prompted widespread outrage and derision from users on social media.

The AI-generated profiles, labeled as “AI managed by Meta,” launched in September 2023 alongside the company’s celebrity-branded AI chatbots (also since discontinued). It appears Meta hadn’t updated any of these profiles in several months, and the pages went largely unnoticed until this week, following an interview the Financial Times published with Meta’s VP of generative AI, Connor Hayes.

In the interview, Hayes talked about the company’s goal to eventually fill its services with AI-generated profiles that can interact with people and “function in the same way as accounts do.” Those comments drew attention to the existing Meta-created AI profiles and, well, users weren’t exactly impressed with what they found.
