Ray-Ban Meta Gets Live AI, Real-Time Language Translation, Shazam

Meta has added new features to its Ray-Ban Meta smart glasses in time for the holidays via a firmware update that makes the glasses “the gift that keeps on giving,” per Meta marketing. “Live AI” adds computer vision, letting Meta AI see and record what you see “and converse with you more naturally than ever before.” Along with Live AI, Live Translation is available to Meta Early Access members. Speech in Spanish, French or Italian will pipe through as English (or vice versa) in real time as audio in the glasses’ open-ear speakers. In addition, Shazam support has been added for users interested in easily identifying songs.

“Live AI allows you to naturally converse with Meta’s AI assistant while it continuously views your surroundings,” The Verge writes, adding that “users will be able to use the Live AI feature for roughly 30 minutes at a time on a full charge.”

In addition to letting Meta AI see what you see, Live AI can record video of your line of sight, and you can talk to Meta AI about what’s there, like asking questions about restaurants or shops you’re strolling past. The company warns in a blog post, however, that its AI “might not always get it right” and says improvements are ongoing.

“You can ask questions without saying ‘Hey Meta,’ reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually Live AI will, at the right moment, give useful suggestions even before you ask,” Meta explains. For example, if you’re shopping for groceries you can ask Meta AI to provide recipes for items you’re looking at.

The real-time AI language translations can also appear as transcripts on a synced smartphone. But The Verge points out that you “have to download language pairs beforehand, as well as specify what language you speak versus what your conversation partner speaks.”

“Real-time AI video for Ray-Ban Meta was a significant focus of Meta’s Connect dev conference early this fall,” writes TechCrunch, noting that the tech was “positioned as an answer to OpenAI’s Advanced Voice Mode with Vision and Google’s Project Astra” and “allows Meta’s AI to answer questions about what’s in view of the glasses’ front-facing camera.”

TechCrunch adds that these updates make Meta “one of the first tech giants to market with real-time AI video on smart glasses,” noting that while Google says it plans to sell AR glasses, it hasn’t committed to a timeline.

Related:
Smart Glasses Won Me Over, and This Is the Pair That Did It, The Wall Street Journal, 12/17/24
