At Connect 2024, Meta announced several new features coming to Ray-Ban Meta smart glasses. Over the past few months, it has rolled out many of them, including reminders, voice search for Spotify and Amazon Music content, Be My Eyes integration, adaptive volume, and more.
Today, Meta announced the rollout of the v11 software update, which brings another batch of features to Ray-Ban Meta smart glasses.
With the new live AI feature, Meta AI can continuously see what users see and hold natural conversations about it, providing real-time, hands-free assistance. Users can ask questions without saying “Hey Meta” and can interrupt the AI to ask follow-up questions more easily. Meta says that, in the future, live AI will be able to offer useful suggestions before users even ask.
With the new live translation feature, Ray-Ban Meta smart glasses can translate speech in real time between English and Spanish, French, or Italian. When someone speaks in one of those languages, users hear the English translation through the glasses’ speakers and can also view it as a transcript on their phone.
Both live AI and live translation are now available to Ray-Ban Meta glasses owners in the US and Canada who are part of Meta’s Early Access Program. Interested users can enroll here.
With Ray-Ban Meta smart glasses, users can now just say, “Hey Meta, Shazam this song” for hands-free music recognition. This feature is available in the US and Canada.
These new features make Ray-Ban Meta smart glasses more useful and versatile. With the addition of live AI and live translation, Meta is pushing the boundaries of what smart glasses can do.