Meta has updated its second-generation Ray-Ban smart glasses, created in partnership with EssilorLuxottica, with a new round of styles, including the Skyler frame, and improved their usability — just in time for sunglasses season.
The improvements to the Ray-Ban Meta glasses and sunglasses include better integration with Apple Music, support for multimodal AI, and compatibility with WhatsApp and Messenger that lets users stream what they're seeing directly from the glasses themselves.
Multimodal AI means the device's AI assistant can now help you with information presented to it in multiple forms simultaneously: images, audio, video, and text. This allows the glasses to, for example, respond to your voice command while analyzing the scene in front of you, in real time.
Meta teased early access to multimodal AI support soon after launching the smart glasses, but this latest update is rolling out to all wearers of the glasses. Timed after the not-so-smooth Humane Ai Pin launch and with other devices like Rabbit’s R1 hovering on the horizon, Meta is bringing multimodal AI to a wearable device that doesn’t claim to do everything under the sun.
Instead, Meta is focusing the wearable on features wearers of smart glasses will already be used to, while adding new capabilities such as live video integrated into common messaging apps. The ability to share your view on WhatsApp and Messenger is completely hands-free, letting you show exactly what you’re seeing in real time. Not having to point a device at whatever you’re looking at definitely brings a new dimension to video calling.
The $329 smart glasses come with a built-in ultra-wide 12 megapixel (MP) camera that is integrated with Meta AI with Vision for augmented reality (AR) functionality. This includes a translation feature, which allows you to simply look at text in a foreign language and see it translated, as well as a landmark identification capability, announced earlier this year and demonstrated by Mark Zuckerberg himself in an Instagram post.
Also: Ray-Ban Meta smart glasses get hands-free Apple Music integration and more
Some practical uses of live view sharing could include showing someone product selections at a supermarket, while a more exciting application could be sharing your epic view on a hike or vacation. To share, just double tap the physical capture button on the glasses, which is possible even if you haven’t linked your WhatsApp account to the Meta View app.
Users can activate the camera to take photos and video with voice commands — like, "Hey Meta, take a photo" — instead of through a touchscreen or visual UI, and harness AI for a seamless, hands-free experience.
Also: 10 Threads features you should try in Meta’s Twitter alternative
Owners of the Ray-Ban Meta smart glasses only need to update their glasses in the Meta View app to access the new features. Meta clarified in a blog post that the new features are rolling out starting on April 24, so updates might not be immediately available for all users.