Meta’s ‘Orion’ Glasses Offer Its First True AR Experience

As long as we’ve had VR headsets, we’ve been waiting for a company to crack the code on making those famed altered reality capabilities small and light enough to fit onto a pair of glasses. Meta claims it’s the first one to get there with “Orion,” a prototype pair of thick-rimmed AR glasses that can show content in front of your eyes and can be controlled with your voice, gestures, or—potentially—even your brain.

Meta CEO Mark Zuckerberg brought the shades out onstage at Meta Connect 2024, calling them “the first prototype of full holographic AR.” Orion projects its picture onto a waveguide-type display housed in the left-hand lens. The battery sits in the arm of the glasses, which is why the arms fold partway down their length rather than hinging right next to the lenses.

The glasses are akin to devices like those made by XReal, though these will use Meta’s in-house software and will ostensibly offer a range of capabilities, from simple gaming to watching content in front of your eyes. At one point in the demo, Zuckerberg showed how you could present a “hologram” of another person in front of you, like Zordon from Power Rangers suddenly appearing out of your dresser.

The glasses offer a 70-degree field of view, with a micro-LED projector beaming images onto the glasses’ silicon carbide lens and speakers housed in a magnesium alloy frame. Orion includes a variety of sensors for eye and hand tracking, but the device also needs to be paired with an electromyography (EMG) wristband. This separate device includes several EMG sensors and an onboard processor for interpreting their signals. Essentially, the wristband is supposed to read subtle hand gestures and translate them into input for the glasses.

While they’re relatively heavy at 98 grams, the glasses’ processing isn’t happening on-device. Instead, that’s handled by a wireless “compute puck.” Meta said that while the glasses handle the visuals, all the hand tracking, eye tracking, and AR algorithms run on the compute puck “to keep the glasses as lightweight and compact as possible.”

The company shared more videos of other demos using AR capabilities. One involved using Orion’s vision capabilities combined with AI to read ingredients laid out on a table; the glasses could then overlay information tags over each food product. Along with some of the same vision and audio features available on the Meta Ray-Bans, the AR glasses could let you access social apps like Instagram in multiple windows displayed in front of you.

Zuckerberg had already spilled the beans that we’d see the company’s first true AR glasses, and at its event, Meta showcased only a taste of what it’s working on. It’s not going to be available for a while, it seems: Zuck and Meta appear keen on keeping these glasses under wraps while they work to improve the design. While we wait for a true consumer-end product, we’ll have to make do with the existing Quest headsets.

This is a developing story. We will update the piece as more information becomes available.


