I wore the new Snap Spectacles, and the AR glasses felt like the future



Snap Spectacles

Sabrina Ortiz/ZDNET

Although virtual reality (VR) headsets allow for immersive, unique experiences, a world where we all walk around wearing massive headsets clearly isn't feasible. Glasses that display augmented reality (AR) experiences to enhance your everyday life are far more compelling, and the Snap Spectacles are a step toward that. 

Also: Qualcomm’s new chipset that will power flagship Android phones makes the iPhone seem outdated

I had the opportunity to test out the Spectacles at Qualcomm’s Snapdragon Summit this week, and the experience was a fun, exciting glimpse into what everyday AR glasses could feel (but not so much look) like in the near future.

Wearing the glasses 

The fifth-generation Snap Spectacles are built to have the same form factor as regular glasses while packing in advanced hardware: four cameras, two in the front and two below for hand tracking; two Qualcomm Snapdragon processors, one on each side, for computer vision and for running augmented reality experiences; stereo speakers; six microphones; and a 37-pixels-per-degree stereo waveguide display for sharper visuals. 

Also: Meta’s new 512GB Quest 3 deal is one of the best VR deals right now

Naturally, this hardware combination makes the Spectacles much bulkier than regular glasses. However, considering how much tech is inside, including a built-in battery, they're reasonably compact. They're also relatively comfortable to wear for limited periods, but after about 10 minutes, the glasses' weight became much more noticeable to me. 

Snapchat Spectacles

Sabrina Ortiz/ZDNET

Unlike VR headsets like the Meta Quest 3 and Apple Vision Pro, which aren't practical for everyday outdoor use, the Snap Spectacles have what's called a dynamic display brightness function, which lets you seamlessly toggle the digital overlays on or off. When it was turned off, I could see the world comfortably through the prescription inserts that Snap had added before my demo. Notably, my view was not much different from when I wore my actual glasses, even once the augmented reality experience resumed.

User experience 

The glasses run on Snap OS, and the UI and navigation are fairly intuitive. Opening your palm reveals a menu, which you tap to exit an application and later return to it. A simple pinch gesture controls most actions, and I got the hang of it within a minute.

With the headset, you can access various applications, including browsing the web, mirroring your phone, generating 3D objects, watching videos, playing games, and more. The most impressive part is being able to do all of that in such a compact form factor. (Obviously, it would be even better if the glasses looked and felt smaller.)

Also: I tried Meta’s Horizon Hyperscape demo: Welcome to the metaverse’s first holodeck

I demoed a series of applications, including a fingerpainting experience where I could draw in space by pinching my fingers, an application where I could generate any object I wanted by using my voice, and a game where I had to punch blocks, which you can watch in the video below.

The visuals of the experiences were surprisingly detailed, maintaining a crisp resolution and vivid coloring consistent with high-end AR headsets I have tried in the past. The system seamlessly integrated with the physical environment, accurately mapping and recognizing spatial elements in real time.

For instance, during an interactive experience where I could point to elements of my environment and have plants grow, the glasses' spatial awareness was fairly accurate. I could point at distinct surfaces, such as walls, tables, and floors, and dynamically generate context-aware visuals. The system's object recognition and environmental understanding ensured accurate visual placement and interaction across the 3D space.

The new Spectacles have a more ambitious goal that involves cross-device collaboration. For example, you can share a space with someone else wearing the glasses and work together. In my demo, I was able to generate and interact with objects alongside another user in the same room.

Availability 

The Snap Spectacles are currently only available to developers through the Spectacles Developer Program, which lets them build, play with, and test experiences for the glasses. The subscription costs $99 per month, with a 12-month commitment. 

As the glasses aren’t polished enough to go to market, the company’s biggest challenge is to find a way to pack all the technology into a smaller form factor. 

Also: 4 ways Android cameras are about to get better, thanks to Qualcomm – even for dogs

For example, the Ray-Ban Meta smart glasses have become Ray-Ban's best-selling product in 60% of its stores across Europe, Africa, and the Middle East, according to reports, likely because their compact form factor is nearly identical to regular glasses. The Apple Vision Pro has had a different reception, with the tech giant reportedly cutting back on production due to weak sales. 

For Snap to attract users, the company will need to reduce the bulk of its glasses to provide more comfortable wear. However, by allowing developers to purchase the Spectacles, Snap can collect feedback to help progress the technology while getting a head start on competitors in the AR glasses market. 

Disclosure: The cost of Sabrina Ortiz’s travel to Maui, Hawaii, for the Snapdragon Summit was covered by Qualcomm, a common industry practice for long-distance trips. The judgments and opinions of ZDNET’s writers and editors are always independent of the companies we cover.




