Snap, the social media company that has branched out into AR glasses development, has unveiled Snap OS 2.0, the next generation of its smart glasses software.
After going hands-on at Snap’s London offices recently, I’m optimistic about the planned 2026 launch of the company’s first consumer-ready smart glasses in eight years.
The original Snap Spectacles from 2016 marked a pivotal shift for Snap, until then a software-oriented company: they moved the camera from the hand to the face, letting users capture and augment their surroundings before sharing them widely.
Following several iterations of this concept, Snap pivoted to a developer-focused model, transforming Spectacles from simple cameras into self-sufficient computers worn on the face, using their cameras to perceive the environment and track objects in 3D.
Last year’s revamped Spectacles introduced enhanced processing capabilities and marked the arrival of Snap OS. The company used the time since to refine core experiences with the help of developer partners.
Going hands-on with Snap OS 2.0 let me see the significant improvements made in recent months, and offered a clearer glimpse of what the 2026 Spectacles will bring.
Below, I’ve highlighted three features I particularly enjoyed, along with one area where Snap still needs improvement ahead of the full release. I’ve also been hands-on with the latest Ray-Ban Meta (Gen 2) smart glasses.
Exceptional Connected Sessions
Snap has been open about the major enhancements made between Snap OS 1.0 and 2.0 over the past year, but one standout experience I tested was the new Connected Sessions feature.
With the help of Snap augmented reality engineer Andreas Müller, I joined a Connected Session he was hosting from his own Spectacles, paired simply through our physical proximity. Once connected, we could walk around and collaboratively paint 3D shapes in mid-air, with impressively low latency in both hand-tracking and cross-device communication.
Although the demo’s concept was straightforward, and I’ve experienced similar multiplayer VR sessions online, watching two AR users physically interact in the same space while both pairs of Spectacles tracked their movements in real time was genuinely more captivating.
The true potential lies in how this feature could elevate Snap’s AR capabilities. With support from the 400,000 developers in its network, Connected Sessions could revolutionize local collaborations—and multiplayer gaming—in ways we’ve only seen in tech demos before.
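To give a rough sense of the pattern behind a demo like this, here’s a hypothetical TypeScript sketch of a colocated painting loop. Every name in it is invented for illustration; Snap’s actual Spectacles APIs and session plumbing will differ.

```ts
// Hypothetical sketch of a colocated AR painting session. None of these
// names are Snap's actual API; they illustrate the general pattern:
// join a shared session, broadcast local strokes, render remote ones.

type Vec3 = { x: number; y: number; z: number };

interface Stroke {
  userId: string;
  points: Vec3[]; // positions in a coordinate frame shared by both devices
  color: string;
}

// Stand-in for the platform's session layer. A real implementation would
// discover a nearby host, align both devices to a common coordinate frame,
// and exchange strokes over the network.
class SharedSession {
  private handlers: Array<(stroke: Stroke) => void> = [];
  constructor(public localUserId: string) {}

  send(stroke: Stroke): void {
    // In this stub, "remote" delivery is just a local echo.
    this.handlers.forEach((h) => h(stroke));
  }

  onReceive(handler: (stroke: Stroke) => void): void {
    this.handlers.push(handler);
  }
}

function drawStroke(stroke: Stroke): void {
  // A real renderer would draw the stroke into the AR scene.
  console.log(`Drawing ${stroke.points.length}-point stroke from ${stroke.userId}`);
}

const session = new SharedSession("guest");

// Render strokes painted by other participants as they arrive.
session.onReceive(drawStroke);

// Broadcast a locally painted stroke, e.g. captured from hand-tracking.
session.send({
  userId: session.localUserId,
  points: [
    { x: 0.0, y: 1.2, z: -0.5 },
    { x: 0.1, y: 1.25, z: -0.5 },
  ],
  color: "#ff4d00",
});
```

The hard parts, of course, are exactly what the stub glosses over: proximity-based pairing and keeping both devices aligned to one shared coordinate frame.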
Enhanced Spotlight, Browser, and Gallery Features
Snap OS 2.0 introduces three significant features: Spotlight, Browser, and Gallery. Spotlight lets users engage with vertical content just as they would in the Snapchat mobile app. Even with the interaction moving from a phone’s display to a virtual interface in 3D space, I found swipes, taps, and playback to be smooth and well-synced.
Snap announced WebXR support in late 2024, making it easier for developers to create AR experiences that integrate seamlessly with the new Browser app. Users can type on a floating AR keyboard or use speech-to-text for searches, with navigation being intuitive and gesture-based, much like Spotlight.
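For context, WebXR is the open web standard that lets a page request an AR session directly from the browser. Below is a minimal sketch using the standard WebXR Device API; it isn’t Snap’s code, and the #enter-ar button is an invented example element.

```ts
// Minimal WebXR sketch: feature-detect and request an immersive AR session
// in a WebXR-capable browser. This is the standard WebXR Device API, not
// anything Spectacles-specific.
async function startARSession(): Promise<void> {
  // navigator.xr is normally typed via WebXR type definitions
  // (e.g. @types/webxr); the cast keeps this sketch self-contained.
  const xr = (navigator as any).xr;
  if (!xr) {
    console.log("WebXR is not available in this browser.");
    return;
  }
  if (!(await xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar sessions are not supported here.");
    return;
  }
  const session = await xr.requestSession("immersive-ar");
  session.addEventListener("end", () => console.log("AR session ended."));
  // A real experience would now set up a rendering layer and drive frames
  // through session.requestAnimationFrame(...).
}

// Browsers require a user gesture before granting an XR session;
// "#enter-ar" is a hypothetical button for this example.
document.querySelector("#enter-ar")?.addEventListener("click", () => {
  startARSession();
});
```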
The Spectacles’ current resolution and brightness made it easy to read text on web pages, even in the brightly lit demo session, and video playback, including content from YouTube, performed well, provided you’re okay with blacks appearing somewhat transparent.
Finally, Gallery offers a streamlined way to review content captured through Snapchat or the Spectacles, presented in a chronological order that I could easily navigate. Support for stereoscopic content meant I could also enjoy 3D media.
Remarkable AI Capabilities in Sound and Vision
Beyond the immersive AR experiences, the latest AI features in the Spectacles offer genuinely useful interactions with the surrounding environment.
I tested the Spatial Tips feature by looking at a skateboard on a shelf and asking how to perform an ollie. The glasses then projected step-by-step instructions onto the board itself, matched to the movements needed to land the trick.
Super Travel let me hold up a menu from a Chinese restaurant and draw a bounding box with my hand, triggering a translation of the enclosed text into English, overlaid beside the physical menu; perfect for verifying whether I was about to order chow mein or silkworms.
The Need for More Streamlined Spectacles
The form factor is a crucial consideration in the race to build top-tier face-worn AR devices. Snap’s decision to keep all processing on the glasses themselves, via dual Snapdragon chips, makes the Spectacles self-sufficient but somewhat unergonomic.
While I found them reasonably comfortable, they look awkward in both proportion and style. The frames sit unusually far from the face, particularly where the hinges meet the arms, and the arms end in bulbous tips that likely house the batteries.
None of this diminishes the impressive feat of miniaturizing so much technology into a pair of glasses. But if Snap wants to turn the remarkable experiences I’ve had into eyewear that people are willing to wear in public, a more compact design is essential.
That said, the technological advancements within the device are certainly exciting.