
I didn’t think I’d like Snap’s new Spectacles this much


Key Takeaways

  • Snapchat’s new Spectacles smart glasses are standalone AR glasses meant for developers first.
  • Spectacles offer intuitive controls and high-resolution image quality in a lightweight design.
  • Snapchat partners like LEGO and Niantic are creating compelling experiences using Spectacles.



Snap has continually invested in its Spectacles smart glasses line. Announced at its recent Partner Summit, the fifth-generation Spectacles are full-fledged, see-through, standalone AR glasses. I recently went to New York City to try them out and see if the company could get me one step closer to becoming Tony Stark.

Previous Spectacles have been a pseudo-extension of the Snapchat platform. Much like Meta's Ray-Ban glasses, Spectacles historically featured built-in cameras for recording video and snapping photos, allowing the wearer to share content directly with their Snapchat followers. However, with a new emphasis on AR Lenses (Snapchat's version of an app), the MyAI platform, and more, Snapchat has introduced a new pair of Spectacles. At launch, the new Spectacles are only available to developers willing to pay $99 per month (on a 12-month commitment). The idea is to get the new Spectacles into developers' hands first so they can build a compelling app marketplace ahead of a future consumer launch.


After spending an hour with the Spectacles AR smart glasses, I continue to think about them regularly. I find myself mulling over the potential these standalone glasses may have in the future; the experience really was that good.

Snapchat’s Spectacles are surprisingly light despite being standalone

Spectacles aren’t exactly subtle, but that’s expected for tech that’s still evolving

The fifth-generation Spectacles from Snapchat are built into a thick pair of black frames. They aren't the most fashion-forward wearables on the market. However, I was surprised to see that their arms can fold, making them more compact when not in use. Weighing 0.49 lbs, the Spectacles are an entirely see-through pair of AR glasses with no physical tethering needed.


Speaking with Sophia Dominguez, the Director of AR Platform at Snap, I was told that once the Spectacles are synced up with your smartphone and Snapchat account, you don't need to carry your phone to use them. Everything is built into the glasses, leveraging the Spatial Engine and SnapOS. The new Spectacles have an estimated 45-minute battery life, with the battery built into the frame near the stems of the arms. While that isn't a lot of runtime, 45 minutes is still impressive given where standalone AR technology is today.

Spectacles feature the company's proprietary nano-scale waveguide technology, which powers their AR functionality and allows users to see and interact with AR renderings via LCoS projectors. Unlike other AR or mixed-reality products, Spectacles don't require much calibration or custom fitting. Users are encouraged to take measurements of their face in the app to calibrate the glasses' lenses, but otherwise, using Spectacles doesn't require you to measure the surrounding space, inside or out. Much of this is thanks to Snap's partnership with Qualcomm and the glasses' dual Snapdragon system-on-a-chip (SoC) architecture.


The new Spectacles offer a resolution of 37 pixels per degree, producing shockingly high image quality under moderate indoor lighting, and support up to 2,000 nits of brightness. During my demo, I was urged to step out onto a balcony with the sun beaming down on my face. On this New York City terrace, I was surprised when the Spectacles automatically adjusted their lens dimmers while maintaining a crisp image. The smart glasses offer a 46-degree field of view, and while experimenting with a handful of Lenses, it felt at times as if I was interacting with a 100-inch display, all while keeping a clear view of the real-world environment.

Because Spectacles are designed to be used out in the real world, Snapchat says privacy remains a focus. As with earlier models of Spectacles, a small yet visible LED blinks when content is being captured, and a chime plays when capturing a photo or video.


Spectacles feature intuitive controls across plenty of apps

Snapchat is partnering with LEGO, Niantic, and ILM Immersive for Spectacles


During my one-hour demo, Dominguez and the Snapchat team walked me through several apps and experiences. In one, Spectacles mapped the surrounding room in real time, rendering a large school of fish from floor to ceiling. The demo highlighted the accuracy of Spectacles' mapping system, which boasts 13-millisecond motion-to-photon latency. I could walk up and put my hands out, and an individual fish would react, scurrying away with a surprising degree of realism. Other demos let me hold my hands out as if I were drawing on a canvas, causing flowers and plants to grow on surfaces around me. SnapOS and the Spatial Engine analyzed each surface, changing the flora depending on whether I was "drawing" on the ceiling, a wall, a table, and so on. The experience was fluid, functional, and, most importantly, easy to wrap my head around.


These experiences were just the start. The real magic began when we saw how partnering developers were using Spectacles. Niantic, for instance, is looking to the AR smart glasses to broaden what it's doing with Peridot. While Peridot is available on mobile, Niantic saw an opportunity to develop an AR Lens for its game on Spectacles. Within the Peridot Lens, a small pet from the game was spawned. Without any calibration on my end, I could direct where the pet would go with my hand and some simple gestures. It would walk around and jump onto a nearby table outside, and I could even reach out and pet it accurately. LEGO, meanwhile, partnered with Snapchat to bring Bricktacular to Spectacles. In a short demo, the Lens rendered an assortment of bricks onto a surface. Much like in real life, I began picking up the bricks piece by piece with my hands. I could spin and manipulate each one, placing one onto another to build a small house. Once I was out of bricks, I could use the built-in microphones to create new bricks in specific colors via voice commands.



Across all the Lenses I experienced, I was taken aback by how natural they felt. Reaching out and interacting with these virtual assets, I felt Snap and its partner developers had authentically captured the sensation of bringing the digital into the real world. It's a bit of a parlor trick, but if something is off or doesn't feel natural, the experience is ruined. I've played around with several smart glasses and mixed-reality headsets. At best, many have a learning curve you grow accustomed to over time. At worst, navigation and the UI are just flat-out tiresome to wrap your head around. Spectacles are pretty user-friendly; the bulk of the menu system is quite literally in the palm of your hand.


Opening up my left palm made the menu system appear. Pinching, as if I were popping a piece of bubble wrap, brought up the Lens menu, and pinching on a tile launched a new experience. Even the moment-to-moment interactions were intuitive and easy to understand. Of course, there were plenty of minor hiccups, like response times running slightly long or other small errors. However, given that Spectacles are launching for developers first, I wasn't all that let down.

Spectacles are launching for developers first as part of a subscription model

For $99/month, developers can begin playing with Lens Studio 5.0



While my time with Spectacles was a fairly positive experience, there's no clear timeline for when the AR smart glasses will be in the hands of consumers. Snap is taking a pragmatic approach to release. Rather than rushing the device out the door, Snap is letting developers play around with the smart glasses and the Snap ML Engine to create Lenses and ML models. With the Spectacles Interaction Kit, developers can also take advantage of cloud-hosted multimodal AI models thanks to a partnership with OpenAI, bringing new models into their Spectacles experiences to provide further context for what users see, hear, and say.
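To give a rough sense of what building for the platform looks like, here's a minimal sketch of the kind of TypeScript component a developer might write in Lens Studio 5.0 to make a virtual pet trail the user's hand, loosely inspired by the Peridot demo. The decorator-based component structure follows Lens Studio's documented TypeScript scripting pattern, but the hand-tracking hookup (the handTarget object below) is an illustrative stand-in rather than the Spectacles Interaction Kit's actual API.

```typescript
// A minimal sketch of a Lens Studio 5.0 TypeScript component.
// Assumption: "handTarget" is a scene object driven by hand tracking
// (e.g., via the Spectacles Interaction Kit); the exact SIK wiring is
// not shown here and is purely illustrative.
@component
export class PetFollower extends BaseScriptComponent {
  // The pet model placed in the scene (e.g., a Peridot-style creature).
  @input
  pet: SceneObject;

  // A scene object positioned at the user's hand; hypothetical hookup.
  @input
  handTarget: SceneObject;

  // How quickly the pet closes the gap each second.
  @input
  followSpeed: number = 2.0;

  onAwake() {
    // Run the follow logic every frame.
    this.createEvent("UpdateEvent").bind(() => this.onUpdate());
  }

  private onUpdate() {
    const petTransform = this.pet.getTransform();
    const current = petTransform.getWorldPosition();
    const goal = this.handTarget.getTransform().getWorldPosition();

    // Ease the pet toward wherever the user's hand is pointing.
    const t = Math.min(1.0, this.followSpeed * getDeltaTime());
    petTransform.setWorldPosition(vec3.lerp(current, goal, t));
  }
}
```

In practice, a developer would attach a component like this to an object in their Lens project and let the Interaction Kit supply the hand position, but the gist is the same: a few dozen lines of script can drive the kind of hand-directed behavior I saw in the demos.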



“Our mission is to become the most developer-friendly platform in the world,” Dominguez said. “Part of that is coming from our investments in Lens Studio over the last two years. When we rewrote all of Lens Studio to make sure that anyone who’s building for Spectacles could have a great developer experience. We’ve only been working with a very small number of developers and partners to do that. I think we’ve gotten a cold start on the content experiences.”

The hope is that the fifth-generation Spectacles eventually make their way to the consumer market. Snap plans to watch how the market and potential consumers react to Spectacles and how its developer program can suit users' needs. Under the umbrella of Snapchat and its MyAI technology, Spectacles could be a supplemental experience for the 300 million people using AR Lenses on Snapchat. However, whether it's entertainment, productivity, or contextual interactions using AI, Spectacles will need a compelling lineup of Lenses and experiences to be widely adopted. They're not quite there yet, but I'm eager to see what developers can do with this rather impressive pair of AR smart glasses.




