Where are my AR glasses?



Meta founder and CEO Mark Zuckerberg recently said that hundreds of millions of people might wear AR glasses. (He was speaking with Nvidia CEO Jensen Huang at this year’s SIGGRAPH conference.)

I have to say, I agree; I’ve made similar predictions in this space over the past couple of years. I think AR glasses, first without and then with holographic images projected onto the lenses, will be the next big thing in consumer technology.

I also agree with Zuckerberg’s stated approach to the category. During that same conversation, he said: “Let’s constrain the form factor to just something that looks great. And within that, let’s put in as much technology as we can.”

That’s the opposite of the approach most AR glasses makers take. The makers of the TCL RayNeo X2, Vuzix Ultralite, Rokid Max, XREAL Air and others start with a different question: What’s the best visual experience we can ship at a reasonable price? They sacrifice appearance for quality imagery and a lower price, and for mainstream acceptance that’s a fatal sacrifice.

The result tends to be something that’s great to use but that nobody wants to be seen wearing outside.

As Google learned with Google Glass, socially unacceptable glasses will never go mainstream. 

Ray-Ban Meta glasses, meanwhile, Meta’s only market success in hardware to date, follow the Zuckerberg model. (Zuckerberg claimed on a recent earnings call that Ray-Ban Meta “demand is still outpacing our ability to build them.”) The glasses look like normal glasses. And to make that work at a low price (starting at $300), there is no visual element in the lenses. All output is audio. The camera enables multimodal input (photos, not video), but there is no light engine, no special lenses and no need for a bigger battery.

Still, Meta is clearly working on holographic visual AR glasses. The company is working on custom chips and partnering with Luxottica on getting the form factor right. Rumors circulating in Silicon Valley say Meta could publicly demonstrate AR glasses as early as October. 

Another interesting player is Brilliant Labs, which sells its Frame AI glasses. In theory, these sound fantastic. The glasses feature a microOLED display with a 20-degree diagonal field of view in the right eye. Frame connects to familiar AI tools like ChatGPT, Whisper and Perplexity. A forward-facing camera enables live translation, visual analysis and internet queries. The open-source design allows developers to customize and enhance the glasses’ functionality using provided tools and plugins; a sketch of the idea follows below. And they’re surprisingly inexpensive: $349 for standard lenses, $448 for prescription lenses.
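The “plugins” idea is worth a quick illustration. The following is a minimal, hypothetical sketch of what a gesture-triggered plugin for open glasses like these might look like; the `Glasses` class, `plugin` decorator and `simulate` method are all invented for this example and are not Brilliant Labs’ actual SDK:

```python
# Hypothetical sketch of a smart-glasses plugin system.
# None of these names come from Brilliant Labs' real SDK; the
# Glasses stub below only illustrates an open, extensible design.

class Glasses:
    """Stand-in for an open-source smart-glasses SDK object."""
    def __init__(self):
        self.plugins = {}

    def plugin(self, gesture):
        """Register a handler to run when the wearer performs a gesture."""
        def register(func):
            self.plugins[gesture] = func
            return func
        return register

    def simulate(self, gesture, *args):
        """Pretend the hardware fired a gesture event."""
        return self.plugins[gesture](*args)

glasses = Glasses()

@glasses.plugin("double_tap")
def translate(text_seen_by_camera):
    # A real plugin would call a translation model here; a fake
    # lookup table keeps the sketch self-contained and runnable.
    fake_translations = {"hola": "hello", "bonjour": "hello"}
    return fake_translations.get(text_seen_by_camera, "?")

print(glasses.simulate("double_tap", "hola"))   # -> hello
```

The appeal of an open design is exactly this: the wearer, not the vendor, decides what a double tap does.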

Frame has clear downsides as well. The glasses lack built-in speakers and require a connection to a smartphone for full functionality. Battery life ranges from two to six hours. But the biggest downside is their appearance. While they’re in the ballpark of ordinary-looking glasses, the round frames stand out and draw attention in a bad way. Frame is interesting for curious makers and tinkerers, but the combination of poor battery life and dorky appearance makes it clear that these are not glasses an office professional could wear at work.

Both startups and major tech companies are in a hot race to get to market with AR/AI glasses that look like regular glasses. That includes Apple, Google, Microsoft and dozens of other companies.

Which raises the questions: Where are the glasses? Why is it taking so long?

Components are too big, power-hungry and expensive

It’s possible right now to build great AR glasses. They would look like regular glasses and project holographic images anchored in physical space. And a camera would hoover up video for multimodal input to advanced AI. That’s the good news.

The bad news is that the battery would last maybe 45 minutes, and they would cost, oh, say, $10,000 a pair.

I’m making those numbers up. The point is that we have the technology to create great AR glasses, but we need component shrinking, cost reductions and power efficiency on a whole new scale to make them viable in the market.

Huge strides have been made in the miniaturization of components, but more work remains. AR glasses need to fit all those electronic components into a regular-size frame. Even more difficult is keeping the weight down.

And while the glasses must get smaller and lighter, their batteries must store more energy and deliver more power.

Even more challenging: Batteries need high energy density to provide sufficient power for the displays, processors and sensors in a compact form factor. Heat management is also an engineering challenge, because the batteries can’t get hot; they’ll sit right up against users’ temples. Companies are exploring advanced materials, like solid-state electrolytes and nanomaterials. Big benefits could come from flexible and curved batteries that integrate better into eyeglass frames. And technologies like solar cells or kinetic energy harvesting could help extend battery life.
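To see why energy density is the binding constraint, it helps to run some numbers. Here’s a minimal back-of-envelope sketch in Python; every figure in it (cell energy density, battery weight, per-component power draw) is an illustrative assumption, not a measured spec:

```python
# Back-of-envelope battery-life math for hypothetical AR glasses.
# Every figure here is an assumption for illustration, not a spec.

WH_PER_GRAM = 0.25        # rough energy density of lithium cells (~250 Wh/kg)
BATTERY_GRAMS = 4         # what a glasses temple might plausibly accommodate
capacity_wh = WH_PER_GRAM * BATTERY_GRAMS   # ~1.0 Wh

# Assumed continuous power draw, in watts
draw_w = {
    "light engine (display)": 0.4,
    "processor + sensors":    0.5,
    "camera + radios":        0.3,
}

total_w = sum(draw_w.values())
hours = capacity_wh / total_w

print(f"Battery capacity: {capacity_wh:.1f} Wh")
print(f"Total draw:       {total_w:.1f} W")
print(f"Runtime:          {hours * 60:.0f} minutes")   # ~50 minutes
```

Under those made-up but plausible numbers, you land right around the 45-minute runtime guessed at above; doubling runtime means roughly halving the total draw or doubling the battery’s size, which is exactly the trade-off these material advances are chasing.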

There are also image-quality hurdles to overcome. Light engines, the part of AR glasses that projects images onto the lenses, tend to suffer from light leakage (where other people can see your screen and your glasses “light up” in low light), ghosting, rainbow artifacts, low resolution and more.

What’s interesting about the light-engine component industry is that the major players, a group that includes Avegant, VitreaLab, Lumus and TriLite Technologies, are all working on the same problems, but with radically different approaches. For example, Avegant’s use of various display technologies and VCSELs (vertical-cavity surface-emitting lasers) contrasts with VitreaLab’s focus on quantum photonics and 3D waveguides. Lumus’s reflective waveguide technology differs from both, offering a unique method of image projection. TriLite’s laser-beam-scanning technology represents yet another distinct approach.

It will be interesting to see how these approaches shake out, and which offers the best combination of price, performance, size and weight.

So when do we all get all-day, everywhere AR glasses?

Following Zuckerberg’s maxim — “Let’s constrain the form factor to just something that looks great. And within that, let’s put in as much technology as we can” — we could see something creative from a major player soon.

As we learned with Ray-Ban Meta glasses, by making the right trade-offs, it’s possible to get a great, wearable product at low cost. The key now is adding a holographic display. 

One cost-cutting measure will be a display in one eye instead of two. Also, using visual elements sparingly and relying mainly on an audio interface might solve the battery problem.

Another possibility: What if the display showed only text and not pictures? I think most people would enjoy what might look like subtitles, offering language translation, contextual information, turn-by-turn directions and other information. Pictures and graphics can wait if that improves battery life and cuts down on light-engine problems like light leakage.

Another shortcut is to offer just a heads-up display, rather than a display showing text and objects anchored in physical space; think Google Glass rather than Apple Vision Pro.

And yet another point to consider: AR glasses with holographic image capability don’t have to be as inexpensive as today’s audio-only AI glasses. Ray-Ban Metas start at $300, but the right price for a great pair of AR glasses might be as much as $1,000.

The bottom line is that amazing AR glasses that look like ordinary eyeglasses are still coming. But truly high-quality, no-compromise devices won’t arrive anytime soon. It may take five years for more advanced batteries, light engines, lenses and other components to be available at reasonable prices.

In the meantime, the platform will benefit from creative trade-offs that provide something useful and appealing, though not perfect.

With the right combination of components, persistent access to AI and glasses people really want to wear in public, Zuckerberg’s prediction about hundreds of millions of people wearing AR glasses might well turn out to be conservative.


