Tuesday, July 15

These Are the Biggest Rumors for the Next Generation of Meta Smart Glasses




As a devotee of Meta Ray-Ban smart glasses (seriously, I love the things), I’ve been squinting at every leak and offhand Zuckerberg comment to try to figure out what’s coming next—though not all developments are equal. The Meta Oakley smart glasses, which are currently available to preorder, will have a longer battery life and a better camera, but that’s more like a 1.5 upgrade than a next generation leap. So, let’s dive into the most intriguing leaks, educated guesses, and flat-out wishes for next-gen Meta smart glasses.

Meta’s going in two directions with its smart glasses: audio-focused glasses made in partnership with eyewear brands like Ray-Ban and Oakley, and the more cutting-edge augmented reality glasses. I’ve compiled rumors about both.

Let’s start with the big swing: Orion. Officially unveiled in September 2024, Orion is Meta’s prototype smart glasses platform aimed at combining AR and AI in a pair of comfortable-to-wear spectacles. The goal is to “bridge the physical and virtual worlds,” and if Meta can deliver on the promises of its demo videos, Orion (or something like it) would be a legitimate challenger to smartphones as a whole.

But that’s a huge “if.” Judging from the current cutting edge of consumer AR smart glasses, there are major hurdles to overcome before anything like Orion is viable, affordable, and at a store near you. Meta has shown off the glasses to journalists, as you can see in the video below, but there are no plans to release them in their current form:

Orion’s possibilities are obvious—picture needing to get to a gate in an airport and having a dotted line to follow, or designing something in 3D and crawling under it to get a look at the bottom—but the tech has some big shoes to fill. It’s meant to replace eyeglasses, technology so good it’s been essentially unchanged since the 13th century. After the “whoa, cool” factor wears off, would Orion’s benefits be worth the tech hassles that come with it?

I wouldn’t wear Meta Ray-Bans if there was any effort involved in “operating” them: You charge them right from the case, and put ’em on and go. For something like Orion to be mass-accepted instead of a gadget-head novelty, I think it would need to be that easy to use. (Right now, Meta’s concept for interacting with the glasses involves a smart wristband you wear at all times.) Either way, we could be years away from “true” AR glasses being widely available, but Meta’s Hypernova smart glasses are right around the corner (supposedly).

There is (probably) a pair of Meta smart glasses with a display coming out soon. Meta is rumored to be releasing glasses with a built-in screen as early as the end of this year. Supposedly called “Hypernova,” these would do everything Ray-Ban Metas do, but also run apps and display photos on a small screen projected onto one of the lenses. They will supposedly come with a “neural” wristband controller for gesture control, much like the one shown in the Orion demos. The supposed price: between $1,000 and $1,500.

Though not confirmed, this rumor seems plausible. Hypernova feels like a logical link between Orion, the pie-in-the-sky concept glasses, and the Ray-Ban Meta glasses we already have. There’s really nothing preventing Meta from making these: Smart glasses with HUD-type displays and HD virtual screens, like the XReal Pro, have been around for a few years. While those “replace your monitor” style AR glasses aren’t designed for everyday wear, all that’s keeping Meta from putting out glasses with a modest display in an everyday frame is the company’s business plan.

In most cases, I think a small HUD on a comfortable pair of glasses would be more useful and less hassle than something like Orion, in the same way sending a text is usually more useful and less hassle than making a Zoom call. A potential sticking point, though, is battery life. My main issue with existing Ray-Ban Metas is that they’re too heavy and the charge doesn’t last long enough. Adding the extra draw of a HUD seems like it could make both problems worse. If that’s solved, and they’re as easy to use as Ray-Ban Metas, I’d be first in line for a pair.

Let’s get away from the lofty, speculative, phone-less future and the “maybe it’ll happen” video glasses, and talk about where the existing audio- and AI-based Meta smart glasses are likely to be going in the near future.

Last week, renders of the supposed next-gen Ray-Bans hit the web. While there isn’t any compelling reason to think these renders are legit—anyone can mock up a picture and call it a leak—the supposedly leaked features that go along with the renders probably are legit, but only because of how obvious they are. According to the report, the next generation of Meta smart glasses will “have significantly better battery life and enhanced AI features, including real-time object recognition and scene understanding,” which is like predicting the next Apple phone will have a better camera. Who would have seen it coming?

A more detailed and interesting rumor comes by way of tech site The Information. According to its sources, Meta is adding facial recognition to its upcoming generation of glasses. There’s nothing technologically stopping Meta from implementing facial recognition now. In fact, it was supposedly planned as a feature for the current generation of Meta glasses, but scrapped due to privacy concerns. It’s easy to understand why facial recognition would set off alarm bells for privacy advocates. But for those of us who aren’t as concerned with privacy and regularly forget the names of people we meet, the appeal is easy to imagine.

Speaking of dystopian-sounding features, Meta is said to be planning to include live monitoring and analysis of everything users are doing in its next line of glasses. The AI will stay on and just watch through your eyes, so Meta AI could say things like, “You parked in space 6G” or “You forgot to close the garage door.”

As a person with ADHD, I really want this. I have nagging doubts about the wisdom of offloading literally every intellectual task to a machine, and I’m not crazy about letting computers controlled by Mark Zuckerberg judge and exploit everything I do, but the first time my glasses helped me find my lost car keys, all would be forgiven.
