
Meta Ray-Ban Display Smart Glasses: Yay Or Nay? | In The Loop Episode 32

Published by

Jack Houghton
Anna Kocsis

Published on

September 25, 2025

Read time

7 min read

Category

Podcast

Imagine a world where you don't need a mobile phone because information just appears contextually in front of you when you need it. Language is automatically translated as you speak to someone, appearing as subtitles. As you navigate cities, arrows appear on the street directing you to your destination.

In a world like this, technology becomes ambient: always there, always present, but rarely noticeable. Last week, Meta unveiled the Ray-Ban Display glasses. The on-stage demo flopped, but despite the failed attempts, the glasses represent a potential glimpse into the future of human-machine interaction.

In today’s episode, I’ll explore whether these glasses are just another financial misstep or a preview of what’s next. This is In The Loop with Jack Houghton. I hope you enjoy the show.

Meta Connect and the new Ray-Ban Meta Display

Let’s start with what happened in last week’s announcements. On September 17, at Meta Connect—the company’s biggest event of the year—Mark Zuckerberg took the stage to reveal what he believes is the ideal form of wearable intelligence.

They announced the new Ray-Ban Display glasses, starting at $799 and launching at the end of this month. Every pair comes with Meta's Neural Band, a wristband the company developed that uses electromyography (EMG) to read the electrical signals your muscles produce. Small movements of your fingers or wrist can trigger actions in the glasses.
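Meta hasn't published how the Neural Band turns muscle activity into commands, but the general shape of such a system is well understood: sample EMG from several channels at the wrist, reduce each short window of samples to a feature vector, classify it against calibrated gesture templates, and map the recognized gesture to an action on the glasses. The sketch below is a minimal, hypothetical illustration of that idea in Python; the channel count, features, gesture names, and nearest-template classifier are my assumptions, not Meta's design.

# Hypothetical sketch of EMG-based gesture input, loosely inspired by the idea
# behind Meta's Neural Band. Channel count, feature choice, and gesture names
# are illustrative assumptions, not Meta's published design.
import numpy as np

GESTURE_ACTIONS = {               # how a recognized gesture might map to an action
    "thumb_index_pinch": "select",
    "thumb_middle_pinch": "go_back",
    "wrist_rotate": "zoom_camera",
    "rest": None,
}

def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce a (samples x channels) EMG window to one RMS value per channel."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def classify(features: np.ndarray, templates: dict) -> str:
    """Nearest-template classifier: pick the gesture whose calibrated feature
    vector is closest to the current window's features."""
    return min(templates, key=lambda g: np.linalg.norm(features - templates[g]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Pretend we calibrated four gesture templates over 8 EMG channels.
    templates = {g: rng.uniform(0.1, 1.0, size=8) for g in GESTURE_ACTIONS}
    # Simulate a 200-sample window of raw EMG that resembles a pinch.
    window = rng.normal(0.0, templates["thumb_index_pinch"], size=(200, 8))
    gesture = classify(extract_features(window), templates)
    print(f"gesture={gesture} -> action={GESTURE_ACTIONS[gesture]}")

The point of the sketch is the division of labor: the band only needs to emit a small vocabulary of gestures, and the glasses decide what each gesture means in the current context.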

On paper, the product is impressive. It has a full-color, high-resolution HUD (heads-up display) positioned just off the side of your field of vision, covering about a 20-degree field of view.

The display produces about 5,000 nits of brightness. For perspective, most smartphone screens max out at around 800–1,200 nits, and even the brightest tablets rarely exceed 1,600. This means you can see the display clearly in direct sunlight, which is crucial for glasses and something that has plagued earlier smart glasses projects.

It also has a 12-megapixel camera capable of recording at 1080p, microphone technology called beamforming (which focuses on sound from in front of you), open-ear speakers, and weighs about 69 grams—the weight of a chicken egg. That’s still 17 grams heavier than the previous generation, which didn’t have a screen.

Despite the specs, the live demo flopped. Two high-profile attempts to showcase the most innovative features didn’t work. The first was a cooking influencer who tried to use the glasses for recipe help. He said, “Hey Meta, start Live AI,” which should have triggered step-by-step instructions on screen. Instead, the glasses lagged for a few seconds and then skipped ahead in the recipe. He had to stop mid-demo and blame the Wi-Fi.

The second failure came from Zuckerberg himself. He tried to take a WhatsApp call from Meta's CTO live on stage, but it wouldn't connect. Zuckerberg fumbled, pressing buttons to get it working, before eventually giving up. These things happen: even with flawless rehearsals, demos can break in front of a large audience. The problems turned out to be relatively minor, and Meta released an update to fix them within 48 hours.

Still, it was a painfully awkward moment. With that aside, let’s dig into what the glasses are actually capable of.

What do Meta Glasses do?

I want to highlight some of the most interesting use cases for these glasses. While I wouldn’t buy them for just one of these features, several capabilities could have a real impact. From live translation to content creation, these glasses are starting to show why this category might matter.

Live Captions and translation

One of the most compelling features is live captions and translation. The glasses can listen to someone speaking and display captions in real time.

Even more impressively, they can translate between languages on the fly. Imagine traveling to Spain—or anywhere else—and speaking naturally while subtitles appear right in front of you.

For accessibility, this is huge: someone who is hard of hearing could read conversations as captions, while someone with low vision could have the AI read text aloud.

Early testers report that it works well in quiet environments but struggles with background noise, occasionally picking up stray words from nearby conversations. It’s similar to digital note-taking apps during video calls, which sometimes misattribute speakers. Despite these limitations, the potential is significant.
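To make the caption-and-translate flow concrete, here's a minimal, hypothetical sketch in Python: audio chunks come in, get transcribed, optionally translated, and trimmed to short subtitle lines sized for a small heads-up display. The transcribe and translate functions are stand-ins for whatever speech and translation models Meta actually runs (which it hasn't detailed), faked here with a toy phrase table so the loop is runnable.

# Hypothetical sketch of a live caption + translation loop. transcribe() and
# translate() are stand-ins for real speech and translation models, faked here
# with a toy phrase table so the end-to-end flow can run.
from dataclasses import dataclass

@dataclass
class Subtitle:
    original: str      # what the speaker said, as transcribed
    displayed: str     # what gets rendered on the lens

def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for speech-to-text: pretend the audio decodes straight to text."""
    return audio_chunk.decode("utf-8")

def translate(text: str, target_lang: str) -> str:
    """Stand-in for a translation model, using a toy Spanish-to-English table."""
    toy_table = {"hola": "hello", "¿cómo estás?": "how are you?"}
    return toy_table.get(text.lower().strip(), text)

def caption_stream(audio_chunks, translate_to=None, max_chars=32):
    """Turn a stream of audio chunks into short subtitle lines; the character
    limit reflects how little text fits comfortably on a glasses display."""
    for chunk in audio_chunks:
        text = transcribe(chunk)
        shown = translate(text, translate_to) if translate_to else text
        yield Subtitle(original=text, displayed=shown[:max_chars])

if __name__ == "__main__":
    fake_audio = ["Hola".encode("utf-8"), "¿Cómo estás?".encode("utf-8")]
    for sub in caption_stream(fake_audio, translate_to="en"):
        print(f"[{sub.original}] -> {sub.displayed}")

In a real system the hard parts are exactly where early testers report problems: deciding which of several overlapping voices to transcribe, and doing it fast enough that the subtitles keep up with the conversation.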

Navigation and maps

Another exciting use case is navigation. The idea of seeing arrows or directions in your peripheral vision instead of staring at your phone is compelling.

Currently, the glasses provide turn-by-turn walking directions with arrows and mini maps positioned in the corner of your vision. The display is bright enough—5,000 nits, four times brighter than a phone screen—to be visible in direct sunlight. Meta has also implemented safety measures: the system disables itself when moving at high speed, like biking or driving.
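As a rough illustration of that safety measure, here's a tiny hypothetical sketch of a speed gate; the threshold and the idea of estimating speed from the paired phone's GPS are my assumptions, not Meta's published behavior.

# Hypothetical sketch of a speed gate for turn-by-turn directions. The threshold
# and the GPS-based speed source are assumptions for illustration only.
WALKING_SPEED_LIMIT_MPS = 3.0   # about 10.8 km/h; above this, assume biking or driving

def should_show_directions(speed_mps: float) -> bool:
    """Show walking directions only at walking pace; hide them at higher speeds."""
    return speed_mps <= WALKING_SPEED_LIMIT_MPS

for speed in (1.4, 5.0, 13.0):  # typical walking, cycling, and driving speeds in m/s
    state = "show arrows" if should_show_directions(speed) else "hide navigation"
    print(f"{speed:.1f} m/s -> {state}")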

The main limitation is that the glasses don’t yet support Google Maps or other established providers, which leaves them behind in terms of real-time traffic and mapping features. Still, the foundation is promising.

Message Triaging and Communication

The third use case is handling messages and notifications. Using the Neural Band, incoming texts, calls, and app notifications appear in your lens. You can respond without pulling out your phone, using subtle finger or wrist movements to interact.

Tech reviewer Marques Brownlee called it surprisingly effective. For many users, it feels genuinely useful—even magical. Personally, I wouldn’t allow notifications in my field of view, as I keep everything off to avoid distraction. But for others, this could be a major draw.

Content Creation

The fourth major use case is content creation. Meta’s ecosystem is creator-focused, and these glasses support filming and livestreaming directly to Meta platforms.

The 12-megapixel camera records at 1080p, and you can see exactly what’s being captured. You can zoom by rotating your wrist, start recording with finger gestures, and an LED activates to indicate recording is happening. While privacy concerns remain—especially in public spaces—these features are powerful for creators. Users can shoot reels, livestream events, or capture content instantly, without needing a separate device.

These are the main use cases that show why smart glasses might matter. The question remains whether these features will drive adoption or if the category will remain niche, but the potential is clear.

Will Meta glasses become widely adopted?

I’m going to attempt to answer whether the Meta Ray-Ban Display glasses will succeed and give my viewpoint.

When new technology comes out, it’s often seen as fanciful, and many people don’t think it will catch on. The smartphone evolution offers a clear roadmap for understanding whether smart glasses could cross the chasm—from early adopters with niche interests to the mainstream.

In the early 2000s, smartphones like the Palm Treo and early Nokia devices were popular mainly with business users and tech enthusiasts. They had clunky interfaces, cramped physical keyboards, and poor battery life, and the general public saw them as expensive tools for a niche audience. The inflection point came around 2007–2010 with the iPhone and Android devices, which combined faster chips, improved hardware, and touchscreens. At the time, BlackBerry's CEO famously insisted that no one would ever want to type on anything other than a physical keyboard.

Right now, smart glasses feel like that BlackBerry era.

Meta has sold about two million pairs of Ray-Ban Meta glasses since their 2023 launch. That's respectable for a new category, but small compared to the smartphone market. Zuckerberg referenced the “third-generation rule,” where breakthrough consumer electronics usually hit mass adoption on their third iteration. Meta's first glasses were the Ray-Ban Stories in 2021, followed by the Ray-Ban Meta in 2023, and now the Ray-Ban Display in 2025. This could be their iPhone 3G moment, or we could still be in the early 2000s, with the social and hardware infrastructure not yet ready for mainstream adoption.

With that context, here are the factors that could make or break smart glasses.

Price and market position

The first factor is price. At $799, the glasses are expensive for mainstream consumers but reasonable for early adopters. For comparison, the first iPhone launched at $599 in 2007, roughly $900 in today's dollars after inflation, which also positioned it as an early-adopter device.

Verizon has confirmed it will stock these glasses. If carriers bundle them into contracts—like they did with early mobile phones—that could dramatically expand the potential market. By contrast, Apple’s Vision Pro sits at $3,500, pricing out most users. Meta’s pricing feels much more accessible for early adopters.

Input mechanism

Smartphone breakthroughs were driven by interaction methods: T9 keyboards, BlackBerry keyboards, and finally touchscreens, each unlocking new use cases.

Meta’s Neural Band may be the equivalent of that touchscreen moment for smart glasses. After four years of development, it allows users to navigate interfaces with subtle wrist or finger movements. Features like zooming a camera by rotating your wrist feel intuitive and even “magical” according to reviewers. Seamless input is critical for adoption.

The app ecosystem

The iPhone’s success was powered by the App Store. Meta’s glasses, at launch, have almost no third-party ecosystem, which limits developer innovation.

Meta is aware of this and is building partnerships with companies like Microsoft and Disney. The goal is to attract developers quickly to expand use cases before competitors enter the space.

Social acceptance and privacy

Early mobile phones were often seen as rude in public. Smart glasses face a similar challenge.

Privacy concerns are critical. Google Glass failed in part because people didn’t want to be unknowingly recorded. Meta is addressing this by styling the glasses like Ray-Bans for fashion legitimacy and adding a bright LED to signal recording. Still, restrictions in schools, gyms, offices, and hospitals could limit usage and slow adoption.

Competition and timing

The fifth factor is competition. Apple is likely developing a competing product, and its ecosystem advantage is significant. Even if Meta gains traction, Apple could launch a polished product integrated into the iPhone ecosystem.

Meta aims to sell 10 million units by 2026. If they succeed, they could establish themselves as a serious player before Apple enters the market.


Closing thoughts

Meta’s glasses represent the dawn of a new era in computing. Being first to market often means being early—and sometimes being wrong on timing.

The technology largely works, the use cases make sense, and social barriers are probably lower than ever, especially with the Ray-Ban partnership. But Meta faces significant challenges: building a large developer ecosystem, attracting major partners, and ultimately answering the big question—do people really want computers on their faces?

Right now, I’d place these glasses in the “BlackBerry era”: functional enough for early adopters to find utility, but still years away from an iPhone-level moment that drives mass adoption. The key question is whether Meta can iterate fast enough to get there, or if they’re mostly building brand awareness—a bridge for Apple to walk over.

My verdict is cautiously optimistic. These glasses might not change everything immediately, but they could demonstrate that change is inevitable. And sometimes, being early to an inevitable change isn’t a disadvantage—it’s a preview.

Thank you for listening. I hope you found today’s episode interesting, and I’ll see you next week.
