
    The future I saw through the Meta Ray-Ban Display amazes and terrifies me

    By News Room | October 18, 2025 | 19 Mins Read

    Outside a florist-cum-coffee shop in upstate New York, a row of vintage cars gleams in the sun. It’s unseasonably warm for early October, so there’s a veritable crowd of car enthusiasts snapping photos of Ferraris, Porsches, and a vintage Alfa Romeo. Patient girlfriends and wives roll their eyes, sipping on maple matcha lattes and eating pumpkin spice donuts.

    At my side, my right hand is twitching like I’m a wizard casting a spell. I’m hunched over, bending my head as I stare at a lime green Lamborghini, shouting, “WHAT MODEL CAR AM I LOOKING AT?” (The lot is quite loud, given that several car dads are revving like Dom Toretto might appear at any minute and demand a street race.) After a few moments, I move to the next car and yell the same question.

    $799

    The Good

    • Impressive hardware
    • Display opens many more possibilities
    • It’s genuinely cool to use
    • These are the best smart glasses on the market

    The Bad

    • Lacks robust third-party apps
    • Heavy and chunky
    • Battery dies fast
    • That recording light is still too subtle
    • The dead-eye stare
    • Display can only be used in the right lens
    • Limited Rx support
    • Meta’s privacy record isn’t great

    The car guys give me a wide berth. But unbeknownst to them, the chunky sunglasses I’m wearing aren’t a typical pair of Wayfarers. They’re the $799 Meta Ray-Ban Display. For my ears only, Meta AI incorrectly informs me that the Ferrari I’m looking at is a Chevrolet Corvette. Then it tells me my battery is low and that for Meta AI to work properly, I need a better internet connection. I import the photos and videos I’ve taken onto my phone. Right before I stuff the glasses back into their case, I watch a Reel of a funny cat video that my friend sent me in an Instagram DM.

    When the occasion calls for a display, these glasses feel magical. Then, after about five minutes, you’ll run into the quirks that remind you, “Oh, this is a first-gen device. And it’s made by Meta.”

    When it works, it’s magic

    If you’re expecting Tony Stark’s glasses, you’ll have to keep waiting. Meta’s new smart glasses have a tiny display in the right lens that you can use for basic tasks like glancing at a map or reading a text message. You won’t see fancy AR overlays like in a sci-fi flick — it’s more like if you took a smartwatch’s screen and plopped it in front of your face.

    The glasses pair with another piece of hardware, the Neural Band. It’s a wristband that lets you control the display with gestures like pinching your fingers, swiping with your thumb, or rotating your wrist. There’s no screen or health sensors here; it really is just a high-tech remote.

    The Neural Band is versatile, discreet, and enables the gesture controls.

    These glasses do everything that the audio-only Ray-Ban Meta glasses do, but the display lets them introduce a variety of new features that you’d previously need to pull out your phone for. You can reply to texts, view Instagram Reels, frame photos and videos, caption or translate the conversations happening around you, and get walking directions while viewing a map of your surroundings. When you interact with Meta AI, you can now see informational cards.

    It took a while to figure out where the display fits into my life. With the original audio-only Ray-Bans, the use case was clear-cut: I pop them on when I go for walks or attend events like a concert, where I might want footage. But as an able-bodied, sighted person, I’ve never found the Meta AI features that useful in my day-to-day life. I might use the glasses to identify a flower or tree I see on a walk, but that’s as far as it goes.

    In the case, and out. I very much like that the case folds flat.
    Photo by Amelia Holowaty Krales / The Verge

    A display opens more doors. One of the big features is live captions, which add real-time subtitles to your conversations. It was helpful to turn on live captions during a podcast taping. They’re not always perfect — AI transcriptions universally struggle with slang or uncommon names — but they’re nice in one-on-one conversations in a noisy restaurant. The transcriptions are less useful when you’re walking and talking with a friend, though. For the feature to work well, you have to be looking directly at your conversation partner. That makes it awkward if you’re walking side by side, as you physically have to turn your head to face each speaker. The AI also had a hard time captioning my mumbly spouse. No amount of yelling in a loud bar can help the AI, either.

    Meta Ray-Ban Display and Neural Band Specs

    • Display: 600 x 600 pixels with 20-degree field of view, 90Hz refresh rate (30Hz for content), and 30–5,000 nits of brightness
    • Battery life: 6 hours of mixed use for glasses, 18 hours for Neural Band. The glasses case holds 4 extra charges.
    • Lenses: Transition lenses that support prescriptions from -4.00 to +4.00
    • Camera: 12MP with 3x zoom; 3024 x 4032 pixel photo resolution with 1080p at 30fps for video
    • Weight: 69g
    • Water resistance: IPX4 for glasses, IPX7 for Neural Band
    • Storage: 32GB, enough for up to 1,000 photos and 100 30-second videos.

    Even so, it’s a scenario that feels magical when all the pieces fall into place. When I show the feature to my in-laws, their jaws drop. Immediately, their minds go to hard-of-hearing relatives who might benefit. (They are, of course, less keen when they hear the price.)

    Maps and live navigation are other features that elicit oohs and ahs. While I wish it were easier to search for specific addresses or destinations, it’s immensely satisfying not having to look down at your phone while walking to a nearby cafe or train stop. It’s one of those features that simply makes sense for this form factor.

    As for capturing photos and videos, I wish the Display had the better camera resolution of the second-gen Ray-Ban Meta glasses. That said, I’m irrationally happy that the display lets me watch as I zoom into my cat’s face when I take the 1,000th video of him being a silly little goober.

    The Neural Band is surprisingly good at controlling these applications. It takes a hot second to remember all the gestures and use them fluently. But once everything clicks? It’s hard to go back to voice controls because the gestures are so seamless. Gesture-based controls are something all wearable gadgets should pursue.

    Texting on the glasses ties all of this together — and also starts to get at where using them becomes uncomfortable. It feels magical when, at dinner, I can hide my hand under the table, read a text, swipe my fingers, and reply to a message without anyone knowing. But the problem is that much of what makes texting (and photos, and videos, and so many other features enabled by the display) so impressive is that no one else can tell what you’re doing.

    The people around you think you’re fully present, but really, you’ve tricked them. Once you realize that, using the glasses feels less like sorcery and more like sleight of hand.

    First-gen quirks and irks

    When the magic wears off, you’re left with a first-generation device with quirks and irks. Early adopters will begrudgingly ignore them, but they make this device a tough sell for the average person.

    Take the display. At first, it’s easy to be enamored with the optical engineering. It uses geometric waveguides, which work like tiny mirrors, bouncing light into your eye at specific angles. That means you don’t see an obvious screen outline like with other glasses, and in typical use, no one is ever going to notice when you’re using it. Watching this technological feat in action, it’s easy to think, “This is the future!”

    But the display is also very limited. It’s small, fuzzy, and only visible from the right lens. It can reach a bright 5,000 nits, and the glasses are outfitted with transition lenses to help keep it visible, but it’s still not enough to make it usable on an extremely sunny day.

    Here’s a simulated first-person view of walking directions from a live demo. Meta helped us with screen elements recorded on a demo Meta Ray-Ban Display.

    And while other people usually can’t see the screen, you’ll inevitably move your head a certain way, and an eagle-eyed friend will point out that they can see the waveguides etched into the lens. Crank up the brightness to the maximum 5,000 nits, stand in a shadow, angle a camera just so, and others can sometimes see the ghost of the display. Sure, shattering this illusion requires a certain kind of nerd, but it’s a reminder that this isn’t the final version. It’s a (very impressive) work in progress.

    That’s not the only quirk. The menu pops up in your right-hand peripheral vision. Constantly peering to the side made my eyes hurt if I used the screen too much. Multiple coworkers closed their left eye in demos to help with the strain. It’d probably be easier on the eyes if Meta opted to put displays in both lenses for a centered AR overlay with a wider field of view, but Meta already has a prototype that does all of that, and it costs $10,000 to build.

    It’s also likely why the Display only supports a limited range of prescriptions: -4.00 to +4.00. Great if you’re within that range. But if, like me, you have severe astigmatism and a strong prescription, you’ll have to wear contacts. Can’t wear contacts? You’re out of luck. The same goes if you have low vision or blindness in your right eye, because Meta can’t swap the display to the left lens.

    Angled correctly, you can see the geometric waveguide tech.

    At 69g, these are also too heavy for daily all-day wear. My normal glasses with very thick lenses are 31g. I was fine wearing these for a few hours, but discomfort crept in after that. A few times, I felt the telltale signs of a headache at the back of my head and nose bridge. The bottom of the frames also left indentations on my cheeks. I’m prone to dry eye, so needing to wear contacts with these every day has been deeply uncomfortable. Artificial tears help, but the combination of the weight, eye strain, and dry eye has been tough to navigate.

    Battery life is another issue. You can get through a full workday if you keep the display or headphone usage to a minimum. But I burned through the battery in about 3.5 to 4 hours while testing photo and video, live captioning, walking directions, texting, and audio playback. Charging is relatively quick, but what are you supposed to do if you need these to see? Carry backup glasses at all times?

    Adding another layer of complexity is the Neural Band. It’s more comfortable than the ring-like loop controls I’ve used on other smart glasses, but it takes up a lot of wrist space for a single-function wearable. I didn’t have many problems with it recognizing gestures, except that it occasionally summoned Meta AI while I was typing. Irksome, but the more annoying part was having to keep track of battery levels for both the glasses and the Neural Band — and juggling two proprietary chargers to keep them powered.

    They’re a bold look and quite chunky at that.

    Credit where credit is due: that these look like a pair of oversized Ray-Bans is incredible. But it’s not a design that looks flattering on everyone. I think I look decent, but my spouse went off on a 30-minute TED Talk about how every other pair of glasses I own is infinitely more flattering. Meta’s positioning these as bold frames, but the truth is not everyone feels comfortable channeling their inner Iris Apfel. Meta offers several frames and colors for the audio Ray-Bans. Here, you get one style and two colors, take it or leave it.

    The software can also be frustrating. At launch, there’s no app store, and third-party integrations are limited. Meta has shoved its apps front and center. You can use WhatsApp or Messenger to view and respond to messages, as well as make video calls. Great if your social group uses WhatsApp; pointless if they don’t. And in the US, where neither of these is the standard, a lot of people won’t find this very useful.

    Instagram is on the glasses, too, except you can’t view your feed or scroll through Reels. You can only view DMs, and if a friend happens to have sent you a Reel there, then you can view it. Other than those three apps, there’s a photo app, camera app, maps app (but not Google Maps), Live Captioning, and a minigame for practicing scrolling and swiping gestures.

    To see the display like this, you need to be in a shaded area, crank the brightness to 5,000 nits, and have an amazingly patient photographer angle their camera just so.

    But what if you could have your favorite notes app or a teleprompter for presentations? That’s such an obvious use case that its omission feels baffling. It’d be great if I could start my podcasts or audiobooks directly from the glasses, but alas, I use Libby and Pocket Casts. I can listen to them, but only if I start them from my phone first. (You have better luck if you use Spotify, which has voice control integration, though no app.) At least I can view my texts and take audio calls, but it’d be nice if FaceTime worked, too. Of course, that would require Apple and Meta to play nice with each other for once.

    Perhaps the most egregious miss is the lack of a browser. A good 80 percent of my texts are links to articles, TikToks, or posts on non-Meta social platforms. So while I can see that a friend sent me something, I can’t view it without whipping out my phone. But that defeats the whole point of the glasses, which is to help me rely less on my phone. Instead, I often found myself backed into a corner where Meta AI would tell me I needed my phone to accomplish something. This could be solved if there were more native apps, but it’ll take a while before there’s a robust third-party app market for such a niche device — if one ever arrives.

    Until then, I’m stuck fishing out my phone more than I’d like because a good chunk of my friends, family, and acquaintances have zero desire to dive deeper into the Meta ecosystem.

    The most common reaction I’ve gotten both online and in person regarding the Meta Ray-Ban Display glasses is, “Those are actually cool. Too bad Meta makes them.”

    I can’t blame them for feeling this way. Most recently, Meta set off alarm bells when it removed the option in its privacy policy to disable voice recordings from being stored in the cloud. People are just now getting their Cambridge Analytica settlements. CEO Mark Zuckerberg hasn’t endeared himself as a paragon of trust, either. In a recent earnings call, he said that in the future, people without smart glasses will be at a “pretty significant cognitive disadvantage.” Not only is that statement distasteful, but the more time I spend with these — and other AI wearables — the more short-sighted that sentiment feels.

    Can you notice the LED recording light is on?

    Earlier this month, the University of San Francisco warned that a man wearing Meta glasses was reported to be filming female students. A woman went viral on TikTok because she was freaked out upon noticing her esthetician was wearing the glasses during a Brazilian waxing appointment. A Border Patrol agent was spotted wearing the glasses at an immigration raid. A year ago, two college students rigged the glasses to dox strangers using facial recognition software.

    Meta’s only response to these scenarios is an etiquette guide that boils down to “Hey, here’s an easily ignored recording indicator light” and “Don’t be a jerk!” These possibilities remind me of the launch of AirTags. Most people who bought them didn’t use them for nefarious purposes. But enough people did, and will continue to do so, that Apple had to strengthen protections against malicious tracking.

    There’s nothing for me to point to when readers ask me what Meta’s done to proactively assuage privacy concerns. I haven’t heard of someone publicly snatching these glasses off someone’s face yet, probably because these are incredibly discreet compared to Google Glass. But I can’t help but feel it’s a matter of time. Before that happens, Meta has the opportunity to be a leader in this discussion and take a more active hand in shaping protective features. So far, it has not risen to the occasion.

    It’s a shame, too, because I’ve had many conversations in recent months with people in disabled communities who are excited about this technology from an accessibility standpoint. That can’t be ignored, either. But while some in this community are willing to overlook Meta’s reputation in exchange for life-changing tech, that’s not a compromise they should have to make.

    Fit check for the glasshole era

    Culturally, these glasses are also opening a door that we may not be able to close. How I feel when I’ve used these glasses is one thing. Watching other people use them when I’m not wearing a pair is another.

    A friend was unable to book a demo, so I offered to give one over lunch. The entire time, we had an engaged conversation about their experience. Except, I was staring at someone who was, at best, looking through me. We rarely had eye contact, and when the glasses were returned to me, I saw the photos of me I’d failed to notice they’d taken. As a joke, I took a picture of their dead-eyed stare. They laughed, but we both agreed the experience was cursed.

    Another time, I was trying to get footage of myself testing the zoom at a florist. I didn’t notice when the cashier asked if I needed help. All she saw was a potential customer staring blankly at some floral arrangements. When I realized she was talking to me, I had to play it off, but I’ll never forget the weirded-out look she gave me while I fidgeted frantically to stop the video and dismiss the display.

    But imagine these first-gen kinks are smoothed over. Imagine a world where you can fully watch the game while at dinner with your annoying in-laws. Or where your company can ping you with Slack messages that get zoomed straight into your eyeball. Where, on a first date, your prospective beau is swiping through other matches on Hinge in real time, all while seeming to be present. When some future glasses-wearing politician gets up onstage during a presidential debate, will you be confident they’re answering the question? Or will you wonder if it’s the AI whispering in their ear?

    Close up of Senior Reviewer Victoria Song wearing the Meta Ray-Ban Display

    The dead-eye stare is hard to get used to.

    Is that the future we want from these devices?

    We are not there yet. These glasses are not capable of that. And yet, when I wear them, for the first time I can easily glimpse that reality coming true. Given that Google and Samsung are actively jumping into the fray, and that Apple is rumored to be next, this future seems to be coming fast. And it’s clear to me that we have barely begun to deal with the implications of that.

    A new chapter in the smart glasses era has begun, and I genuinely have no idea how it ends. What I do know is that this tech has the potential to be both amazing and dystopian. If we want it to be the former, we can’t zero in on the cool technology without having a sober conversation about how it can — and will — reshape our culture.

    Agree to Continue: Meta Ray-Ban Display

    Every smart device now requires you to agree to a series of terms and conditions before you can use it — contracts that no one actually reads. It’s impossible for us to read and analyze every single one of these agreements. But we’ve started counting exactly how many times you have to hit “agree” to use devices when we review them, since these are agreements most people don’t read and definitely can’t negotiate.

    To use the Meta Ray-Ban Display smart glasses, you’ll need a Meta account and the Meta AI app downloaded onto your phone. A Meta account works across platforms like Meta, Instagram, and Quest and comes with its own Terms of Service and Privacy Policy. Should you decide to integrate with services like WhatsApp, Messenger, Instagram, Apple Music, Amazon Music, and Spotify, you also agree to those terms and privacy policies. You may also be asked to give permissions related to Bluetooth, Wi-Fi, location services, and voice data. If you choose to get a pair of prescription lenses, you may also be asked to share that information with Ray-Ban and/or Lenscrafters.

    The smart glasses also come with supplemental terms of service and privacy policies / notices, including:

    Final Tally: Two mandatory agreements, eight supplemental agreements and notices, and several optional agreements.

