This is Optimizer, a weekly newsletter sent every Friday from Verge senior reviewer Victoria Song that dissects and discusses the latest phones, smartwatches, apps, and other gizmos that swear they’re going to change your life. Optimizer arrives in our subscribers’ inboxes at 10AM ET. Opt in for Optimizer here.

There’s a hard conversation to be had about smart glasses in the coming weeks and months. At Meta Connect 2025, I got a first glimpse of the Meta Ray-Ban Display, the company’s first pair of smart glasses with a built-in monocular display. There’s no beating around the bush. The demos I got were nothing short of impressive. But something about having an invisible display, and the ability to appear present while secretly doing something else under the table, is eerie. I’ll dive into the questions these glasses raise in the coming weeks, but today I want to focus on one way that Meta’s glasses are genuinely making life better: accessibility.

“For me, missing both my legs means that obviously walking is just a bit more difficult and more hazardous than other people,” Jon White, an inspirational speaker and Paralympic trainee who became a triple amputee after serving as a British Royal Marine, tells me in an interview at Meta’s headquarters ahead of the announcement. “Anything that means I’m not looking at my phone [so] I’ve got my head up, looking around me is much better.”

The Meta Ray-Ban Display glasses add live captioning, which will be a huge help to the hard of hearing.

White says that with only one arm, the ability to respond to messages without needing to pick up a phone in his remaining hand is crucial. Likewise, when White posts on Instagram about his engineering projects, the glasses’ camera allows him to showcase his point of view without having to reposition his phone for the best angle. In our chat, White relays a story about how, when giving a speech, he was given a clicker for his slides and then handed a handheld mic. “I was like, ‘What do you want me to do with this?’”

And that’s just through one lens. Even some of the glasses’ features that seem far-fetched could be game changers for people with visual or hearing impairments. Take Meta’s Live AI feature. In my first impressions, I questioned the usefulness of having AI describe things you can already see. After publication, I was quickly served a piece of humble pie when several members of the low-vision and blind community reached out to tell me how these gadgets enabled them to live more independently. (I invited one to come share their experience on a recent Vergecast episode, which you can listen to here.) One anecdote that stuck with me was the ability to read menus in restaurants. Most eateries don’t carry Braille menus, and even if they did, reading Braille isn’t a skill every visually impaired person has. Live AI on the glasses can read menu items aloud, eliminating the need to rely on a sighted person.

It was also sobering to learn that, as a mass-market product, the Meta glasses are significantly more affordable than similar gadgets created specifically for the visually impaired community. The glasses cost roughly $300 to $400, whereas dedicated tools like OrCam readers can range from $1,990 to $4,250, with limited options for insurance coverage.

The neural band helps you control the Display glasses with gestures, freeing up your hand.

With the new Meta Ray-Ban Display glasses, I was also struck by a demo of the live captioning feature and how it might help folks who are deaf or hard of hearing. (Supposedly, it can provide translated subtitles in real time, too, but I’ll reserve judgment until I get to see that for myself.) Not only were the captions near instantaneous, but they were pretty accurate and unaffected by cross-talk. Because of the directional microphones, only the person you were directly looking at was captioned. And it’s not just the captioning feature.

“I think all of that stuff is just going to make life easier for me,” White says, speaking about the new possibilities shown by the Display glasses and the neural band. “What I love is how [Meta has] kind of proven what you can do with the technology, and I know that will end up spreading into other industries, like prosthetics, and help take that forward.”

“I say we’re kind of limited by our own imaginations at the moment. One of the things that I’ve learned is that a lot of the things that I do in terms of adapting to being disabled would actually just make life really easier for able-bodied people,” White adds, speaking of his own experience.

White makes a salient point. Arguably, treating smart glasses as an accessibility device may be the best way for us to think about this tech. That’s because accessible design benefits everyone. For example, Apple’s double-tap and wrist-flick gestures started as Apple Watch accessibility features before becoming part of the main user interface. I may not be an amputee, but I find that these gestures have dramatically improved my experience with the watch. Headphone features that amplify voices so you can more easily hear conversations in crowded environments have their roots in accessibility, too.

The regular Ray-Ban Meta glasses have already been widely embraced by the low-vision and blind communities.

It’s also encouraging that Meta announced it’s opening up its smart glasses to third-party developers so they can build new experiences using the glasses’ audio and visual features. HumanWare, an assistive tech company under EssilorLuxottica, will use this new software development kit to help blind and low-vision users navigate their environments. Microsoft is also working on an integration via the SDK for Seeing AI, its visual assistant for the blind community.

I don’t mean to discount the extremely valid concerns that people have about smart glasses. In the hours since Meta announced the Display glasses, the reaction online has been sharply polarized. Some people think this technology is inevitable. Others are screaming on social media that they’d rather yeet themselves to Jupiter than ever let Meta hardware touch their face. To me, these are understandable reactions given Meta’s reputation and the world we live in. I was equally impressed and terrified by my experience with the new Display glasses.

It’s vital, during this spaghetti era when smart glasses makers are still throwing ideas at the wall to see what sticks, to start having these difficult conversations before people start snatching smart glasses off wearers’ faces, as they did when Google Glass first launched. We should not “move fast and break things,” a tech philosophy credited to Meta CEO Mark Zuckerberg. Still, while we’re having these tough conversations, it’s important that we don’t drown out the people whose lives are being improved by this tech.
