Technology Mag

    Reviews

    Live translations on Meta’s smart glasses work well — until they don’t

By News Room · January 24, 2025 · 5 min read

I was in middle school the last time I took a Spanish class. I remember enough for toddler talk — phrases like “¿Dónde está el baño?” and “mi gato es muy gordo” — but having a meaningful conversation in Spanish without a translator is out of the question. So I was genuinely surprised the other day when, thanks to the Ray-Ban Meta smart glasses, I could have a mostly intelligible conversation with a Spanish speaker about K-pop.

Live translations were added as part of a feature drop last month, alongside live AI and Shazam. It’s exactly what it sounds like: when you turn the feature on, you can have a conversation with a Spanish, French, or Italian speaker, and the glasses will translate what’s being said directly into your ears in real time. You can also view a transcript of the conversation on your phone. Whatever you say in English is translated into the other language in turn.

Missing is the bit where we both start singing “APT APT APT!” (Screenshot: Meta)

    Full disclosure, my conversation was part of a Meta-facilitated demo. That’s not truly the same thing as plopping these glasses on, hopping down to Barcelona, and trying it in the wild. That said, I’m a translation tech skeptic and intended to find all the cracks where this tech could fail.

The glasses were adept at translating a basic conversation about K-pop bands. Shortly after my conversation partner finished speaking, the translation would kick in. This worked well when we spoke at a measured, medium pace, a few sentences at a time. But that’s not how people actually speak. In real life, we launch into long-winded tirades, lose our train of thought, and talk much faster when angry or excited.

To Meta’s credit, it accounted for some of these situations. I had my conversation partner speak faster and for longer stretches. The glasses handled the speed decently well, though there was understandably some lag in the real-time transcript. With longer speech, the glasses started translating midway through, before my partner was done talking. That was a bit jarring and awkward, as you, the listener, have to recognize that you’re running behind. The experience is similar to listening to live interpreters on international news broadcasts.

I was most impressed that the glasses could handle a bit of Spanglish. Multilingual speakers rarely stick to just one language, especially in mixed-language company. In my family, we call it Konglish (Korean-English), and people slip in and out of each language, mixing and matching grammar in a way that’s chaotic yet functional. For example, my aunt will often speak several sentences in Korean, throw in two sentences in English, follow with another that’s a mix of Korean and English, and then revert to Korean. I had my conversation partner try something similar in Spanish and… the results were mixed.

You can see the transcript start to struggle with slang while trying to rapidly switch between Spanish and English. (Screenshot: Meta)

    On the one hand, the glasses could handle short switches between languages. However, longer forays into English led to the AI repeating the English in my ear. Sometimes, it’d also repeat what I’d said, because it started getting confused. That got so distracting I couldn’t focus on what was being said.

The glasses also struggled with slang. Every language has its dialects, and each dialect can have its own spin on colloquialisms. You need look no further than how American teens have subjected us all to phrases like skibidi and rizz. In this case, the glasses couldn’t accurately translate “no manches.” Literally, that means “no stain,” but in Mexican Spanish it’s also used to mean “no way” or “you’re kidding me!” The glasses chose the literal translation. Translation, after all, is an art. In some instances, the glasses got the gist across correctly but failed to capture the nuances of what was being said to me. That is the burden of all translators, AI and human alike.

You can’t use these to watch foreign-language movies or TV shows without subtitles. I watched a few clips of Emilia Pérez, and while the glasses could accurately translate scenes where everyone was speaking loudly and clearly, they gave up during a scene where characters were rapidly whispering to each other. Forget about the movie’s musical numbers entirely.

    You wouldn’t necessarily have these issues if you stuck to what Meta intended with this feature. It’s clear these glasses were mostly designed to help people have basic interactions while visiting other countries — things like asking for directions, ordering food at a restaurant, going to a museum, or completing a transaction. In those instances, you’re more likely to encounter people who speak slower with the understanding that you are not a native speaker.

It’s a good start, but I still dream of the Babel fish from Douglas Adams’ Hitchhiker’s Guide to the Galaxy — a little creature that, when plopped in your ear, can instantly and accurately translate any language into your own. For now, that’s still the realm of science fiction.
