Technology Mag

    Reviews

    Live translations on Meta’s smart glasses work well — until they don’t

By News Room · January 24, 2025 · 5 Mins Read

I was in middle school the last time I took a Spanish class. I remember enough for toddler talk — phrases like “¿Dónde está el baño?” and “mi gato es muy gordo” — but having a meaningful conversation in Spanish without a translator is out of the question. So I was genuinely surprised the other day when, thanks to the Ray-Ban Meta smart glasses, I could have a mostly intelligible conversation with a Spanish speaker about K-pop.

Live translations were added as part of a feature drop last month, alongside live AI and Shazam. It’s exactly what it sounds like. When you turn the feature on, you can have a conversation with a Spanish, French, or Italian speaker, and the glasses will translate what’s being said directly into your ears in real time. You can also view a transcript of the conversation on your phone. Whatever you say in English will also be translated into the other language.

    Missing is the bit where we both start singing “APT APT APT!”
    Screenshot: Meta

Full disclosure: my conversation was part of a Meta-facilitated demo. That’s not truly the same thing as plopping these glasses on, hopping down to Barcelona, and trying them in the wild. That said, I’m a translation tech skeptic, and I went in intent on finding all the cracks where this tech could fail.

    The glasses were adept at translating a basic conversation about K-pop bands. After my conversation partner was done speaking, the translation would kick in soon after. This worked well if we talked in measured, medium-speed speech, with only a few sentences at a time. But that’s not how people actually speak. In real life, we launch into long-winded tirades, lose our train of thought, and talk much faster when angry or excited.

To Meta’s credit, it anticipated some of these situations. I had my conversation partner speak faster and for longer stretches. The glasses handled the speed decently well, though there was understandably some lag in the real-time transcript. For longer speech, the glasses started translating midway through, before my partner was done talking. That was a bit jarring and awkward, because you, the listener, have to recognize you’re running a bit behind. The experience is similar to listening to live interpreters on international news broadcasts.
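That mid-utterance behavior — start translating once enough speech has buffered rather than waiting for the speaker to finish — is a common trade-off in streaming translation. Here is a minimal illustrative sketch of the idea in Python; it is not Meta’s implementation, and the `translate` stub and `buffer_threshold` parameter are assumptions for demonstration only.

```python
def translate(text: str) -> str:
    # Stand-in for a real Spanish-to-English translation model.
    return f"<en:{text}>"

def stream_translate(speech_chunks, buffer_threshold=3):
    """Buffer incoming speech chunks; emit a partial translation
    whenever the buffer reaches `buffer_threshold` chunks, then
    flush whatever remains when the utterance ends."""
    buffer = []
    for chunk in speech_chunks:
        buffer.append(chunk)
        if len(buffer) >= buffer_threshold:
            # Translation begins before the speaker is finished,
            # which is why the listener ends up running behind.
            yield translate(" ".join(buffer))
            buffer = []
    if buffer:  # end of utterance: translate the leftover tail
        yield translate(" ".join(buffer))

# A long-winded sentence arrives as a stream of chunks.
chunks = ["me", "encanta", "el", "K-pop", "porque", "las", "canciones"]
partials = list(stream_translate(chunks))
```

The smaller the buffer, the lower the lag but the less context each partial translation has to work with — which matches the jarring-but-timely behavior described above.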

I was most impressed that the glasses could handle a bit of Spanglish. Multilingual speakers rarely stick to just one language, especially in mixed-language company. In my family, we call it Konglish (Korean-English): people slip in and out of each language, mixing and matching grammar in a way that’s chaotic yet functional. For example, my aunt will often speak several sentences in Korean, throw in two sentences in English, do another that’s a mix of Korean and English, and then revert to Korean. I had my conversation partner try something similar in Spanish and… the results were mixed.

    You can see the transcript start to struggle with slang while trying to rapidly switch between Spanish and English.
    Screenshot: Meta

    On the one hand, the glasses could handle short switches between languages. However, longer forays into English led to the AI repeating the English in my ear. Sometimes, it’d also repeat what I’d said, because it started getting confused. That got so distracting I couldn’t focus on what was being said.

    The glasses struggled with slang. Every language has its dialects, and each dialect can have its unique spin on colloquialisms. You need look no further than how American teens have subjected us all to phrases like skibidi and rizz. In this case, the glasses couldn’t accurately translate “no manches.” That translates to “no stain,” but in Mexican Spanish, it also means “no way” or “you’re kidding me!” The glasses chose the literal translation. In that vein, translation is an art. In some instances, the glasses got the correct gist across but failed to capture some nuances of what was being said to me. This is the burden of all translators — AI and human alike.

    You can’t use these to watch foreign-language movies or TV shows without subtitles. I watched a few clips of Emilia Pérez, and while it could accurately translate scenes where everyone was speaking loudly and clearly, it quit during a scene where characters were rapidly whispering to each other in hushed tones. Forget about the movie’s musical numbers entirely.

    You wouldn’t necessarily have these issues if you stuck to what Meta intended with this feature. It’s clear these glasses were mostly designed to help people have basic interactions while visiting other countries — things like asking for directions, ordering food at a restaurant, going to a museum, or completing a transaction. In those instances, you’re more likely to encounter people who speak slower with the understanding that you are not a native speaker.

It’s a good start, but I still dream of the babel fish from Douglas Adams’ Hitchhiker’s Guide to the Galaxy — a little creature that, when plopped in your ear, can instantly and accurately translate any language into your own. For now, that’s still the realm of science fiction.

© 2025 Technology Mag. All Rights Reserved.