    Reviews

    Live translations on Meta’s smart glasses work well — until they don’t

By News Room · January 24, 2025 · 5 Mins Read

I was in middle school the last time I took a Spanish class. I remember enough for toddler talk — phrases like “¿Dónde está el baño?” and “mi gato es muy gordo” — but having a meaningful conversation in Spanish without a translator is out of the question. So I was genuinely surprised the other day when, thanks to the Ray-Ban Meta smart glasses, I could have a mostly intelligible conversation with a Spanish speaker about K-pop.

Live translations were added as part of a feature drop last month, alongside live AI and Shazam. It’s exactly what it sounds like. When you turn the feature on, you can have a conversation with a Spanish, French, or Italian speaker, and the glasses will translate what’s being said directly into your ears in real time. You can also view a transcript of the conversation on your phone. Whatever you say in English will also be translated into the other language.

    Missing is the bit where we both start singing “APT APT APT!”
    Screenshot: Meta

Full disclosure: my conversation was part of a Meta-facilitated demo. That’s not truly the same thing as plopping these glasses on, hopping down to Barcelona, and trying them in the wild. That said, I’m a translation tech skeptic, and I went in intending to find all the cracks where this tech could fail.

The glasses were adept at translating a basic conversation about K-pop bands. The translation kicked in shortly after my conversation partner finished speaking. This worked well as long as we spoke at a measured, medium pace, only a few sentences at a time. But that’s not how people actually speak. In real life, we launch into long-winded tirades, lose our train of thought, and talk much faster when angry or excited.

To Meta’s credit, it anticipated some of these situations. I had my conversation partner speak faster and for longer stretches. The glasses handled the extra speed decently well, though there was understandably some lag in the real-time transcript. With longer speech, the glasses started translating midway through, before my partner was done talking. That was a bit jarring and awkward, since you, the listener, have to recognize you’re a bit behind. The experience is similar to listening to live interpreters on international news broadcasts.

I was most impressed that the glasses could handle a bit of Spanglish. Multilingual speakers rarely stick to just one language, especially in mixed-language company. In my family, we call it Konglish (Korean-English), and people slip in and out of each language, mixing and matching grammar in a way that’s chaotic yet functional. For example, my aunt will often speak several sentences in Korean, throw in two sentences in English, do another that’s a mix of Korean and English, and then revert to Korean. I had my conversation partner try something similar in Spanish and… the results were mixed.

    You can see the transcript start to struggle with slang while trying to rapidly switch between Spanish and English.
    Screenshot: Meta

    On the one hand, the glasses could handle short switches between languages. However, longer forays into English led to the AI repeating the English in my ear. Sometimes, it’d also repeat what I’d said, because it started getting confused. That got so distracting I couldn’t focus on what was being said.

The glasses struggled with slang. Every language has its dialects, and each dialect can have its own spin on colloquialisms. You need look no further than how American teens have subjected us all to phrases like skibidi and rizz. In this case, the glasses couldn’t accurately translate “no manches.” Literally, that translates to “no stain,” but in Mexican Spanish it also means “no way” or “you’re kidding me!” The glasses chose the literal translation. More broadly, translation is an art. In some instances, the glasses got the correct gist across but failed to capture some nuances of what was being said to me. This is the burden of all translators — AI and human alike.

    You can’t use these to watch foreign-language movies or TV shows without subtitles. I watched a few clips of Emilia Pérez, and while it could accurately translate scenes where everyone was speaking loudly and clearly, it quit during a scene where characters were rapidly whispering to each other in hushed tones. Forget about the movie’s musical numbers entirely.

    You wouldn’t necessarily have these issues if you stuck to what Meta intended with this feature. It’s clear these glasses were mostly designed to help people have basic interactions while visiting other countries — things like asking for directions, ordering food at a restaurant, going to a museum, or completing a transaction. In those instances, you’re more likely to encounter people who speak slower with the understanding that you are not a native speaker.

It’s a good start, but I still dream of the babel fish from Douglas Adams’ Hitchhiker’s Guide to the Galaxy — a little creature that, when plopped in your ear, can instantly and accurately translate any language into your own. For now, that’s still the realm of science fiction.
