    Hospitals use a transcription tool powered by a hallucination-prone OpenAI model

    By News Room · October 27, 2024 · 2 Mins Read

    A few months ago, my doctor showed off an AI transcription tool he used to record and summarize his patient meetings. In my case, the summary was fine, but researchers cited by ABC News have found that’s not always the case with OpenAI’s Whisper, which powers a tool many hospitals use — sometimes it just makes things up entirely.

    Whisper is used by a company called Nabla for a medical transcription tool that it estimates has transcribed 7 million medical conversations, according to ABC News. More than 30,000 clinicians and 40 health systems use it, the outlet writes. Nabla is reportedly aware that Whisper can hallucinate, and is “addressing the problem.”

    A group of researchers from Cornell University, the University of Washington, and others found in a study that Whisper hallucinated in about 1 percent of transcriptions, making up entire sentences with sometimes violent sentiments or nonsensical phrases during silences in recordings. The researchers, who gathered audio samples from TalkBank’s AphasiaBank as part of the study, note silence is particularly common when someone with a language disorder called aphasia is speaking.
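    For context, Whisper is also distributed as an open-source Python package, and a basic transcription call looks roughly like the sketch below (the filename and the 0.5 cutoff are illustrative placeholders, not values from the study):

        import whisper  # the open-source openai-whisper package

        # Load a pretrained checkpoint and transcribe an audio file.
        model = whisper.load_model("base")
        result = model.transcribe("patient_visit.wav")  # placeholder filename
        print(result["text"])

        # Each output segment carries a no-speech probability. One rough way to
        # surface the silent stretches where the study saw hallucinations is to
        # flag segments the model itself suspects contain no speech. The 0.5
        # threshold is arbitrary, chosen only for illustration.
        for seg in result["segments"]:
            if seg["no_speech_prob"] > 0.5:
                print("possible silence, review this segment:", repr(seg["text"]))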

    One of the researchers, Allison Koenecke of Cornell University, posted examples of these fabricated passages in a thread about the study.

    The researchers found that hallucinations also included invented medical conditions or phrases you might expect from a YouTube video, such as “Thank you for watching!” (OpenAI reportedly used Whisper to transcribe over a million hours of YouTube videos to train GPT-4.)

    The study was presented in June at the Association for Computing Machinery FAccT conference in Brazil. It’s not clear if it has been peer-reviewed.

    OpenAI spokesperson Taya Christianson emailed a statement to The Verge:

    We take this issue seriously and are continually working to improve, including reducing hallucinations. For Whisper use on our API platform, our usage policies prohibit use in certain high-stakes decision-making contexts, and our model card for open-source use includes recommendations against use in high-risk domains. We thank researchers for sharing their findings.
