Technology Mag

Gear
    Reduce AI Hallucinations With This Neat Software Trick

By News Room · June 14, 2024 · 3 min read

    To start off, not all RAGs are of the same caliber. The accuracy of the content in the custom database is critical for solid outputs, but that isn’t the only variable. “It’s not just the quality of the content itself,” says Joel Hron, global head of AI at Thomson Reuters. “It’s the quality of the search, and retrieval of the right content based on the question.” Mastering each step in the process is critical, since one misstep can throw the model completely off.
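The retrieve-then-generate loop Hron describes can be sketched in a few lines. This is a toy illustration, not any vendor's implementation: the bag-of-words scorer, the two-document corpus, and the function names are all assumptions made for the example, and production systems use learned embeddings rather than word counts.

```python
from collections import Counter
import math

def bow(text):
    """Lowercased bag-of-words vector as a Counter."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, corpus, k=1):
    """Rank documents by similarity to the question and return the top k.
    A weak scorer here is exactly the failure mode Hron points to:
    bad retrieval hands the model the wrong context."""
    q = bow(question)
    ranked = sorted(corpus, key=lambda d: cosine(q, bow(d)), reverse=True)
    return ranked[:k]

def build_prompt(question, passages):
    """Ground the model by prepending the retrieved passages to the prompt."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Answer using only these sources:\n{context}\n\nQuestion: {question}"

# Toy corpus standing in for a firm's custom database.
corpus = [
    "The statute of limitations for breach of contract is four years.",
    "Robot vacuums map homes using lidar sensors.",
]
question = "How long is the limitations period for contract claims?"
passages = retrieve(question, corpus)
print(build_prompt(question, passages))
```

The point of the sketch is that two separate stages can fail independently: the scorer can surface the wrong passage, or the model can ignore a correct one.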

    “Any lawyer who’s ever tried to use a natural language search within one of the research engines will see that there are often instances where semantic similarity leads you to completely irrelevant materials,” says Daniel Ho, a Stanford professor and senior fellow at the Institute for Human-Centered AI. Ho’s research into AI legal tools that rely on RAG found a higher rate of mistakes in outputs than the companies building the models found.

    Which brings us to the thorniest question in the discussion: How do you define hallucinations within a RAG implementation? Is it only when the chatbot generates a citation-less output and makes up information? Is it also when the tool overlooks relevant data or misinterprets aspects of a citation?

    According to Lewis, hallucinations in a RAG system boil down to whether the output is consistent with what’s found by the model during data retrieval. The Stanford research into AI tools for lawyers, though, broadens this definition a bit by examining whether the output is grounded in the provided data as well as whether it’s factually correct—a high bar for legal professionals who are often parsing complicated cases and navigating complex hierarchies of precedent.
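Lewis's definition, that the output must be consistent with what retrieval actually surfaced, suggests a simple automated check: flag answers whose content words don't appear anywhere in the retrieved passages. The word-overlap heuristic below is a crude stand-in of my own devising; real groundedness checkers use entailment models, and the threshold and stopword list here are arbitrary assumptions.

```python
STOPWORDS = frozenset({"the", "a", "an", "is", "of", "to", "in", "and", "for"})

def grounded_fraction(answer, passages):
    """Fraction of the answer's content words that appear somewhere in the
    retrieved passages. A low value hints the model went beyond its sources,
    a hallucination under Lewis's retrieval-consistency test.
    (Word overlap is a crude proxy; production systems use NLI/entailment.)"""
    source_vocab = set()
    for p in passages:
        source_vocab.update(w.strip(".,?!").lower() for w in p.split())
    words = [w.strip(".,?!").lower() for w in answer.split()]
    content = [w for w in words if w and w not in STOPWORDS]
    if not content:
        return 1.0
    hits = sum(1 for w in content if w in source_vocab)
    return hits / len(content)

sources = ["The limitations period for contract claims is four years."]
print(grounded_fraction("The period is four years.", sources))  # every content word grounded
print(grounded_fraction("The period is ten years, per Smith v. Jones.", sources))  # invented case
```

Note what this check cannot catch, which is exactly the Stanford team's broader point: an answer can be fully grounded in the retrieved passages and still be factually wrong if retrieval surfaced the wrong material.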

    While a RAG system attuned to legal issues is clearly better at answering questions on case law than OpenAI’s ChatGPT or Google’s Gemini, it can still overlook the finer details and make random mistakes. All of the AI experts I spoke with emphasized the continued need for thoughtful, human interaction throughout the process to double check citations and verify the overall accuracy of the results.

    Law is an area where there’s a lot of activity around RAG-based AI tools, but the process’s potential is not limited to a single white-collar job. “Take any profession or any business. You need to get answers that are anchored on real documents,” says Arredondo. “So, I think RAG is going to become the staple that is used across basically every professional application, at least in the near to mid-term.” Risk-averse executives seem excited about the prospect of using AI tools to better understand their proprietary data without having to upload sensitive info to a standard, public chatbot.

    It’s critical, though, for users to understand the limitations of these tools, and for AI-focused companies to refrain from overpromising the accuracy of their answers. Anyone using an AI tool should still avoid trusting the output entirely, and they should approach its answers with a healthy sense of skepticism even if the answer is improved through RAG.

    “Hallucinations are here to stay,” says Ho. “We do not yet have ready ways to really eliminate hallucinations.” Even when RAG reduces the prevalence of errors, human judgment reigns paramount. And that’s no lie.
