Technology Mag
    Business

    AI Code Hallucinations Increase the Risk of ‘Package Confusion’ Attacks

    By News Room · May 1, 2025 · 3 Min Read

    AI-generated computer code is rife with references to nonexistent third-party libraries, creating a golden opportunity for supply-chain attacks that poison legitimate programs with malicious packages that can steal data, plant backdoors, and carry out other nefarious actions, newly published research shows.

    The study, which used 16 of the most widely used large language models to generate 576,000 code samples, found that nearly 440,000 of the package dependencies they contained were “hallucinated,” meaning they were nonexistent. Open source models hallucinated the most, with 21 percent of their dependencies linking to nonexistent libraries. A dependency is an essential code component that a separate piece of code requires to work properly. Dependencies save developers the hassle of rewriting code and are an essential part of the modern software supply chain.

    Package Hallucination Flashbacks

    These nonexistent dependencies represent a threat to the software supply chain by exacerbating so-called dependency confusion attacks. These attacks work by causing a software package to access the wrong component dependency, for instance by publishing a malicious package and giving it the same name as the legitimate one but with a later version stamp. Software that depends on the package will, in some cases, choose the malicious version rather than the legitimate one because the former appears to be more recent.
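    The version-stamp variant described above classically plays out through package-index precedence. A hypothetical vulnerable pip configuration (the index URLs and package name are illustrative, not from the study) might look like this:

    ```ini
    ; pip.conf — hypothetical setup mixing a private index with public PyPI.
    ; pip gathers candidate versions from BOTH indexes and installs the highest,
    ; so if an attacker publishes a public package with the same name as an
    ; internal one but a higher version number, the public copy wins.
    [global]
    index-url = https://pypi.internal.example.com/simple
    extra-index-url = https://pypi.org/simple
    ```

    Locking dependencies to exact versions with hashes, or pinning each package to a single index, removes the resolver's freedom to "upgrade" into the malicious copy.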

    Also known as package confusion, this form of attack was first demonstrated in 2021 in a proof-of-concept exploit that executed counterfeit code on networks belonging to some of the biggest companies on the planet, Apple, Microsoft, and Tesla included. It’s one type of technique used in software supply-chain attacks, which aim to poison software at its very source in an attempt to infect all users downstream.

    “Once the attacker publishes a package under the hallucinated name, containing some malicious code, they rely on the model suggesting that name to unsuspecting users,” Joseph Spracklen, a University of Texas at San Antonio PhD student and lead researcher, told Ars via email. “If a user trusts the LLM’s output and installs the package without carefully verifying it, the attacker’s payload, hidden in the malicious package, would be executed on the user’s system.”
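    The verification step Spracklen describes can be sketched in a few lines. This is a minimal, hypothetical guard, not code from the paper: it checks LLM-suggested dependency names against a snapshot of known registry names before anything is installed. In practice the snapshot would be built from the registry itself (for example, PyPI exposes per-package metadata over its JSON API); here it is a plain in-memory set.

    ```python
    # Hypothetical guard: flag LLM-suggested dependencies that cannot be
    # found in a snapshot of known registry package names.

    def flag_unverified(suggested: list[str], known: set[str]) -> list[str]:
        """Return suggested package names absent from the known set,
        preserving order and dropping case-insensitive duplicates."""
        seen: set[str] = set()
        unverified = []
        for name in suggested:
            key = name.lower()
            if key in seen:
                continue
            seen.add(key)
            if key not in known:
                unverified.append(name)
        return unverified

    # "requests" is real; "request-utils-pro" stands in for a hallucinated
    # name an attacker could register ahead of time.
    known_packages = {"requests", "numpy", "flask"}
    suggestions = ["requests", "request-utils-pro", "numpy"]
    print(flag_unverified(suggestions, known_packages))  # ['request-utils-pro']
    ```

    Anything the check flags would need manual review before `pip install` ever runs, which is exactly the step the attack counts on users skipping.
    
    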

    In AI, hallucinations occur when an LLM produces outputs that are factually incorrect, nonsensical, or completely unrelated to the task it was assigned. Hallucinations have long dogged LLMs because they degrade their usefulness and trustworthiness and have proven vexingly difficult to predict and remedy. In a paper scheduled to be presented at the 2025 USENIX Security Symposium, the researchers have dubbed the phenomenon “package hallucination.”

    For the study, the researchers ran 30 tests, 16 in the Python programming language and 14 in JavaScript, that generated 19,200 code samples per test, for a total of 576,000 code samples. Of the 2.23 million package references contained in those samples, 440,445, or 19.7 percent, pointed to packages that didn’t exist. Among these 440,445 package hallucinations, 205,474 had unique package names.

    One of the things that makes package hallucinations potentially useful in supply-chain attacks is that 43 percent of package hallucinations were repeated over 10 queries. “In addition,” the researchers wrote, “58 percent of the time, a hallucinated package is repeated more than once in 10 iterations, which shows that the majority of hallucinations are not simply random errors but a repeatable phenomenon that persists across multiple iterations. This is significant, because a persistent hallucination is more valuable for malicious actors looking to exploit this vulnerability and makes the hallucination attack vector a more viable threat.”
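    The repeatability measurement the researchers describe can be sketched as follows. This is an illustrative reconstruction, not the paper's code: re-run a prompt several times, collect the hallucinated package names from each run, and compute what fraction of distinct names recur across runs. The sample data is made up.

    ```python
    # Sketch of a repeatability measurement: given hallucinated package names
    # from repeated runs of the same prompt, find the fraction of distinct
    # names that persist across more than one run.
    from collections import Counter

    def repetition_rate(runs: list[list[str]]) -> float:
        counts: Counter[str] = Counter()
        for run in runs:
            for name in set(run):   # count each name at most once per run
                counts[name] += 1
        if not counts:
            return 0.0
        repeated = sum(1 for c in counts.values() if c > 1)
        return repeated / len(counts)

    # Illustrative data: "pkg-a" recurs across runs, "pkg-b" appears once.
    runs = [["pkg-a"], ["pkg-a", "pkg-b"], ["pkg-a"], [], [], [], [], [], [], []]
    print(repetition_rate(runs))  # 0.5 -> half the distinct hallucinations persist
    ```

    A name that comes back run after run is exactly what an attacker wants: register it once, and the model keeps steering fresh victims toward it.
    
    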
