    Business

    A Lawsuit Against Perplexity Calls Out Fake News AI Hallucinations

By News Room | October 23, 2024 | 3 Mins Read

    Perplexity did not respond to requests for comment.

    In a statement emailed to WIRED, News Corp chief executive Robert Thomson compared Perplexity unfavorably to OpenAI. “We applaud principled companies like OpenAI, which understands that integrity and creativity are essential if we are to realize the potential of Artificial Intelligence,” the statement says. “Perplexity is not the only AI company abusing intellectual property and it is not the only AI company that we will pursue with vigor and rigor. We have made clear that we would rather woo than sue, but, for the sake of our journalists, our writers and our company, we must challenge the content kleptocracy.”

    OpenAI is facing its own accusations of trademark dilution, though. In New York Times v. OpenAI, the Times alleges that ChatGPT and Bing Chat will attribute made-up quotes to the Times, and accuses OpenAI and Microsoft of damaging its reputation through trademark dilution. In one example cited in the lawsuit, the Times alleges that Bing Chat claimed that the Times called red wine (in moderation) a “heart-healthy” food, when in fact it did not; the Times argues that its actual reporting has debunked claims about the healthfulness of moderate drinking.

    “Copying news articles to operate substitutive, commercial generative AI products is unlawful, as we made clear in our letters to Perplexity and our litigation against Microsoft and OpenAI,” says NYT director of external communications Charlie Stadtlander. “We applaud this lawsuit from Dow Jones and the New York Post, which is an important step toward ensuring that publisher content is protected from this kind of misappropriation.”

Some legal experts are unsure that the false designation of origin and trademark dilution charges will be fruitful. Intellectual property lawyer Vincent Allen, a partner at Carstens, Allen & Gourley, believes the copyright infringement claims in this lawsuit are stronger, and says he would “be surprised” if the false designation of origin charge stands. Both Allen and James Grimmelmann, a professor of digital and internet law at Cornell University, believe that a landmark trademark case, Dastar v. Twentieth Century Fox Film Corp., could stymie this line of attack. (In that ruling, which concerned a dispute over old World War II footage, the Supreme Court held that “origin” under trademark law does not refer to authorship but is limited to tangible goods—like a bootleg purse—rather than counterfeit creative work like films.) Grimmelmann is also skeptical that the trademark dilution claim will hold water. “Dilution involves the use of a trademark on one’s own goods or services in a way that impairs the distinctiveness of a famous mark. I … just don’t see that here,” he says.

If publishers prevail in arguing that hallucinations can violate trademark law, AI companies could face “immense difficulties,” according to Matthew Sag, a professor of law and artificial intelligence at Emory University.

“It is absolutely impossible to guarantee that a language model will not hallucinate,” Sag says. In his view, the way language models operate, by predicting words that sound correct in response to prompts, is always a type of hallucination; sometimes the output is simply more plausible-sounding than at other times.

    “We only call it a hallucination if it doesn’t match up with our reality, but the process is exactly the same whether we like the output or not.”
