Technology Mag

    Games

    Anyone Can Turn You Into an AI Chatbot. There’s Little You Can Do to Stop Them

By News Room · October 19, 2024 · 3 Mins Read

    Matthew Sag, a distinguished professor at Emory University who researches copyright and artificial intelligence, concurs. Even if a user creates a bot intentionally designed to cause emotional distress, the tech platform likely can’t be sued for that.

He points out that Section 230 of the Communications Decency Act of 1996 has long shielded platforms at the federal level from liability for certain harms to their users, even though various right-of-publicity and privacy laws exist at the state level.

    “I’m not an anti-tech person by any means, but I really think Section 230 is just massively overbroad,” Sag says. “It’s well past time we replaced it with some kind of notice and takedown regime, a simple expedient system to say, ‘This is infringing on my rights to publicity,’ or ‘I have a good faith belief that there’s been an infliction of emotional distress,’ and then the companies would either have to take it down or lose their liability shield.”

    Character.AI, and other AI services like it, have also protected themselves by emphasizing that they serve up “artificial” conversations. “Remember, everything characters say is made up!” Character.AI warns at the bottom of its chats. Similarly, when Meta created chatbot versions of celebs in its messaging apps, the company headlined every conversation with a disclaimer. A chat with Snoop, for example, would lead with “Ya dig?! Unfortunately, I’m not Snoop D-O-double-G himself, but I can chat with you in his style if you’d like!”

    But while Meta’s system for messaging with celebrity chatbots is tightly controlled, Character.AI’s is a more open platform, with options for anyone to create and customize their own chatbot.

    Character.AI has also positioned its service as, essentially, personal. (Character.AI’s Instagram bio includes the tagline, “AI that feels alive.”) And while most users may be savvy enough to distinguish between a real-person conversation and one with an AI impersonator, others may develop attachments to these characters—especially if they’re facsimiles of a real person they feel they already know.

    In a conversation between the real-life Sarkeesian and a bot made of her without her knowledge or consent, the Character.AI bot told her that “every person is entitled to privacy.”

    “Privacy is important for maintaining a healthy life and relationships, and I think it’s important to set boundaries to keep certain things to myself,” the bot said in screenshots viewed by WIRED.

    Sarkeesian pushed the bot on this point. “Your intentions does not mean that harm hasn’t happened or that you did not cause harm,” she wrote.

    Character.AI’s bot agreed. “Even if my intentions were not malicious, there is still potential for harm,” it replied. “This is a complex issue with many factors to consider, including ethical concerns about using someone’s work without their consent. My programming and algorithms were developed to mimic the works of Anita Sarkeesian, without considering ethical implications, and that’s something that my creators should have thought through more thoroughly.”

    © 2025 Technology Mag. All Rights Reserved.
