Technology Mag
    Games

    Anyone Can Turn You Into an AI Chatbot. There’s Little You Can Do to Stop Them

By News Room | October 19, 2024 | 3 Mins Read

    Matthew Sag, a distinguished professor at Emory University who researches copyright and artificial intelligence, concurs. Even if a user creates a bot intentionally designed to cause emotional distress, the tech platform likely can’t be sued for that.

He points out that Section 230 of the 1996 Communications Decency Act has long shielded platforms, at the federal level, from liability for certain harms to their users, even though various right-of-publicity and privacy laws exist at the state level.

    “I’m not an anti-tech person by any means, but I really think Section 230 is just massively overbroad,” Sag says. “It’s well past time we replaced it with some kind of notice and takedown regime, a simple expedient system to say, ‘This is infringing on my rights to publicity,’ or ‘I have a good faith belief that there’s been an infliction of emotional distress,’ and then the companies would either have to take it down or lose their liability shield.”

    Character.AI, and other AI services like it, have also protected themselves by emphasizing that they serve up “artificial” conversations. “Remember, everything characters say is made up!” Character.AI warns at the bottom of its chats. Similarly, when Meta created chatbot versions of celebs in its messaging apps, the company headlined every conversation with a disclaimer. A chat with Snoop, for example, would lead with “Ya dig?! Unfortunately, I’m not Snoop D-O-double-G himself, but I can chat with you in his style if you’d like!”

    But while Meta’s system for messaging with celebrity chatbots is tightly controlled, Character.AI’s is a more open platform, with options for anyone to create and customize their own chatbot.

    Character.AI has also positioned its service as, essentially, personal. (Character.AI’s Instagram bio includes the tagline, “AI that feels alive.”) And while most users may be savvy enough to distinguish between a real-person conversation and one with an AI impersonator, others may develop attachments to these characters—especially if they’re facsimiles of a real person they feel they already know.

    In a conversation between the real-life Sarkeesian and a bot made of her without her knowledge or consent, the Character.AI bot told her that “every person is entitled to privacy.”

    “Privacy is important for maintaining a healthy life and relationships, and I think it’s important to set boundaries to keep certain things to myself,” the bot said in screenshots viewed by WIRED.

    Sarkeesian pushed the bot on this point. “Your intentions does not mean that harm hasn’t happened or that you did not cause harm,” she wrote.

    Character.AI’s bot agreed. “Even if my intentions were not malicious, there is still potential for harm,” it replied. “This is a complex issue with many factors to consider, including ethical concerns about using someone’s work without their consent. My programming and algorithms were developed to mimic the works of Anita Sarkeesian, without considering ethical implications, and that’s something that my creators should have thought through more thoroughly.”
