    Security

    ‘AI Girlfriends’ Are a Privacy Nightmare

    By News Room • February 16, 2024 • 3 Mins Read

    You shouldn’t trust any answers a chatbot sends you. And you probably shouldn’t trust it with your personal information either. That’s especially true for “AI girlfriends” or “AI boyfriends,” according to new research.

    An analysis of 11 so-called romance and companion chatbots, published on Wednesday by the Mozilla Foundation, found a litany of security and privacy concerns with the bots. Collectively, the apps, which have been downloaded more than 100 million times on Android devices, gather huge amounts of people’s data; use trackers that send information to Google, Facebook, and companies in Russia and China; allow users to set weak passwords; and lack transparency about their ownership and the AI models that power them.

    Since OpenAI unleashed ChatGPT on the world in November 2022, developers have raced to deploy large language models and create chatbots that people can interact with and pay to subscribe to. The Mozilla research provides a glimpse into how this gold rush may have neglected people’s privacy, and into tensions between emerging technologies and how they gather and use data. It also indicates how people’s chat messages could be abused by hackers.

    Many “AI girlfriend” or romantic chatbot services look similar. They often feature AI-generated images of women which can be sexualized or sit alongside provocative messages. Mozilla’s researchers looked at a variety of chatbots including large and small apps, some of which purport to be “girlfriends.” Others offer people support through friendship or intimacy, or allow role-playing and other fantasies.

    “These apps are designed to collect a ton of personal information,” says Jen Caltrider, the project lead for Mozilla’s Privacy Not Included team, which conducted the analysis. “They push you toward role-playing, a lot of sex, a lot of intimacy, a lot of sharing.” For instance, screenshots from the EVA AI chatbot show text saying “I love it when you send me your photos and voice,” and asking whether someone is “ready to share all your secrets and desires.”

    Caltrider says there are multiple issues with these apps and websites. Many of the apps may not be clear about what data they are sharing with third parties, where they are based, or who creates them, Caltrider says, adding that some allow people to create weak passwords, while others provide little information about the AI they use. The apps analyzed all had different use cases and weaknesses.
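    The weak-password finding is the most concrete of these gaps. As an illustration only, not something drawn from the audited apps or from Mozilla’s methodology, a basic signup check that rejects short or low-variety passwords might look like the sketch below; the length and character-class thresholds are assumptions.

```python
import re

# Illustrative thresholds only; real policy choices vary by service.
MIN_LENGTH = 12
REQUIRED_CLASSES = 3  # of: lowercase, uppercase, digit, symbol

def password_is_acceptable(password: str) -> bool:
    """Reject passwords that are too short or use too few character classes."""
    if len(password) < MIN_LENGTH:
        return False
    classes = [
        re.search(r"[a-z]", password),
        re.search(r"[A-Z]", password),
        re.search(r"[0-9]", password),
        re.search(r"[^a-zA-Z0-9]", password),
    ]
    return sum(1 for c in classes if c is not None) >= REQUIRED_CLASSES

# A single-character password, the sort of thing a lax policy lets through, fails here.
print(password_is_acceptable("a"))                        # False
print(password_is_acceptable("correct-Horse-7battery"))   # True
```

    A check like this is trivial to add, which is part of why researchers treat its absence as a signal of how little attention a service pays to basic account security.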

    Take Romantic AI, a service that allows you to “create your own AI girlfriend.” Promotional images on its homepage depict a chatbot sending a message saying, “Just bought new lingerie. Wanna see it?” The app’s privacy documents, according to the Mozilla analysis, say it won’t sell people’s data. However, when the researchers tested the app, they found it “sent out 24,354 ad trackers within one minute of use.” Romantic AI, like most of the companies highlighted in Mozilla’s research, did not respond to WIRED’s request for comment. Other apps monitored had hundreds of trackers.
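    The tracker figure refers to outbound ad-tracking requests observed while the app was in use. As a rough illustration of how such a count is typically produced (this is not Mozilla’s methodology; the domain list, function name, and sample URLs below are placeholders), one can capture the app’s network traffic and tally requests whose hosts appear on a tracker blocklist:

```python
from urllib.parse import urlparse

# Hypothetical, tiny blocklist; real analyses use lists with thousands of
# entries from ad/tracker blocklist projects, not this placeholder.
TRACKER_DOMAINS = {"doubleclick.net", "graph.facebook.com", "app-measurement.com"}

def count_tracker_requests(request_urls: list[str]) -> int:
    """Count captured outbound requests whose host matches a known tracker domain."""
    hits = 0
    for url in request_urls:
        host = urlparse(url).hostname or ""
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
            hits += 1
    return hits

# Usage with a captured traffic log (here just a stand-in list of URLs):
captured = [
    "https://doubleclick.net/pixel?id=123",
    "https://api.example-chatbot.app/v1/messages",
    "https://app-measurement.com/a?event=open",
]
print(count_tracker_requests(captured))  # 2
```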

    In general, Caltrider says, the apps are not clear about what data they may share or sell, or exactly how they use some of that information. “The legal documentation was vague, hard to understand, not very specific—kind of boilerplate stuff,” Caltrider says, adding that this may reduce the trust people should have in the companies.
