Technology Mag
    Business

    The Dark Side of Open Source AI Image Generators

    By News Room | March 6, 2024 | 3 Mins Read

    Whether through the frowning high-definition face of a chimpanzee or a psychedelic, pink-and-red-hued doppelganger of himself, Reuven Cohen uses AI-generated images to catch people’s attention. “I’ve always been interested in art and design and video and enjoy pushing boundaries,” he says—but the Toronto-based consultant, who helps companies develop AI tools, also hopes to raise awareness of the technology’s darker uses.

    “It can also be specifically trained to be quite gruesome and bad in a whole variety of ways,” Cohen says. He’s a fan of the freewheeling experimentation that has been unleashed by open source image-generation technology. But that same freedom enables the creation of explicit images of women used for harassment.

    After nonconsensual, sexually explicit images of Taylor Swift recently spread on X, Microsoft added new controls to its image generator. Open source models, by contrast, can be commandeered by just about anyone and generally come without guardrails. Despite the efforts of some hopeful community members to deter exploitative uses, the open source free-for-all is near-impossible to control, experts say.

    “Open source has powered fake image abuse and nonconsensual pornography. That’s impossible to sugarcoat or qualify,” says Henry Ajder, who has spent years researching harmful use of generative AI.

    Ajder says that even as open source image-generation software has become a favorite of researchers, academics, and creatives like Cohen, it has also become the bedrock of deepfake porn. Some tools based on open source algorithms are purpose-built for salacious or harassing uses, such as “nudifying” apps that digitally remove women’s clothes in images.

    But many tools can serve both legitimate and harassing use cases. One popular open source face-swapping program is used by people in the entertainment industry and as the “tool of choice for bad actors” making nonconsensual deepfakes, Ajder says. High-resolution image generator Stable Diffusion, developed by startup Stability AI, is claimed to have more than 10 million users and comes with guardrails to prevent explicit image creation, as well as policies barring malicious use. But the company also open sourced a version of the image generator in 2022 that is customizable, and online guides explain how to bypass its built-in limitations.

    Meanwhile, smaller AI models known as LoRAs make it easy to tune a Stable Diffusion model to output images with a particular style, concept, or pose—such as a celebrity’s likeness or certain sexual acts. They are widely available on AI model marketplaces such as Civitai, a community-based site where users share and download models. There, one creator of a Taylor Swift plug-in has urged others not to use it “for NSFW images.” However, once downloaded, its use is out of its creator’s control. “The way that open source works means it’s going to be pretty hard to stop someone from potentially hijacking that,” says Ajder.

    If you want to create a fake of someone in a compromising position, this makes it simple.

    Reuven Cohen, AI consultant
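
    A LoRA is essentially a small file of weight adjustments that sits on top of a full Stable Diffusion checkpoint and nudges its output toward whatever the adapter was trained on. As a rough illustration of how little is involved, here is a minimal sketch using the open source Hugging Face diffusers library; the base checkpoint and adapter names are illustrative placeholders rather than specific tools referenced in this article, and the example applies an innocuous art-style adapter.

    # Minimal sketch: applying a LoRA style adapter to a Stable Diffusion
    # pipeline with the open source Hugging Face diffusers library. The
    # checkpoint and LoRA repository names below are illustrative placeholders.
    import torch
    from diffusers import StableDiffusionPipeline

    # Load a full Stable Diffusion checkpoint (several gigabytes of weights).
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    # Load a small LoRA adapter (typically tens of megabytes) on top of the
    # base weights. This single call is what retargets the model toward the
    # style, concept, or pose the adapter was trained on.
    pipe.load_lora_weights("some-user/watercolor-style-lora")  # hypothetical repo

    # Generate an image; the adapter biases the output toward its trained style.
    image = pipe("a city skyline at dusk, watercolor style",
                 num_inference_steps=30).images[0]
    image.save("skyline_watercolor.png")

    The asymmetry is the point: the adapter is a tiny fraction of the size of the base model and can be swapped in with a single call, which is part of why marketplaces like Civitai can host and circulate so many of them.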

    4chan, the image-based message board site with a reputation for chaotic moderation, is home to pages devoted to nonconsensual deepfake porn, WIRED found, made with openly available programs and AI models dedicated solely to sexual images. Message boards for adult images are littered with AI-generated nonconsensual nudes of real women, from porn performers to actresses like Cate Blanchett. WIRED also observed 4chan users sharing workarounds for creating NSFW images with OpenAI’s Dall-E 3.

    That kind of activity has inspired some users in communities dedicated to AI image-making, including on Reddit and Discord, to push back against the sea of pornographic and malicious images. Creators also worry about the software gaining a reputation for NSFW images and encourage others to report images depicting minors on Reddit and on model-hosting sites.
