Technology Mag

    News

    AI image training dataset found to include child sexual abuse imagery

By News Room · December 20, 2023 · 2 min read

Stanford researchers found that LAION-5B, a dataset used by Stable Diffusion creator Stability AI and by Google’s Imagen image generators, included at least 1,679 illegal images scraped from social media posts and popular adult websites.

The researchers began combing through the LAION dataset in September 2023 to determine how much child sexual abuse material (CSAM), if any, was present. Rather than viewing the images themselves, they worked with the images’ hashes, unique identifiers computed from each file, sending them to CSAM detection platforms such as PhotoDNA, with matches verified by the Canadian Centre for Child Protection.
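The hash-lookup step described above can be sketched roughly as follows. The blocklist entries, URLs, and use of MD5 here are illustrative assumptions only; production systems such as PhotoDNA rely on perceptual hashes that survive re-encoding and cropping, not cryptographic digests:

```python
import hashlib

# Hypothetical blocklist of known-bad image hashes (hex digests).
# Real detection platforms use robust perceptual hashes, not MD5;
# this sketch only illustrates the exact-match lookup step.
KNOWN_BAD_HASHES = {
    "d41d8cd98f00b204e9800998ecf8427e",  # placeholder entry
}

def hash_image(data: bytes) -> str:
    """Return the MD5 hex digest of raw image bytes."""
    return hashlib.md5(data).hexdigest()

def flag_matches(images: dict[str, bytes]) -> list[str]:
    """Return the URLs whose image hash appears in the blocklist."""
    return [url for url, data in images.items()
            if hash_image(data) in KNOWN_BAD_HASHES]
```

The key point, as in the Stanford study, is that matching can be done on identifiers alone, so researchers never need to store or view the underlying material.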

According to the LAION website, the dataset does not host the images themselves; it indexes the internet, storing links to images along with the alt text scraped alongside them.

LAION, the nonprofit that manages the dataset, told Bloomberg it has a “zero-tolerance” policy for harmful content and would temporarily take the datasets offline. Stability AI told the publication that it has guidelines against misuse of its platforms. The company said that while it trained its models on LAION-5B, it focused on a portion of the dataset and fine-tuned it for safety.

    Stanford’s researchers said the presence of CSAM does not necessarily influence the output of models trained on the dataset. Still, there’s always the possibility the model learned something from the images. 

    “The presence of repeated identical instances of CSAM is also problematic, particularly due to its reinforcement of images of specific victims,” the report said. 

    The researchers acknowledged it would be difficult to fully remove the problematic content, especially from the AI models trained on it. They recommended that models trained on LAION-5B, such as Stable Diffusion 1.5, “should be deprecated and distribution ceased where feasible.” Google released a new version of Imagen but has not made public which dataset it trained on. 

    US attorneys general have called on Congress to set up a committee to investigate the impact of AI on child exploitation and prohibit the creation of AI-generated CSAM. 

    © 2026 Technology Mag. All Rights Reserved.
