
    A key part of California’s online safety law for kids is still on hold after appeals court ruling

    By News Room | August 16, 2024 | 4 Mins Read

    The panel of judges on the Ninth Circuit Court of Appeals objected to a portion of the California Age-Appropriate Design Code Act — specifically, a requirement that online businesses “opine on and mitigate the risk that children may be exposed to harmful or potentially harmful materials online.” The rule “facially violates the First Amendment,” the appeals court concluded. As a result, it’s upholding a preliminary injunction on that portion of the law and related aspects.

    But it sent another part of the law back to the lower court to reconsider and vacated the rest of the preliminary injunction, saying it was unclear whether the rest of the law violated the First Amendment. The panel believes it’s “too early” to say whether the unconstitutional parts of the statute could feasibly be severed from the rest.

    The ruling, authored by Judge Milan Smith Jr., singles out the design code’s Data Protection Impact Assessment (DPIA) requirement. The DPIA would compel online businesses to craft reports on whether their designs could harm kids and “create a timed plan to mitigate or eliminate the risk[s].” Smith determined this would likely fail First Amendment scrutiny. California “could have easily employed less restrictive means to accomplish its protective goals,” he wrote, including incentives for voluntary content filters, education for children and parents, and the enforcement of existing criminal laws.

    Instead, he added, the state’s law “attempts to indirectly censor the material available to children online, by delegating the controversial question of what content may be ‘harmful to children’ to the companies themselves.”

    That could be an ominous sign for other legislation like the Kids Online Safety Act (KOSA), which recently passed the Senate 91–3. KOSA demands platforms take reasonable steps to protect kids from certain kinds of harms, including mental health disorders like anxiety and depression.

    Still, the judges ruled that other parts of the Age-Appropriate Design Code Act may not violate the First Amendment in every possible application of the law. Smith pointed to provisions like the ban on dark patterns that encourage kids to hand over more information than is necessary to operate the service. “Based on the record developed so far in this litigation, it is unclear whether a ‘dark pattern’ itself constitutes protected speech and whether a ban on using ‘dark patterns’ should always trigger First Amendment scrutiny, and the district court never grappled with this question.”

    Smith’s ruling also said that the district court should have evaluated more closely whether other parts of the law could be upheld when applied to non-social media companies covered by the bill.

    The ruling is the latest relative victory in NetChoice’s string of lawsuits against state-level internet regulations, including laws aimed at protecting children online. Courts have agreed with many of the First Amendment arguments that the group, which represents companies like Meta and Google, has made against such laws.

    It’s also significant as it comes after an instructive Supreme Court ruling earlier this year in Moody v. NetChoice, which affirmed that content moderation and curation by platforms is protected speech. The justices expressed skepticism about bringing facial challenges — which assert that any possible application of a law is unconstitutional — under the First Amendment in such cases. Even so, Smith wrote that the DPIA requirement of California’s law is facially unconstitutional because, “in every application to a covered business, [it] raises the same First Amendment issues.”

    The California attorney general’s office did not immediately respond to a request for comment. NetChoice Litigation Center director Chris Marchese called the ruling “a victory for free expression, online security and Californian families.” He added, “The court recognized that California’s government cannot commandeer private businesses to censor lawful content online or to restrict access to it.”
