Technology Mag

News
    Anthropic has new rules for a more dangerous AI landscape

By News Room | August 15, 2025 | 2 min read
    Anthropic has updated the usage policy for its Claude AI chatbot in response to growing concerns about safety. In addition to introducing stricter cybersecurity rules, Anthropic now specifies some of the most dangerous weapons that people should not develop using Claude.

Anthropic doesn’t highlight the tweaks made to its weapons policy in the post summarizing its changes, but a comparison between the company’s old usage policy and its new one reveals a notable difference. Though Anthropic previously prohibited the use of Claude to “produce, modify, design, market, or distribute weapons, explosives, dangerous materials or other systems designed to cause harm to or loss of human life,” the updated version expands on this by specifically prohibiting the development of high-yield explosives, along with chemical, biological, radiological, and nuclear (CBRN) weapons.

    In May, Anthropic implemented “AI Safety Level 3” protection alongside the launch of its new Claude Opus 4 model. The safeguards are designed to make the model more difficult to jailbreak, as well as to help prevent it from assisting with the development of CBRN weapons.

    In its post, Anthropic also acknowledges the risks posed by agentic AI tools, including Computer Use, which lets Claude take control of a user’s computer, as well as Claude Code, a tool that embeds Claude directly into a developer’s terminal. “These powerful capabilities introduce new risks, including potential for scaled abuse, malware creation, and cyber attacks,” Anthropic writes.

    The AI startup is responding to these potential risks by folding a new “Do Not Compromise Computer or Network Systems” section into its usage policy. This section includes rules against using Claude to discover or exploit vulnerabilities, create or distribute malware, develop tools for denial-of-service attacks, and more.

Additionally, Anthropic is loosening its policy around political content. Instead of banning the creation of all kinds of content related to political campaigns and lobbying, Anthropic will now only prohibit people from using Claude for “use cases that are deceptive or disruptive to democratic processes, or involve voter and campaign targeting.” The company also clarified that its requirements for all of its “high-risk” use cases, which come into play when people use Claude to make recommendations to individuals or customers, apply only to consumer-facing scenarios, not to business use.

© 2026 Technology Mag. All Rights Reserved.