Technology Mag

    Business

    US Lawmakers Tell DOJ to Quit Blindly Funding ‘Predictive’ Police Tools

By News Room | January 31, 2024 | 3 Mins Read

    The United States Department of Justice has failed to convince a group of US lawmakers that state and local police agencies aren’t awarded federal grants to buy AI-based “policing” tools known to be inaccurate, if not prone to exacerbating biases long observed in US police forces.

    Seven members of Congress wrote in a letter to the DOJ, first obtained by WIRED, that the information they pried loose from the agency had only served to inflame their concerns about the DOJ’s police grant program. Nothing in its responses so far, the lawmakers said, indicates the government has bothered to investigate whether departments awarded grants bought discriminatory policing software.

    “We urge you to halt all Department of Justice grants for predictive policing systems until the DOJ can ensure that grant recipients will not use such systems in ways that have a discriminatory impact,” the letter reads. The Justice Department previously acknowledged that it had not kept track of whether police departments were using the funding, awarded under the Edward Byrne Memorial Justice Assistance Grant Program, to purchase so-called predictive policing tools.

    Led by Senator Ron Wyden, a Democrat of Oregon, the lawmakers say the DOJ is required by law to “periodically review” whether grant recipients comply with Title VI of the nation’s Civil Rights Act. The DOJ is patently forbidden, they explain, from funding programs shown to discriminate on the basis of race, ethnicity, or national origin, whether that outcome is intentional or not.

    Independent investigations in the press have found that popular “predictive” policing tools trained on historical crime data often replicate long-held biases, offering law enforcement, at best, a veneer of scientific legitimacy while perpetuating the over-policing of predominantly Black and Latino neighborhoods. An October headline from The Markup states bluntly: “Predictive Policing Software Terrible At Predicting Crimes.” The story recounts how researchers at the publication recently examined 23,631 police crime predictions—and found them accurate roughly 1 percent of the time.

    “Predictive policing systems rely on historical data distorted by falsified crime reports and disproportionate arrests of people of color,” Wyden and the other lawmakers wrote, predicting—as many researchers have—that the technology serves only to create “dangerous” feedback loops. The statement notes that “biased predictions are used to justify disproportionate stops and arrests in minority neighborhoods,” further biasing statistics on where crimes occur.

    Senators Jeffrey Merkley, Ed Markey, Alex Padilla, Peter Welch, and John Fetterman also cosigned the letter, as did Representative Yvette Clarke.

    The lawmakers have requested that an upcoming presidential report on policing and artificial intelligence investigate the use of predictive policing tools in the US. “The report should assess the accuracy and precision of predictive policing models across protected classes, their interpretability, and their validity,” to include, they added, “any limits on assessing their risks posed by a lack of transparency from the companies developing them.”

    Should the DOJ wish to continue funding the technology after this assessment, the lawmakers say, it should at least establish “evidence standards” to determine which predictive models are discriminatory—and then reject funding for all those that fail to live up to them.

