    Cops Used DNA to Predict a Suspect’s Face—and Tried to Run Facial Recognition on It

By News Room | January 25, 2024 | 3 Mins Read

    In 2017, detectives working a cold case at the East Bay Regional Park District Police Department got an idea, one that might help them finally get a lead on the murder of Maria Jane Weidhofer. Officers had found Weidhofer, dead and sexually assaulted, at Berkeley, California’s Tilden Regional Park in 1990. Nearly 30 years later, the department sent genetic information collected at the crime scene to Parabon NanoLabs—a company that says it can turn DNA into a face.

    Parabon NanoLabs ran the suspect’s DNA through its proprietary machine learning model. Soon, it provided the police department with something the detectives had never seen before: the face of a potential suspect, generated using only crime scene evidence.

The image Parabon NanoLabs produced, called a Snapshot Phenotype Report, wasn’t a photograph. It was a 3D rendering that bridges the uncanny valley between reality and science fiction: a representation of how the company’s algorithm predicted a person could look given genetic attributes found in the DNA sample.

    The face of the murderer, the company predicted, was male. He had fair skin, brown eyes and hair, no freckles, and bushy eyebrows. A forensic artist employed by the company photoshopped a nondescript, close-cropped haircut onto the man and gave him a mustache—an artistic addition informed by a witness description and not the DNA sample.

    In a controversial 2017 decision, the department published the predicted face in an attempt to solicit tips from the public. Then, in 2020, one of the detectives did something civil liberties experts say is even more problematic—and a violation of Parabon NanoLabs’ terms of service: He asked to have the rendering run through facial recognition software.

    “Using DNA found at the crime scene, Parabon Labs reconstructed a possible suspect’s facial features,” the detective explained in a request for “analytical support” sent to the Northern California Regional Intelligence Center, a so-called fusion center that facilitates collaboration among federal, state, and local police departments. “I have a photo of the possible suspect and would like to use facial recognition technology to identify a suspect/lead.”

    The detective’s request to run a DNA-generated estimation of a suspect’s face through facial recognition tech has not previously been reported. Found in a trove of hacked police records published by the transparency collective Distributed Denial of Secrets, it appears to be the first known instance of a police department attempting to use facial recognition on a face algorithmically generated from crime-scene DNA.

    It likely won’t be the last.

    For facial recognition experts and privacy advocates, the East Bay detective’s request, while dystopian, was also entirely predictable. It emphasizes the ways that, without oversight, law enforcement is able to mix and match technologies in unintended ways, using untested algorithms to single out suspects based on unknowable criteria.

    “It’s really just junk science to consider something like this,” Jennifer Lynch, general counsel at civil liberties nonprofit the Electronic Frontier Foundation, tells WIRED. Running facial recognition with unreliable inputs, like an algorithmically generated face, is more likely to misidentify a suspect than provide law enforcement with a useful lead, she argues. “There’s no real evidence that Parabon can accurately produce a face in the first place,” Lynch says. “It’s very dangerous, because it puts people at risk of being a suspect for a crime they didn’t commit.”
