Technology Mag
News

    Report: Israel used AI to identify bombing targets in Gaza

By News Room · April 4, 2024 · 6 Mins Read

    Israel’s military has been using artificial intelligence to help choose its bombing targets in Gaza, sacrificing accuracy in favor of speed and killing thousands of civilians in the process, according to an investigation by Israel-based publications +972 Magazine and Local Call.

    The system, called Lavender, was developed in the aftermath of Hamas’ October 7th attacks, the report claims. At its peak, Lavender marked 37,000 Palestinians in Gaza as suspected “Hamas militants” and authorized their assassinations.

    Israel’s military denied the existence of such a kill list in a statement to +972 and Local Call. A spokesperson told CNN that AI was not being used to identify suspected terrorists but did not dispute the existence of the Lavender system, which the spokesperson described as “merely tools for analysts in the target identification process.” Analysts “must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in IDF directives,” the spokesperson told CNN. The Israel Defense Forces did not immediately respond to The Verge’s request for comment.

    In interviews with +972 and Local Call, however, Israeli intelligence officers said they weren’t required to conduct independent examinations of the Lavender targets before bombing them but instead effectively served as “a ‘rubber stamp’ for the machine’s decisions.” In some instances, officers’ only role in the process was determining whether a target was male. 

    Choosing targets

    To build the Lavender system, information on known Hamas and Palestinian Islamic Jihad operatives was fed into a dataset — but, according to one source who worked with the data science team that trained Lavender, so was data on people loosely affiliated with Hamas, such as employees of Gaza’s Internal Security Ministry. “I was bothered by the fact that when Lavender was trained, they used the term ‘Hamas operative’ loosely, and included people who were civil defense workers in the training dataset,” the source told +972. 

    Lavender was trained to identify “features” associated with Hamas operatives, including being in a WhatsApp group with a known militant, changing cellphones every few months, or changing addresses frequently. That data was then used to rank other Palestinians in Gaza on a 1–100 scale based on how similar they were to the known Hamas operatives in the initial dataset. People who reached a certain threshold were then marked as targets for strikes. That threshold was always changing “because it depends on where you set the bar of what a Hamas operative is,” one military source told +972.

    The system had a 90 percent accuracy rate, sources said, meaning that about 10 percent of the people identified as Hamas operatives weren’t members of Hamas’ military wing at all. Some of the people Lavender flagged as targets just happened to have names or nicknames identical to those of known Hamas operatives; others were Hamas operatives’ relatives or people who used phones that had once belonged to a Hamas militant. “Mistakes were treated statistically,” a source who used Lavender told +972. “Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know statistically that it’s fine. So you go for it.”

    Collateral damage

    Intelligence officers were given wide latitude when it came to civilian casualties, sources told +972. During the first few weeks of the war, officers were allowed to kill up to 15 or 20 civilians for every lower-level Hamas operative targeted by Lavender; for senior Hamas officials, the military authorized “hundreds” of collateral civilian casualties, the report claims. 

    Suspected Hamas operatives were also targeted in their homes using a system called “Where’s Daddy?” officers told +972. That system put targets generated by Lavender under ongoing surveillance, tracking them until they reached their homes — at which point, they’d be bombed, often alongside their entire families, officers said. At times, however, officers would bomb homes without verifying that the targets were inside, wiping out scores of civilians in the process. “It happened to me many times that we attacked a house, but the person wasn’t even home,” one source told +972. “The result is that you killed a family for no reason.”

    AI-driven warfare

    Mona Shtaya, a non-resident fellow at the Tahrir Institute for Middle East Policy, told The Verge that the Lavender system is an extension of Israel’s use of surveillance technologies on Palestinians in both the Gaza Strip and the West Bank. 

    Shtaya, who is based in the West Bank, told The Verge that these tools are particularly troubling in light of reports that Israeli defense startups are hoping to export their battle-tested technology abroad. 

    Since Israel’s ground offensive in Gaza began, the Israeli military has relied on and developed a host of technologies to identify and target suspected Hamas operatives. In March, The New York Times reported that Israel deployed a mass facial recognition program in the Gaza Strip — creating a database of Palestinians without their knowledge or consent — which the military then used to identify suspected Hamas operatives. In one instance, the facial recognition tool identified Palestinian poet Mosab Abu Toha as a suspected Hamas operative. Abu Toha was detained for two days in an Israeli prison, where he was beaten and interrogated before being returned to Gaza.

    Another AI system, called “The Gospel,” was used to mark buildings or structures that Hamas is believed to operate from. According to a +972 and Local Call report from November, The Gospel also contributed to vast numbers of civilian casualties. “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed — that it was a price worth paying in order to hit [another] target,” a military source told the publications at the time.

    “We need to look at this as a continuation of the collective punishment policies that have been weaponized against Palestinians for decades now,” Shtaya said. “We need to make sure that war times are not used to justify the mass surveillance and mass killing of people, especially civilians, in places like Gaza.”

© 2025 Technology Mag. All Rights Reserved.