Israel is using an AI system to find targets in Gaza. Experts say it's just the start

Smoke billows after an Israeli strike on north Gaza on November 22, 2023. Israel says it is using artificial intelligence to find targets. (John MacDougall / AFP via Getty Images)

The pace is astonishing: In the wake of the brutal attacks by Hamas-led militants on October 7, Israeli forces have struck more than 22,000 targets inside Gaza, a small strip of land along the Mediterranean coast. Just since the temporary truce broke down on December 1, Israel's Air Force has hit more than 3,500 sites.

The Israeli military says it's using artificial intelligence to select many of these targets in real-time. The military claims that the AI system, named "the Gospel," has helped it to rapidly identify enemy combatants and equipment, while reducing civilian casualties.

But critics warn the system is unproven at best — and at worst, providing a technological justification for the killing of thousands of Palestinian civilians.

"It appears to be an attack aimed at maximum devastation of the Gaza Strip," says Lucy Suchman, an anthropologist and professor emeritus at Lancaster University in England who studies military technology. If the AI system is really working as claimed by Israel's military, "how do you explain that?" she asks.

Other experts question whether any AI can take on a job as consequential as targeting humans on the battlefield.

"AI algorithms are notoriously flawed with high error rates observed across applications that require precision, accuracy, and safety," warns Heidy Khlaaf, Engineering Director of AI Assurance at Trail of Bits, a technology security firm.


Despite these concerns, most experts agree that this is the beginning of a new phase in the use of AI in warfare. Algorithms can sift through mounds of intelligence data far faster than human analysts, says Robert Ashley, a former head of the U.S. Defense Intelligence Agency. Using AI to assist with targeting has the potential to give commanders an enormous edge.

"You're going to make decisions faster than your opponent, that's really what it's about," he says.

Here's what's known about the Gospel, and what it means for the future of warfare.

The AI targeting system Israel uses can generate targets at a rapid rate

Militaries all over the world have been experimenting with AI for more than a decade, according to Anthony King, professor of defense and security studies at the University of Exeter in England.

"The attraction is clear," he says. Most modern militaries are shrinking in size, and need technology to help bridge the gap. AI systems can help them search enormous quantities of intelligence data to try and find the enemy.

According to posts on the Israeli military's website, the Gospel was developed by Israel's signals intelligence branch, known as Unit 8200. The system is relatively new — one of the earliest mentions was a top innovation award that it won in 2020.

An Israel Defense Forces artillery unit fires towards Gaza near the border on December 11, 2023 in Southern Israel. Israel's AI system can be used to send targets to forces on land, air and sea. (Alexi J. Rosenfeld / Getty Images)

The Gospel is actually one of several AI programs being used by Israeli intelligence, according to Tal Mimran, a lecturer at Hebrew University in Jerusalem who has worked for the Israeli government on targeting during previous military operations. Other AI systems aggregate vast quantities of intelligence data and classify it. The last system in the chain is the Gospel, which makes a targeting recommendation to a human analyst. Those targets could be anything from individual fighters to equipment like rocket launchers to facilities such as Hamas command posts.

"Basically Gospel imitates what a group of intelligence officers used to do in the past," he says.

But the Gospel is much more efficient. He says a group of 20 officers might produce 50-100 targets in 300 days. By comparison, Mimran estimates that the Gospel and its associated AI systems can suggest around 200 targets "within 10-12 days" — a rate that's at least 50 times faster.
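
As a quick sanity check on those figures (this is back-of-the-envelope arithmetic using only Mimran's own estimates; nothing below comes from additional reporting):

```python
# Rough rate comparison based on Mimran's estimates.
human_targets_per_day = 100 / 300   # 20 officers: ~50-100 targets over ~300 days (upper bound)
gospel_targets_per_day = 200 / 12   # Gospel and related systems: ~200 targets in 10-12 days (lower bound)

print(f"Human team: ~{human_targets_per_day:.2f} targets/day")                 # ~0.33
print(f"Gospel:     ~{gospel_targets_per_day:.1f} targets/day")                # ~16.7
print(f"Speed-up:   ~{gospel_targets_per_day / human_targets_per_day:.0f}x")   # ~50x
```

Even taking the high end of the human estimate against the low end of the Gospel's, the system works out to roughly 50 times faster, which is where the "at least 50 times" figure comes from.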

Although it's not known exactly what data the Gospel uses to make its suggestions, it likely comes from a wide variety of different sources. The list includes things like cell phone messages, satellite imagery, drone footage and even seismic sensors, according to Blaise Misztal, vice president for policy at the Jewish Institute for National Security of America, a group that facilitates military cooperation between Israel and the United States.

Misztal's group documented one of the first trials of the Gospel, during a 2021 conflict in Gaza between Israel and the militant groups Hamas and Islamic Jihad. According to press reports and statements from the military itself, Israel used the Gospel and other AI programs to identify likely targets such as rocket launchers. The system was used to identify static targets as well as moving targets as they appeared on the battlefield. According to press reports, it identified around 200 targets in the conflict.

But it was not without its problems. The after-action report by Misztal's group noted that while the AI had plenty of training data for what constituted a target, it lacked examples of things that human analysts had decided were not targets. The Israeli military hadn't kept records of the candidate targets its analysts discarded, and as a result the system's training data was biased.
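
A toy sketch makes the problem concrete (this is purely illustrative and assumes nothing about how the Gospel is actually built; the numbers are invented): if the only examples ever recorded are confirmed targets, a model trained on that data has no notion of what a non-target looks like.

```python
from collections import Counter

# Hypothetical training set: confirmed targets were kept, but candidates
# that analysts rejected were discarded, so no negative examples exist.
training_labels = ["target"] * 500

counts = Counter(training_labels)
p_target = counts["target"] / len(training_labels)
print(f"Fraction of training examples labeled 'target': {p_target:.2f}")  # 1.00
# A classifier fit to this data learns a prior that every candidate is a target;
# it has never been shown anything it should reject.
```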

"It's been two years since then, so it's something that, hopefully, they've been able to rectify," Misztal says.

In this latest conflict, Israel is using its AI on a scale that hasn't been seen before.

Israel's latest military operation in Gaza began in response to the October 7 attack that killed roughly 1,200 people, according to the Israeli government. The military says that it is trying to eliminate the threat from Hamas and rescue hostages. It says Hamas has complicated the fight by using civilians as human shields and operating in tunnels under civilian areas.

Israel says that artificial intelligence is allowing it to precisely target Hamas infrastructure. Independent researchers say roughly one in three buildings in Gaza have been damaged or destroyed. (Yousef Masoud / AP)

A brief blog post by the Israeli military on November 2 lays out how the Gospel is being used in the current conflict. According to the post, the military's Directorate of Targets is using the Gospel to rapidly produce targets based on the latest intelligence. The system provides a targeting recommendation for a human analyst who then decides whether to pass it along to soldiers in the field.

"This isn't just an automatic system," Misztal emphasizes. "If it thinks it finds something that could be a potential target, that's flagged then for an intelligence analyst to review."

The post states that the targeting division is able to send these targets to the air force and navy, and directly to ground forces via an app known as "Pillar of Fire," which commanders carry on military-issued smartphones and other devices.

King believes this conflict may be the first time that AI-generated targets are being rolled out on a large scale to try and influence a military operation. "This latest war in Gaza is going to proliferate this kind of technique out to the regular army," he says.

It's unclear exactly how many targets from the Gospel have been acted upon, but the Israeli military says it is currently striking as many as 250 targets a day.

Critics question whether the AI is performing as advertised

The Israeli military did not respond directly to NPR's inquiries about the Gospel. In the November 2 post, it said the system allows the military to "produce targets for precise attacks on infrastructures associated with Hamas, while causing great damage to the enemy and minimal harm to those not involved," according to an unnamed spokesperson.

But critics question whether the Gospel and other associated AI systems are in fact performing as the military claims. Khlaaf notes that artificial intelligence depends entirely on training data to make its decisions.

"The nature of AI systems is to provide outcomes based on statistical and probabilistic inferences and correlations from historical data, and not any type of reasoning, factual evidence, or 'causation,'" she says. "Given the track record of high error-rates of AI systems, imprecisely and biasedly automating targets is really not far from indiscriminate targeting."

Some accusations about the Gospel go further. A report by the Israeli publication +972 Magazine and the Hebrew-language outlet Local Call asserts that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population.

People mourn as they collect the bodies of Palestinians killed in an airstrike on December 13, 2023 in Khan Yunis, Gaza. More than 18,000 Palestinians have been killed, the majority of whom are women and children. (Ahmad Hasaballah / Getty Images)

NPR has not independently verified those claims, and it's unclear how many targets are currently being generated by AI alone. But there has been a substantial increase in targeting, according to the Israeli military's own numbers. In the 2021 conflict, Israel said it struck 1,500 targets in Gaza, approximately 200 of which came from the Gospel. Since October 7, the military says it has struck more than 22,000 targets inside Gaza — a daily rate more than double that of the 2021 conflict.
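
The "more than double" comparison can be roughed out as follows (the strike counts are the military's own; the durations are assumptions on my part: the May 2021 conflict ran roughly 11 days, and the 22,000-strike figure covers roughly 68 days from October 7 to mid-December):

```python
# Rough daily-rate comparison using the Israeli military's own strike counts.
# Assumptions not taken from the article: ~11 days for the May 2021 conflict,
# ~68 days from Oct. 7 to mid-December for the current campaign.
strikes_2021, days_2021 = 1_500, 11
strikes_2023, days_2023 = 22_000, 68

rate_2021 = strikes_2021 / days_2021   # ~136 strikes per day
rate_2023 = strikes_2023 / days_2023   # ~324 strikes per day
print(f"2021: ~{rate_2021:.0f}/day  2023: ~{rate_2023:.0f}/day  ratio: ~{rate_2023 / rate_2021:.1f}x")
```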

The toll on Palestinian civilians has been enormous. More than 18,000 Palestinians have died so far, the majority of whom are women and children, according to the Gaza health ministry. One in three buildings in Gaza have been damaged or destroyed, according to an independent analysis by Corey Scher of New York's CUNY Graduate Center and Jamon Van Den Hoek of Oregon State University.

The huge volume of targets is also likely putting pressure on the humans asked to review them, says Suchman. "In the face of this kind of acceleration, those reviews become more and more constrained in terms of what kind of judgment people can actually exercise," she says.

Mimran adds that, under pressure, analysts will be more likely to accept the AI's targeting recommendations, regardless of whether they are correct. Targeting officers may be tempted to think that "life will be much easier if we flow with the machine and accept its advice and recommendations," he says. But it could create a "whole new level of problems" if the machine is systematically misidentifying targets.

Finally, Khlaaf points out that the use of AI could make it more difficult to pursue accountability for those involved in the conflict. Although humans still retain the legal culpability for strikes, it's unclear who is responsible if the targeting system fails. Is it the analyst who accepted the AI recommendation? The programmers who made the system? The intelligence officers who gathered the training data?

"Given the lack of explainability of the decisions output by AI systems, and the sheer scale and complexity of these models, it then becomes impossible to trace decisions to specific design points that can hold any individual or military accountable," she says.

Despite the criticisms, the Israeli system is likely a taste of what's to come

While Israel's use of the Gospel to generate a full set of targets may be unique, the nation is hardly alone in using AI to assist in intelligence analysis. The U.S. is actively working with many different kinds of AI to try and identify targets in the field. One suite of AI tools, known as Project Maven, is run through the National Geospatial-Intelligence Agency, which collects massive quantities of satellite imagery — far more than a human analyst could search.

A December 3 satellite image shows Palestinian civilians sheltering around a college in the Gaza city of Khan Yunis. AI systems can analyze such imagery faster than a person. (Maxar Technologies)

Ashley wouldn't comment on any particular AI tool used by the U.S. intelligence community, but he says often these systems will stitch together multiple layers of AI. Some excel at finding objects in images while others can sort through things like radio transmissions. Ashley says that like the Israeli system, the U.S. has human analysts and commanders making the final decisions about what to strike.

He believes other nations are working on similar systems. "You know the Russians are doing it, you know the Chinese are doing it," he says.

Ashley sees the proliferation of AI for this kind of work as inevitable, in part because it can be done with off-the-shelf commercial computer-vision algorithms. "Rather than using the algorithm to find a particular widget, I'm looking for a tank or an anti-aircraft gun," he says. "It's dual-use technology."

King believes such targeting algorithms are an intermediate step toward autonomous systems that will eventually be deployed to the battlefield. Those robotic systems will likely be able to identify and kill targets with little or no human intervention. King says that's likely to make future combat even faster and deadlier, but the nature of war will remain the same.

"Human combat teams augmented by really lethal weapons to fight these hideous kinds of medieval fights," he says. "That's where I see warfare going."

NPR's Majd Al-Waheidi, Alon Avital, Greg Myre and Daniel Wood contributed to this report.

Copyright 2024 NPR. To see more, visit https://www.npr.org.

Geoff Brumfiel works as a senior editor and correspondent on NPR's science desk. His editing duties include science and space, while his reporting focuses on the intersection of science and national security.