International



  


53606


Date: April 03, 2024 at 10:26:07
From: akira, [DNS_Address]
Subject: ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

URL: https://www.972mag.com/lavender-ai-israeli-army-gaza/


The Lavender machine joins another AI system, “The Gospel,” about
which information was revealed in a previous investigation by +972 and Local
Call ...


excerpt, this is long

‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza

The Israeli army has marked tens of thousands of Gazans as suspects for
assassination, using an AI targeting system with little human oversight and a
permissive policy for casualties, +972 and Local Call reveal.

Yuval Abraham - April 3, 2024

In 2021, a book titled “The Human-Machine Team: How to Create Synergy
Between Human and Artificial Intelligence That Will Revolutionize Our World”
was released in English under the pen name “Brigadier General Y.S.” In it, the
author — a man who we confirmed to be the current commander of the elite
Israeli intelligence unit 8200 — makes the case for designing a special
machine that could rapidly process massive amounts of data to generate
thousands of potential “targets” for military strikes in the heat of a war. Such
technology, he writes, would resolve what he described as a “human
bottleneck for both locating the new targets and decision-making to approve
the targets.”

Such a machine, it turns out, actually exists. A new investigation by +972
Magazine and Local Call reveals that the Israeli army has developed an
artificial intelligence-based program known as “Lavender,” unveiled here for
the first time. According to six Israeli intelligence officers, who have all served
in the army during the current war on the Gaza Strip and had first-hand
involvement with the use of AI to generate targets for assassination,
Lavender has played a central role in the unprecedented bombing of
Palestinians, especially during the early stages of the war. In fact, according
to the sources, its influence on the military’s operations was such that they
essentially treated the outputs of the AI machine “as if it were a human
decision.”

Formally, the Lavender system is designed to mark all suspected operatives
in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including
low-ranking ones, as potential bombing targets. The sources told +972 and
Local Call that, during the first weeks of the war, the army almost completely
relied on Lavender, which clocked as many as 37,000 Palestinians as
suspected militants — and their homes — for possible air strikes.

During the early stages of the war, the army gave sweeping approval for
officers to adopt Lavender’s kill lists, with no requirement to thoroughly
check why the machine made those choices or to examine the raw
intelligence data on which they were based. One source stated that human
personnel often served only as a “rubber stamp” for the machine’s decisions,
adding that, normally, they would personally devote only about “20 seconds”
to each target before authorizing a bombing — just to make sure the
Lavender-marked target is male. This was despite knowing that the system
makes what are regarded as “errors” in approximately 10 percent of cases,
and is known to occasionally mark individuals who have merely a loose
connection to militant groups, or no connection at all.

Moreover, the Israeli army systematically attacked the targeted individuals
while they were in their homes — usually at night while their whole families
were present — rather than during the course of military activity. According
to the sources, this was because, from what they regarded as an intelligence
standpoint, it was easier to locate the individuals in their private houses.
Additional automated systems, including one called “Where’s Daddy?” also
revealed here for the first time, were used specifically to track the targeted
individuals and carry out bombings when they had entered their family’s
residences.

Palestinians transport the wounded and try to put out a fire after an Israeli
airstrike on a house in the Shaboura refugee camp in the city of Rafah,
southern Gaza Strip, November 17, 2023. (Abed Rahim Khatib/Flash90)

The result, as the sources testified, is that thousands of Palestinians — most
of them women and children or people who were not involved in the fighting
— were wiped out by Israeli airstrikes, especially during the first weeks of the
war, because of the AI program’s decisions.

“We were not interested in killing [Hamas] operatives only when they were in
a military building or engaged in a military activity,” A., an intelligence officer,
told +972 and Local Call. “On the contrary, the IDF bombed them in homes
without hesitation, as a first option. It’s much easier to bomb a family’s home.
The system is built to look for them in these situations.”

The Lavender machine joins another AI system, “The Gospel,” about which
information was revealed in a previous investigation by +972 and Local Call in
November 2023, as well as in the Israeli military’s own publications. A
fundamental difference between the two systems is in the definition of the
target: whereas The Gospel marks buildings and structures that the army
claims militants operate from, Lavender marks people — and puts them on a
kill list.

In addition, according to the sources, when it came to targeting alleged junior
militants marked by Lavender, the army preferred to only use unguided
missiles, commonly known as “dumb” bombs (in contrast to “smart”
precision bombs), which can destroy entire buildings on top of their
occupants and cause significant casualties. “You don’t want to waste
expensive bombs on unimportant people — it’s very expensive for the
country and there’s a shortage [of those bombs],” said C., one of the
intelligence officers. Another source said that they had personally authorized
the bombing of “hundreds” of private homes of alleged junior operatives
marked by Lavender, with many of these attacks killing civilians and entire
families as “collateral damage.”

In an unprecedented move, according to two of the sources, the army also
decided during the first weeks of the war that, for every junior Hamas
operative that Lavender marked, it was permissible to kill up to 15 or 20
civilians; in the past, the military did not authorize any “collateral damage”
during assassinations of low-ranking militants. The sources added that, in
the event that the target was a senior Hamas official with the rank of
battalion or brigade commander, the army on several occasions authorized
the killing of more than 100 civilians in the assassination of a single
commander.

Palestinians wait to receive the bodies of their relatives who were killed in an
Israeli airstrike, at Al-Najjar Hospital in Rafah, southern Gaza Strip, October
24, 2023. (Abed Rahim Khatib/Flash90)

The following investigation is organized according to the six chronological
stages of the Israeli army’s highly automated target production in the early
weeks of the Gaza war. First, we explain the Lavender machine itself, which
marked tens of thousands of Palestinians using AI. Second, we reveal the
“Where’s Daddy?” system, which tracked these targets and signaled to the
army when they entered their family homes. Third, we describe how “dumb”
bombs were chosen to strike these homes.

Fourth, we explain how the army loosened the permitted number of civilians
who could be killed during the bombing of a target. Fifth, we note how
automated software inaccurately calculated the number of non-combatants
in each household. And sixth, we show how on several occasions, when a
home was struck, usually at night, the individual target was sometimes not
inside at all, because military officers did not verify the information in real
time.

STEP 1: GENERATING TARGETS
‘Once you go automatic, target generation goes crazy’
In the Israeli army, the term “human target” referred in the past to a senior
military operative who, according to the rules of the military’s International
Law Department, can be killed in their private home even if there are civilians
around. Intelligence sources told +972 and Local Call that during Israel’s
previous wars, since this was an “especially brutal” way to kill someone —
often by killing an entire family alongside the target — such human targets
were marked very carefully and only senior military commanders were
bombed in their homes, to maintain the principle of proportionality under
international law.

But after October 7 — when Hamas-led militants launched a deadly assault
on southern Israeli communities, killing around 1,200 people and abducting
240 — the army, the sources said, took a dramatically different approach.
Under “Operation Iron Swords,” the army decided to designate all operatives
of Hamas’ military wing as human targets, regardless of their rank or military
importance. And that changed everything.

The new policy also posed a technical problem for Israeli intelligence. In
previous wars, in order to authorize the assassination of a single human
target, an officer had to go through a complex and lengthy “incrimination”
process: cross-check evidence that the person was indeed a senior member
of Hamas’ military wing, find out where he lived, his contact information, and
finally know when he was home in real time. When the list of targets
numbered only a few dozen senior operatives, intelligence personnel could
individually handle the work involved in incriminating and locating them.

Palestinians try to rescue survivors and pull bodies from the rubble after
Israeli airstrikes hit buildings near Al-Aqsa Martyrs Hospital in Deir al-Balah,
central Gaza, October 22, 2023. (Mohammed Zaanoun/Activestills)

However, once the list was expanded to include tens of thousands of lower-
ranking operatives, the Israeli army figured it had to rely on automated
software and artificial intelligence. The result, the sources testify, was that
the role of human personnel in incriminating Palestinians as military
operatives was pushed aside, and AI did most of the work instead. According
to four of the sources who spoke to +972 and Local Call, Lavender — which
was developed to create human targets in the current war — has marked
some 37,000 Palestinians as suspected “Hamas militants,” most of them
junior, for assassination (the IDF Spokesperson denied the existence of such
a kill list in a statement to +972 and Local Call).

“We didn’t know who the junior operatives were, because Israel didn’t track
them routinely [before the war],” explained senior officer B. to +972 and
Local Call, illuminating the reason behind the development of this particular
target machine for the current war. “They wanted to allow us to attack [the
junior operatives] automatically. That’s the Holy Grail. Once you go
automatic, target generation goes crazy.”

The sources said that the approval to automatically adopt Lavender’s kill
lists, which had previously been used only as an auxiliary tool, was granted
about two weeks into the war, after intelligence personnel “manually”
checked the accuracy of a random sample of several hundred targets
selected by the AI system. When that sample found that Lavender’s results
had reached 90 percent accuracy in identifying an individual’s affiliation with
Hamas, the army authorized the sweeping use of the system. From that
moment, sources said that if Lavender decided an individual was a militant in
Hamas, they were essentially asked to treat that as an order, with no
requirement to independently check why the machine made that choice or to
examine the raw intelligence data on which it is based.
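
For a sense of what a spot check like this can and cannot establish, here is a
rough, purely illustrative sketch of estimating precision from a manually
reviewed random sample. The sample size, the number of confirmed cases, and
the normal-approximation confidence interval are assumptions made for the
example, not figures from the investigation.

    import math

    def precision_estimate(sample_size: int, confirmed: int, z: float = 1.96):
        """Share of flagged cases confirmed by reviewers, with a rough
        normal-approximation 95% confidence interval. Illustrative only."""
        p = confirmed / sample_size
        margin = z * math.sqrt(p * (1 - p) / sample_size)
        return p, max(0.0, p - margin), min(1.0, p + margin)

    # Hypothetical numbers: 300 sampled cases, 270 judged correct by reviewers.
    p, low, high = precision_estimate(300, 270)
    print(f"precision ~{p:.0%}, 95% CI [{low:.0%}, {high:.0%}]")

    # A 90 percent sample-level figure, applied across roughly 37,000 flagged
    # people, would still leave on the order of 3,700 misidentifications.

The only point of the sketch is that a sample-level accuracy figure says
nothing about any individual case, which is exactly the "statistical" framing
the sources go on to describe.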

“At 5 a.m., [the air force] would come and bomb all the houses that we had
marked,” B. said. “We took out thousands of people. We didn’t go through
them one by one — we put everything into automated systems, and as soon
as one of [the marked individuals] was at home, he immediately became a
target. We bombed him and his house.”

“It was very surprising for me that we were asked to bomb a house to kill a
ground soldier, whose importance in the fighting was so low,” said one
source about the use of AI to mark alleged low-ranking militants. “I
nicknamed those targets ‘garbage targets.’ Still, I found them more ethical
than the targets that we bombed just for ‘deterrence’ — highrises that are
evacuated and toppled just to cause destruction.”

The deadly results of this loosening of restrictions in the early stage of the
war were staggering. According to data from the Palestinian Health Ministry
in Gaza, on which the Israeli army has relied almost exclusively since the
beginning of the war, Israel killed some 15,000 Palestinians — almost half of
the death toll so far — in the first six weeks of the war, up until a week-long
ceasefire was agreed on Nov. 24.

Massive destruction is seen in the Al-Rimal district of Gaza City after it was
targeted by airstrikes carried out by Israel, October 10, 2023.
(Mohammed Zaanoun)

‘The more information and variety, the better’
The Lavender software analyzes information collected on most of the 2.3
million residents of the Gaza Strip through a system of mass surveillance,
then assesses and ranks the likelihood that each particular person is active in
the military wing of Hamas or PIJ. According to sources, the machine gives
almost every single person in Gaza a rating from 1 to 100, expressing how
likely it is that they are a militant.

Lavender learns to identify characteristics of known Hamas and PIJ
operatives, whose information was fed to the machine as training data, and
then to locate these same characteristics — also called “features” — among
the general population, the sources explained. An individual found to have
several different incriminating features will reach a high rating, and thus
automatically becomes a potential target for assassination.
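
In machine-learning terms, what the sources describe is a standard supervised
classifier: a model is fit on labeled examples, then assigns every other
profile a score based on how closely its features resemble those examples. The
toy sketch below, with made-up binary features and logistic regression chosen
purely for illustration, shows how such a 1-to-100 rating can be produced;
none of the features, data, or model choices come from the investigation.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical binary features per profile (columns), e.g. contact overlaps
    # or frequent device changes; the real system's inputs are not public.
    X_train = np.array([[1, 1, 0],
                        [1, 0, 1],
                        [0, 0, 0],
                        [0, 1, 0]])
    y_train = np.array([1, 1, 0, 0])  # 1 = labeled example, 0 = counter-example

    model = LogisticRegression().fit(X_train, y_train)

    # Score the remaining profiles: probability of the positive class,
    # rescaled to the 1-100 rating the sources describe.
    X_population = np.array([[1, 1, 1],
                             [0, 0, 1],
                             [0, 0, 0]])
    probs = model.predict_proba(X_population)[:, 1]
    ratings = np.clip((probs * 100).round(), 1, 100).astype(int)
    print(ratings)  # higher rating = more similar to the labeled examples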

In “The Human-Machine Team,” the book referenced at the beginning of this
article, the current commander of Unit 8200 advocates for such a system
without referencing Lavender by name. (The commander himself also isn’t
named, but five sources in 8200 confirmed that the commander is the
author, as reported also by Haaretz.) Describing human personnel as a
“bottleneck” that limits the army’s capacity during a military operation, the
commander laments: “We [humans] cannot process so much information. It
doesn’t matter how many people you have tasked to produce targets during
the war — you still cannot produce enough targets per day.”

The solution to this problem, he says, is artificial intelligence. The book offers
a short guide to building a “target machine,” similar in description to
Lavender, based on AI and machine-learning algorithms. Included in this
guide are several examples of the “hundreds and thousands” of features that
can increase an individual’s rating, such as being in a WhatsApp group with a
known militant, changing cell phone every few months, and changing
addresses frequently.

“The more information, and the more variety, the better,” the commander
writes. “Visual information, cellular information, social media connections,
battlefield information, phone contacts, photos.” While humans select these
features at first, the commander continues, over time the machine will come
to identify features on its own. This, he says, can enable militaries to create
“tens of thousands of targets,” while the actual decision as to whether or not
to attack them will remain a human one.

The book isn’t the only time a senior Israeli commander hinted at the
existence of human target machines like Lavender. +972 and Local Call have
obtained footage of a private lecture given by the commander of Unit 8200’s
secretive Data Science and AI center, “Col. Yoav,” at Tel Aviv University’s AI
week in 2023, which was reported on at the time in the Israeli media.

In the lecture, the commander speaks about a new, sophisticated target
machine used by the Israeli army that detects “dangerous people” based on
their likeness to existing lists of known militants on which it was trained.
“Using the system, we managed to identify Hamas missile squad
commanders,” Col. Yoav said in the lecture, referring to Israel’s May 2021
military operation in Gaza, when the machine was used for the first time.

The lecture presentation slides, also obtained by +972 and Local Call,
contain illustrations of how the machine works: it is fed data about existing
Hamas operatives, it learns to notice their features, and then it rates other
Palestinians based on how similar they are to the militants.

“We rank the results and determine the threshold [at which to attack a
target],” Col. Yoav said in the lecture, emphasizing that “eventually, people of
flesh and blood take the decisions. In the defense realm, ethically speaking,
we put a lot of emphasis on this. These tools are meant to help [intelligence
officers] break their barriers.”
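
Mechanically, the "rank the results and determine the threshold" step the
lecturer describes is just a cutoff applied to a sorted list of scores. The
numbers in the sketch below are invented; the only point it illustrates is the
one the sources make later in the article, that lowering the cutoff enlarges
the set of people marked.

    import random

    random.seed(0)
    scores = [random.randint(1, 100) for _ in range(10_000)]  # hypothetical 1-100 ratings

    def flagged_count(ratings: list[int], threshold: int) -> int:
        """Count how many ratings meet or exceed the cutoff."""
        return sum(r >= threshold for r in ratings)

    for threshold in (90, 80, 70):
        print(threshold, flagged_count(scores, threshold))

    # Under this uniform toy distribution, each ten-point drop in the cutoff
    # flags roughly another tenth of the scored population.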

In practice, however, sources who have used Lavender in recent months say that
human agency and precision gave way to mass target creation and lethality.

‘There was no “zero-error” policy’
B., a senior officer who used Lavender, echoed to +972 and Local Call that in
the current war, officers were not required to independently review the AI
system’s assessments, in order to save time and enable the mass production
of human targets without hindrances.

“Everything was statistical, everything was neat — it was very dry,” B. said.
He noted that this lack of supervision was permitted despite internal checks
showing that Lavender’s calculations were considered accurate only 90
percent of the time; in other words, it was known in advance that 10 percent
of the human targets slated for assassination were not members of the
Hamas military wing at all.

For example, sources explained that the Lavender machine sometimes
mistakenly flagged individuals who had communication patterns similar to
known Hamas or PIJ operatives — including police and civil defense workers,
militants’ relatives, residents who happened to have a name and nickname
identical to that of an operative, and Gazans who used a device that once
belonged to a Hamas operative.

“How close does a person have to be to Hamas to be [considered by an AI
machine to be] affiliated with the organization?” said one source critical of
Lavender’s inaccuracy. “It’s a vague boundary. Is a person who doesn’t
receive a salary from Hamas, but helps them with all sorts of things, a Hamas
operative? Is someone who was in Hamas in the past, but is no longer there
today, a Hamas operative? Each of these features — characteristics that a
machine would flag as suspicious — is inaccurate.”

Similar problems exist with the ability of target machines to assess the phone
used by an individual marked for assassination. “In war, Palestinians change
phones all the time,” said the source. “People lose contact with their families,
give their phone to a friend or a wife, maybe lose it. There is no way to rely
100 percent on the automatic mechanism that determines which [phone]
number belongs to whom.”

According to the sources, the army knew that the minimal human supervision
in place would not discover these faults. “There was no ‘zero-error’ policy.
Mistakes were treated statistically,” said a source who used Lavender.
“Because of the scope and magnitude, the protocol was that even if you
don’t know for sure that the machine is right, you know that statistically it’s
fine. So you go for it.”

“It has proven itself,” said B., the senior source. “There’s something about the
statistical approach that sets you to a certain norm and standard. There has
been an illogical amount of [bombings] in this operation. This is unparalleled,
in my memory. And I have much more trust in a statistical mechanism than a
soldier who lost a friend two days ago. Everyone there, including me, lost
people on October 7. The machine did it coldly. And that made it easier.”

Another intelligence source, who defended the reliance on the Lavender-
generated kill lists of Palestinian suspects, argued that it was worth investing
an intelligence officer’s time only to verify the information if the target was a
senior commander in Hamas. “But when it comes to a junior militant, you
don’t want to invest manpower and time in it,” he said. “In war, there is no
time to incriminate every target. So you’re willing to take the margin of error
of using artificial intelligence, risking collateral damage and civilians dying,
and risking attacking by mistake, and to live with it.”

B. said that the reason for this automation was a constant push to generate
more targets for assassination. “In a day without targets [whose feature
rating was sufficient to authorize a strike], we attacked at a lower threshold.
We were constantly being pressured: ‘Bring us more targets.’ They really
shouted at us. We finished [killing] our targets very quickly.”

He explained that lowering Lavender’s rating threshold caused it to mark more
people as targets for strikes. “At its peak, the system managed to
generate 37,000 people as potential human targets,” said B. “But the
numbers changed all the time, because it depends on where you set the bar
of what a Hamas operative is. There were times when a Hamas operative was
defined more broadly, and then the machine started bringing us all kinds of
civil defense personnel, police officers, on whom it would be a shame to
waste bombs. They help the Hamas government, but they don’t really
endanger soldiers.” CONTINUES...


Responses:
[53608] [53612] [53607]


53608


Date: April 03, 2024 at 16:19:49
From: shatterbrain, [DNS_Address]
Subject: Re: ‘Lavender’: The AI machine directing Israel’s bombing spree in...

URL: Why the Future Doesn't Need Us


The prophetic words of Bill Joy are coming to pass.


Responses:
[53612]


53612


Date: April 03, 2024 at 17:49:26
From: akira, [DNS_Address]
Subject: Re: ‘Lavender’: The AI machine directing Israel’s bombing spree in...

URL: https://dn720002.ca.archive.org/0/items/why-the-future-doesnt-need-us-wired/Why%20the%20Future%20Doesn%27t%20Need%20Us%20_%20WIRED.pdf


interesting read, but I had to find another source since Wired wouldn't allow
access to the article for more than 30 seconds. What do they think, I'm a
robot? Here's a PDF version.


Responses:
None


53607


Date: April 03, 2024 at 10:33:00
From: akira, [DNS_Address]
Subject: AI marks 37,000 Gazans for assassination, with little human oversight (NT)


(NT)


Responses:
None


