
Lavender & Where's Daddy: How Israel Used AI to Form Kill Lists & Bomb Palestinians in Their Homes

Apr 11, 2024
AMY GOODMAN: This is Democracy Now!, democracynow.org, The War and Peace Report. I'm Amy Goodman.

The Israeli publications +972 Magazine and Local Call have exposed how the Israeli military used an artificial intelligence program known as Lavender to develop a kill list in Gaza that included as many as 37,000 Palestinians targeted for assassination with little human supervision. The report is based in part on interviews with six Israeli intelligence officers who had firsthand involvement with the AI system. +972 reports that Lavender has played a central role in the unprecedented bombing of Palestinians, especially in the early stages of the war. In fact, according to sources, its influence on military operations was such that they essentially treated the outputs of the AI machine, quote, "as if it were a human decision."
A second AI system, known as "Where's Daddy?", tracked the Palestinian men on the kill list. It was designed specifically to help Israel attack people when they were at home at night with their families. One intelligence officer told the publications, quote, "We were not interested in killing operatives only when they were in a military building or engaged in a military activity. On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations."

Today we spend the hour with the Israeli investigative journalist Yuval Abraham, who broke the story. His report for +972 Magazine and Local Call is headlined "'Lavender': The AI machine directing Israel's bombing spree in Gaza." I spoke with Yuval Abraham yesterday and began by asking him to lay out what he found.

YUVAL ABRAHAM: Yeah, thanks for having me back, Amy. It's a very long piece, 8,000 words, and we break it down into six different steps, each step representing a process in the highly automated way in which the military marked targets since October. And the first finding is Lavender. So, Lavender was designed by the military. Its purpose, when it was being designed, was to mark the low-ranking operatives in the military wings of Hamas and Islamic Jihad. That was the intention, because, you know, Israel estimates that there are between 30,000 and 40,000 Hamas operatives. It's a very large number, and they understood that the only way to mark these people was by relying on artificial intelligence. And that was the intention.

Now, what sources told me is that after October 7th, the military basically made a decision that all of these tens of thousands of people are now people who could potentially be bombed inside their homes, meaning not only killing them but everybody who is in the building, the children, the families. And they understood that in order to try to do that, they would have to rely on this AI machine called Lavender with minimal human supervision. I mean, one source said that he felt he was acting as a rubber stamp on the machine's decisions.

Now, what Lavender does is it scans information on probably 90% of the population of Gaza, so we're talking about, you know, more than a million people, and it gives each individual a score between 1 and 100, a score that is an expression of how likely the machine thinks, based on a list of small features, and we can get to that later, that that individual is a member of the military wings of Hamas or Islamic Jihad. Sources told me that the military knew, because they ran a check, they took a random sample and verified one by one, that roughly 10% of the people the machine was marking to be killed were not Hamas militants. They were not. Some of them had a loose connection to Hamas; others had no connection to Hamas at all.
I mean, one source said that the machine would bring in people who had the exact same name and nickname as a Hamas operative, or people who had similar communication profiles, like civil defense workers or police officers in Gaza. And, again, they implemented minimal supervision over the machine. One source said that he spent 20 seconds per target before authorizing the bombing of the alleged low-ranking Hamas militant, who often would have also been a civilian, killing those people inside their homes. And I think this reliance on artificial intelligence to mark those targets, and basically the deadly way in which the officers spoke about how they were using the machine,
it could well be part of the reason why, in the first six weeks after October 7th, one of the main features of the policies in place was that entire Palestinian families were being wiped out inside their houses. I mean, if you look at UN statistics, more than 50% of the casualties, more than 6,000 people at that point, came from a smaller group of families. It's an expression of, you know, the family unit being destroyed. And I think that machine and the way it was used led to that.

AMY GOODMAN: You talk about targeting, and you talk about the so-called high-value targets, the Hamas commanders, and then the lower-level fighters. And, as you said, many of them in the end were neither. But explain the buildings that were attacked and the bombs that were used to attack them.

YUVAL ABRAHAM: Yes, yes, it's a good question. So, what the sources told me is that during those first weeks after October 7th, for the low-ranking militants in Hamas, many of whom were marked by Lavender, so we can call them the suspected militants marked by the machine, there was a predetermined, permissible degree of collateral damage. And this means that the military's Department of International Law told these intelligence officers that for each low-ranking target that Lavender marks, when bombing that target, they were allowed to kill civilians; one source said the number was up to 20 civilians, again, for any Hamas operative, regardless of rank, regardless of importance, regardless of age.
One source said there were also minors being marked, not many of them, but he said that was a possibility, that there was no age limit. Another source said the limit was up to 15 civilians for the low-ranking militants. The sources said that for senior Hamas commanders, you know, commanders of brigades or divisions or battalions, the numbers were, for the first time in the IDF's history, in the three digits, according to sources. For example, Ayman Nofal, who was the Hamas commander of the Central Brigade: a source who took part in the strike against him said that the military authorized the killing of 300 Palestinian civilians alongside that one person. And we have spoken at +972 and Local Call with Palestinians who witnessed that strike, and they talk about, you know, four quite large residential buildings being bombed that day, entire apartments filled with families being bombed and killed. And that source told me that this is not a mistake; the number of civilians, those 300 civilians, was known beforehand to the Israeli military. And the sources described that to me.
One source said that during those weeks, at the beginning, effectively, the principle of proportionality, as they call it under international law, quote, "did not exist."

AMY GOODMAN: So there are two programs: there's Lavender, and there's Where's Daddy?. How did they even know where these men were, innocent or not?

YUVAL ABRAHAM: Yes. The way the system was designed is that there is this concept, in general, in mass surveillance systems, called linking. When you want to automate these systems, you want to be able to do it very quickly. You know, you get, for example, an ID of a person, and you want the computer to be able to very quickly link that ID to other things. And what sources told me is that, since everybody in Gaza has a home, has a house, or at least that was the case in the past,
the system was designed to be able to automatically link individuals to households. And in the majority of cases, these households that are linked to the people whom Lavender marks as low-ranking militants are not places where active military action is taking place, according to the sources. But the way the system was designed, and programs like Where's Daddy?, which were designed to search for these low-ranking militants when they enter homes, specifically, it sends an alert to the intelligence officers when these AI-marked suspects enter their houses. The system was designed in a way that allowed the Israeli military to carry out massive strikes against Palestinians, sometimes militants, sometimes alleged militants, we don't know, when they were in these spaces, in these houses.

And the sources said, you know, CNN reported in December that, according to U.S. intelligence assessments, 45% of the munitions that Israel dropped on Gaza were unguided, so-called dumb bombs, which, you know, cause major damage to civilians and destroy the entire structure. And the sources said that for these low-ranking operatives in Hamas, they were only using the dumb munitions, meaning they were collapsing the houses on everybody inside. And when you ask intelligence officers why, one explanation they give is that these people were, quote, "unimportant," that they were not important enough, from a military perspective, for the Israeli military to, as they said, waste expensive munitions on them, meaning more precise bombs that could perhaps have taken out just one particular floor of the building.

And to me that was very striking, because, you know, you're dropping a bomb on a house and massacring entire families, yet the target you intend to assassinate by doing so is not considered important enough to waste an expensive bomb on. And I think it's a very rare reflection of the way the Israeli military measures the value of Palestinian lives against the expected military gain, which is the principle of proportionality. And I think one thing that was very, very clear from all of the sources that I spoke with is, you know, they said this was psychologically shocking even for them.

So that's the combination between Lavender and Where's Daddy?. The Lavender lists are fed into Where's Daddy?, and these systems track the suspects and wait for the moment when they enter houses, usually family houses or homes where no military action takes place, according to several sources who did this and who spoke with me about it. And these houses are bombed with unguided munitions.
This was a main feature of the Israeli policy in Gaza, at least during those first weeks.

AMY GOODMAN: You're saying they didn't have as many smart bombs, they were more expensive, so they didn't want to waste them, so they used dumb bombs, which kill many more people?

YUVAL ABRAHAM: Yeah, exactly. That's what they said. But then, I say, if a person is, you know, not important enough for you to waste ammunition on, yet you are willing to kill 15 civilians, a family...

AMY GOODMAN: Yuval Abraham, I want to read from the Israeli military's statement, the IDF statement, in response to your report. They say, quote, "The process of identifying military targets in the IDF consists of various types of tools and methods, including information management tools, which are used in order to help the intelligence analysts to gather and optimally analyze the intelligence, obtained from a variety of sources. Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process." Again, that's the IDF's response to your report. Yuval Abraham, your response?
YUVAL ABRAHAM: I read this response to some of the sources, and they said that it's a lie, that it's not true. And I was surprised, because, you know, usually they are not so shameless as to say something that is blatantly false. I think this can be refuted very easily, because, you know, a high-ranking Israeli military officer, the head of the AI center at Unit 8200, gave a public lecture last year, in 2023, at Tel Aviv University. Anybody listening to us can Google it. He spoke there about, and I'm quoting him from that lecture, an AI system that the Israeli military used in 2021 to, quote, "find terrorists." That's what he said. So, to have that on record, to have the presentation slides that show how the system rates people, and then to get the response from the IDF spokesperson saying, "We don't have a system that uses AI," I really don't know. I almost debated whether to put it in the piece or not. In the end, you know, I gave them the space in the piece to make those statements, and I tried to be as dry as possible in the way that I was reporting. But I really have a lot of confidence in these findings. They're verified by numerous sources that I have spoken with, and I believe that people who read the full investigation will see the depth of it.
The commander of Unit 8200 wrote a book in 2021, titled "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World." And in the book, he talks about how the military should rely on artificial intelligence to, quote, "solve the problem of the human bottleneck" in creating new targets and in the decision-making to approve new targets. And he wrote in that book, and this is another quote from him, that no matter how many intelligence officers are tasked with producing targets during the war, they still will not be able to produce enough targets per day. And in that book, he gives a guide for how to build these AI systems.
Now, I want to emphasize, you know, he writes in the book very, very clearly that these systems are not supposed to replace human judgment. He calls it, you know, mutual learning between humans and artificial intelligence. And he says, and the IDF still maintains this, that it is intelligence officers who look at the results and make the decisions. But based on what I heard from numerous sources, after October 7th this ceased to be the case, at least in some parts of the IDF, where, again, Amy, as I said before, sources told me that if they verified that the target was a male, they could accept Lavender's recommendations without looking closely, without checking why the machine made the decision that it made. And I think, when you talk to the sources, you know, many of these sources were drafted back into the military after October 7th.
Many of them were shocked by the atrocities that happened on October 7th, to their families, their friends. Some of them didn't think they would be drafted back into the military, and they said, "Okay, we have to go." Then they realized what they were being asked to do and the things they were involved in. I wouldn't say all six of them are like that, but at least some of them were shocked, again, by having been involved in things, in killing families, and they felt it was unjustifiable, and they felt a responsibility, I think. And I felt this also with the previous piece that I wrote, "A Mass Assassination Factory," which spoke about another AI machine, called the Gospel. They felt a need to share this information with the world, out of a feeling that people are not getting it, you know, that they're listening to the military spokesperson and all of these narratives that we've been hearing for the past six months, and they do not reflect the reality on the ground. And I really think, you know, there is now an attack on Rafah coming. These systems could be used there, again, to kill Palestinians in massive numbers. And these attacks, you know, they are also endangering the Israeli hostages, who are still being held, unjustifiably, in Gaza, and they need to be freed. There has to be a ceasefire. It cannot continue. And I hope that this investigation, which lays things out so clearly, will help more people around the world call for a ceasefire, to free the hostages, to end the occupation, and to move toward a political solution. For me, there is no other way forward.
AMY GOODMAN: I wanted to ask if the U.S. military, if U.S. technology, is playing a role in Israeli AI.

YUVAL ABRAHAM: So, I don't know, and there is some information that I cannot fully share. Right now I'm investigating, you know, who is involved in the development of these systems. What I can tell you is that, you know, based on past experience, the 2014 war and the 2021 war, when the wars end, these systems are sold to militaries around the world. And I think that, regardless of, you know, the horrible results and consequences of these systems in Gaza, alongside that, I really believe that there is a danger to humanity in this AI-based warfare.
It allows people to escape accountability. It enables you to target on a massive scale, you know, 37,000 people marked for potential assassination. And it allows you to maintain a kind of aesthetic of international law, because you have a machine that generates a target file for you, with a "commander" label or a collateral damage figure, but it loses all meaning. I mean, take the principle of distinction under international law: when you design a system that marks 37,000 people, and you check it and you know that 10% of them are actually not militants, right, they're either loosely related to Hamas or not related at all, and you nevertheless authorize the use of that system without any meaningful supervision for weeks, isn't that a violation of that principle? And when you authorize the killing of up to 15 or even 20 civilians for targets that, from a military point of view, you consider not particularly important, isn't that a clear violation of the principle of proportionality? You know? And I don't know, I think international law is really in a crisis right now, and I think these AI systems are making that crisis even worse. They are draining all of these terms of meaning.

AMY GOODMAN: Let me play for you a clip of National Security Council spokesperson John Kirby being questioned Tuesday about Israel's killing of the seven aid workers, in three cars, from chef José Andrés's World Central Kitchen. This is Kirby.

REPORTER: Firing a missile at people delivering food, and killing them, that is not a violation of international humanitarian law?
JOHN KIRBY: Well, the Israelis have already admitted that this was a mistake that they made. They're doing an investigation. They'll get to the bottom of it. Let's not get ahead of that. The State Department has a process, and to date, as you and I are speaking, we have not found any incidents where the Israelis have violated international humanitarian law.

AMY GOODMAN: So that is the chief U.S. spokesperson, John Kirby, saying Israel has not violated international law at any point since October 7th. And again, this was in response to a question on the killing of the seven aid workers, one Palestinian and six international aid workers. Can you talk about your response to this attack?
Three different missiles hit the three cars. And then, what Kirby said.

YUVAL ABRAHAM: Yeah. Wow. That's pretty shocking, to my mind, I mean, what he said, you know, given the evidence that exists. The first thought that came to my mind, when he was talking about, you know, "Israel is investigating it": you know, I know the statistics. So, if you take the 2014 bombing and war in Gaza, you know, 512 Palestinian children were killed. Israel said it would investigate. There were hundreds of complaints of war crimes. The only case in which the Israeli military actually prosecuted soldiers was over looting, of something like 1,000 shekels. Everything else was closed. This happened again, you know, in 2018 and 2019: 230 Palestinians are killed by shooting at the border, and, again, out of dozens of complaints, one case is prosecuted. To claim that because Israel is conducting an investigation, it somehow means that they are getting to the bottom of this and will change anything, is just mocking our intelligence.
I think the second thing I would say is that it's true that the Israeli state has apologized for it. But if you really look at the history of people being killed around aid trucks, this has happened again and again to Palestinians. I mean, in late February, 112 Palestinians were killed around a flour truck. The Guardian reported at the time that there had been some 14 such cases in January and February. So it's clear to me that the Israeli military is apologizing not for the crime but for the identity of the people who were killed in the crime, and I think that is real hypocrisy. And to answer the question about my findings: I mean, I don't know if artificial intelligence was involved in that particular strike.
I don't want to say something I'm not 100% sure about. But what I've learned from Israeli intelligence officers makes it not surprising to me that this attack took place, because the open-fire policies are so completely permissive. I mean, we are seeing unarmed civilians being bombed to death. We saw that video of four people walking and being bombed to death. We have doctors, you know, talking about how, in hospitals, they are seeing little kids with bullet holes, like the Guardian investigation that spoke with nine doctors who talked about this. So there is this extreme permissiveness, and it's not surprising to me.

AMY GOODMAN: Your article doesn't talk about drones, but can you talk about how the AI systems interact with unmanned attack drones?
YUVAL ABRAHAM: Yeah. You know, as I said last time, I can't talk about everything, also because we always have to think about the military censor in Israel; Israeli journalists are very much, you know, bound by that. But in Israel, the systems interact. And, you know, if somebody is marked to be killed by Lavender, then that person could be killed by a warplane, could be killed by a drone, could be killed by a tank on the ground. There is this kind of policy of sharing intelligence between, yeah, different units and different weapons operators. Now, Israel has said, you know, that there was a target, somebody they suspected, with the aid workers, and that has been completely rejected. But I wouldn't be surprised if what the Israeli system picked up was somehow related to a faulty automated mechanism, I mean, you know, a machine monitoring the area that detected something, without the high accuracy rate that you would need. Again, from what I heard from sources, this is the atmosphere; this is the case.
