Júlia Nueno, Gaza genocide researcher: ‘Wars escalate because AI predicts culprits even before the crime is committed’
The Barcelona-based engineer participated in the Forensic Architecture report for the case against Israel in The Hague


Júlia Nueno has one goal: to define the legal and methodological tools to bring military Artificial Intelligence (AI) to trial. At 31, this Barcelona-born computer engineer is finishing her doctoral fellowship at Goldsmiths, University of London, with the agency Forensic Architecture, where she investigates the use of AI by police and military forces through evidence found in physical space. “I worked for two years for Airwars, the organization that analyzes deaths in armed conflict, and I started my PhD on October 1, 2023, a week before October 7, when everything changed and the agency focused on documenting what happened in Gaza,” she explains on a Friday during a brief stop in Barcelona before returning to London, where she lives. The documents from that investigation have been presented in the South Africa v. Israel case, brought before the International Court of Justice (ICJ) of the United Nations, which seeks to demonstrate that Israel has undertaken actions to destroy the Palestinian population that go beyond self-defense. On June 18, an exhibition featuring an installation by Forensic Architecture about this research will open at the Three Chimneys in Barcelona as part of UIA, the international congress of architects.
Nueno knew she would end up at Forensic Architecture after seeing the 2017 MACBA exhibition dedicated to this multidisciplinary organization that applies forensic rigor to human rights violations, extrajudicial killings, and state crimes. She has also edited and coordinated Genocides (Galaxia Gutenberg, 2024), an essay that redefines the term and warns of how military AI is changing the value we assign to human life. “Our proposal regarding the idea of genocide goes beyond physical extermination. Genocide also includes destroying the medical system, agriculture, or attacks on humanitarian aid. It is the annihilation of the conditions that sustain existence.”
Question. You say that current censorship is characterized by excess, not absence.
Answer. We are disoriented and cynical. For two and a half years, we have witnessed a genocide unfolding before our eyes, which we have come to accept as normal due to the flood of images, images we can no longer tell are real or not.
Q. How does Forensic Architecture respond to that scenario?
A. In this confusing era, we could either discard the framework of truth altogether or reaffirm that truth is understandable, but we must initiate a process to reconstruct it. Our task is to pay attention to the videos, images, and messages circulating online, mapping them to generate a complete picture of the incident. A video only provides a partial perspective, but by combining it with others in a digital model, we can understand a sequence of events. We used to say that a picture was worth a thousand words; now a single picture on its own is no longer worth that much. An image is verified by relating it to other images.
Q. The Gulf countries have arrested hundreds of people for posts about Iran’s attacks. That makes work like yours difficult, for example.
A. These arrests are a clear example of how those in power try to control the narrative. When a missile falls on a refinery or a neighborhood, the state cannot deny what everyone is seeing; what it tries to do is hide the meaning of that attack and prevent it from damaging its reputation. In this way, those in power organize what can be seen and what must remain hidden, setting the limits of what is credible. In this case, the prohibition is used to undermine the testimony of people who, often at great risk, record the violence they experience firsthand.
Q. This is also a war of images; the Saudi hashtag ‘to photograph is to serve the enemy’ has gone viral.
A. The government doesn’t see the camera as a tool for testimony, but as a weapon. By invoking national security reasons to arrest anyone holding a camera, the focus shifts from what happened to the legitimacy of its dissemination. The debate ceases to be about the war and becomes about the image. The impact of the missile no longer matters; what matters now is who recorded it and what their motivation was. In this way, the authorities turn the situation on its head: the state ceases to be responsible for security and presents itself as a victim of disinformation, claiming legal protection against the evidence that demonstrates its own vulnerability.
Q. That’s why your work matters.
A. Our goal, beyond obtaining evidence, is to enable people to understand, compare, and share it publicly. For something to be true, we need society to be able to verify it and react collectively. In this scenario, persecuting citizens who watch and record has clear objectives: to silence the real impact of the war and sow fear so that no one else dares to watch, but also to prevent that collective verification process from taking place.

Q. What does the Forensic Architecture investigation presented by South Africa in the trial against Israel reveal?
A. We gathered all the available videos and information from residents and people in Gaza to reconstruct the bombing campaign. We also had satellite maps of the destruction in Gaza, published monthly by the UN, showing the structures destroyed in the Strip, but we realized that this top-down analysis didn’t allow us to understand exactly how the bombing campaign was being carried out. We worked from the bottom up, and this cross-referencing of information revealed that residential areas were bombed more at night — when there were more civilians at home resting — and commercial areas during the day — when there were more civilian crowds in the markets. This proves that Israel always bombs where there is a greater civilian presence, maximizing the possible damage.
Q. And it justifies those methods by relying on AI.
A. [In Gaza], AI helps to criminalize citizens because they will always be guilty to some degree. One of the technologies [Israel] uses, the Lavender system, assigns the population a score of between one and 100 indicating how likely they are to be part of the armed resistance. If there is no score of zero, there is no possibility of innocence.
Q. You point out how the Israeli military developed surveillance technology in the West Bank that seems straight out of a dark version of the movie Minority Report.
A. In 2015, there was a wave of lone attacks in the West Bank, reflecting years of dismantling social organization and political opposition. The Israeli army began using social media to monitor the population and identify potential attackers. An engineer from Shin Bet [also known as Shabak, Israel’s internal security and counterintelligence agency] revealed that they had developed a tool that allows them to locate a teenager’s home a week before he even knows he is a terrorist. The army no longer acts in reaction to a violent act, but rather under the guise of prevention.
Q. You say that the military sphere has adopted this stance, of the so-called “society of objectives.”
A. Social media is a targeting system. When we use it, we receive specific advertising because there are usage patterns that generate a profile, making us a target for certain advertising. ICE, the U.S. agency for Immigration and Customs Enforcement, also analyzes these patterns to see who might be an undocumented immigrant. There are similarities in how Big Tech, the police, and the military use probability calculation and profiling systems. Power no longer waits for you to commit a subversive act; it relies on the system predicting that you might.
Q. That’s the AI that justifies today’s wars.
A. It’s a recursive, self-justifying system. It tells you: I have data that justifies killing this individual because he was an identified enemy. This constant creation of targets is a way of justifying future wars. Israel and the United States are relentlessly generating new targets, and AI is their discursive tool for justifying future wars. Wars escalate because they predict culprits even before the crime is committed.
Q. Where will justice be served in the face of these acts?
A. At Forensic Architecture, we have provided documentation to the South African legal team, but we are not naive. We know that the international order is in question. That is why we are multiplying the forums where we present our evidence: arms embargo cases, debate spaces, databases on the criminalization of pro-Palestinian protesters in England or Germany. We believe that there is no single space that defines justice. Justice must be defined by the Palestinian people, and it will be through equality, return, and restitution. When I am asked what will happen with this case, I answer that history has always been written by the victors. Even if this trial does not lead to where we hope, history must also be written by the vanquished.








































