In Ukraine, identifying the dead comes at a human rights cost


Five days after Russia launched its full-scale invasion of Ukraine, a year ago this week, the US facial recognition company Clearview AI offered the Ukrainian government free access to its technology, suggesting it could be used to reunite families, identify Russian operatives, and combat disinformation. Shortly thereafter, the Ukrainian government announced that it was using the technology to scan the faces of dead Russian soldiers in order to identify their bodies and notify their families. By December 2022, Mykhailo Fedorov, Ukraine's vice prime minister and minister of digital transformation, was tweeting a photo of himself with Clearview AI CEO Hoan Ton-That, thanking the company for its support.

Accounting for the dead and informing families of the fate of their relatives is a human rights imperative enshrined in international treaties, protocols, and laws such as the Geneva Conventions and the International Committee of the Red Cross (ICRC) guidelines for the dignified treatment of the dead. It is also tied to much deeper commitments. Caring for the dead is among the most ancient human practices, one that makes us human as much as language and the capacity for self-reflection do. The historian Thomas Laqueur, in his epic meditation The Work of the Dead, writes that for as long as people have discussed the subject, "care of the dead has been regarded as foundational" — of religion, the state, the clan, the tribe, the capacity to mourn, an understanding of the finitude of life, of civilization itself. But identifying the dead with facial recognition technology borrows the moral weight of this kind of care to legitimize a technology that raises serious human rights concerns.

In Ukraine, the site of the bloodiest war in Europe since World War II, facial recognition may seem like just another tool brought to the grim task of identifying the fallen, alongside the digitization of mortuary records, mobile DNA laboratories, and the exhumation of mass graves.

But does it work? Ton-That says his company's technology "works effectively regardless of facial damage that may have occurred to a deceased person." There is little research to support this claim, though the authors of one small study found the results "promising" even for faces in states of decomposition. However, forensic anthropologist Luis Fondebrider, a former head of forensic services at the ICRC who has worked in conflict zones around the world, casts doubt on these claims. "This technology lacks scientific credibility," he says. "It is absolutely not accepted in the forensic community." (DNA identification remains the gold standard.) The forensics field "understands technology and the importance of new developments," but the push to use facial recognition is "a combination of politics and business with very little science," according to Fondebrider. "There are no magic solutions for identification," he says.

Using untested technology to identify fallen soldiers could lead to mistakes and traumatize families. But even if the forensic use of facial recognition technology were backed by scientific evidence, it should not be used to identify the dead. It is too dangerous for the living.

Organizations including Amnesty International, the Electronic Frontier Foundation, the Surveillance Technology Oversight Project, and the Immigrant Defense Project have declared facial recognition technology a form of mass surveillance that threatens privacy, reinforces racist policing, endangers the right to protest, and can lead to wrongful arrest. Damini Satija, head of Amnesty International's Algorithmic Accountability Lab and deputy director of Amnesty Tech, says facial recognition technology undermines human rights by "reproducing structural discrimination at scale and automating and entrenching existing societal inequities." In Russia, facial recognition technology is being used to suppress political dissent. It falls short of legal and ethical standards when used by law enforcement in the UK and US, and is weaponized against marginalized communities around the world.

Clearview AI, which primarily sells its wares to police, has one of the largest known databases of facial photos, at 20 billion images, and plans to collect a further 100 billion images, the equivalent of 14 photos for every person on the planet. The company has promised investors that soon "almost everyone in the world will be identifiable." Regulators in Italy, Australia, the UK, and France have declared Clearview's database illegal and ordered the company to delete photos of their citizens. In the EU, Reclaim Your Face, a coalition of more than 40 civil society organizations, has called for a total ban on facial recognition technology.

AI ethics researcher Stephanie Hare says Ukraine is "using a tool, and promoting a company and a CEO, who have behaved not just unethically but illegally." She allows that this may be a case of "the end justifies the means," but asks, "Why is it so important that Ukraine be able to identify dead Russian soldiers using Clearview AI? How essential is this to defending Ukraine or winning the war?"
