Police use of facial recognition is again under scrutiny in the United States after a complaint by a Black woman who was arrested in February, and released after ten hours, for a crime she did not commit.

Michigan resident Porcha Woodruff filed a wrongful arrest complaint last week against the city of Detroit and the investigating police officer in a violent car theft case.

Ms. Woodruff, identified as a suspect after a search with facial recognition software, spent “approximately eleven hours standing or sitting on a concrete bench” at the police station, while eight months pregnant, the complaint alleges. Fifteen days later, a court dismissed the charges against her for lack of evidence. “Given the well-known flaws in facial recognition technologies, which tend to err, the Detroit police violated plaintiff’s rights by failing to protect her from foreseeable errors and their consequences,” her attorneys state in the complaint.

A widely criticized technology

This artificial intelligence technology has been criticized for years by human rights defenders. In particular, they denounce the fact that algorithms trained on predominantly white populations make more errors on Black people. In Ms. Woodruff’s case, the police used footage recorded by a CCTV camera at a gas station.

“Armed robbery? Are you kidding? Can’t you see I’m eight months pregnant?” she told the police officers who showed up at her home on February 16 with an arrest warrant. Despite the protests of her two children, her fiancé and her mother, reached by telephone, she “was led away, searched and handcuffed in front of her family and neighbors,” the complaint states. After a day at the police station, she went to the hospital, where doctors found a “low heart rate due to dehydration.” There, “she learned that she was having contractions due to the stress inflicted.”

The complaint criticizes the police for not having “put in place appropriate rules for the use of this technology” and for not having “adequately trained its employees,” which “reveals a willful indifference toward the potential harm suffered by erroneously identified persons.”

In the United States, under pressure from advocacy groups, major companies such as Amazon, Microsoft, IBM and Google have stopped, at least temporarily, selling their facial recognition software to the police.