Face recognition software is known to get things wrong; combined with police reliance on its results, this has, not for the first time, led to the arrest of an innocent person. This time it hit an eight-months-pregnant woman in Detroit. As in this case, it is mostly people with dark skin who are affected by the software's bias.
Porcha Woodruff described her case to the New York Times. She was accused of taking part in a robbery in which a person was held at gunpoint and their car was stolen. Face recognition software identified Woodruff in a video showing parts of the incident, and the victim was then shown a photo of Woodruff for identification, albeit a picture that was eight years old. The victim did not say, however, that the woman involved in the robbery was pregnant; after all, she wasn't. The police apparently overlooked this detail.
According to the complaint, six police officers showed up at the heavily pregnant woman's home while she was getting her children ready for school. She is said to have initially taken the arrest warrant for a bad joke, not least because of her obvious pregnancy. The police took her away nonetheless.
Face recognition keeps going wrong
The current case is not the first of its kind in the United States, nor even in Detroit, but Woodruff is the first woman known to be affected. So far, misidentifications by facial recognition software have mainly hit men with dark skin. In Georgia, for example, a man spent a week in jail after being mistakenly identified as the thief of very expensive handbags. He, too, has gone to court; according to the filing, he was pulled over in his car on the way to a Thanksgiving dinner and informed of the warrant. He is said to be around 20 kilograms lighter than the thief captured by the surveillance cameras. Even more striking: he also lacks a prominent birthmark on his face.
Other cases are known from New York, for example, where a man was held for 30 hours before being released on bail. At least two further cases have come to light in Detroit.
Automated facial recognition is also playing an increasingly important role for German security authorities, even though Germany's governing "traffic light" coalition nominally rejects "the use of biometric recording for surveillance purposes" and a dispute over the technology is currently under way at EU level. At the beginning of 2023, German police had stored almost 6.7 million photographs of just over 4.6 million people in the joint Inpol-Zentral database.
The use of facial recognition software in the private sector is also problematic. At New York's Madison Square Garden, admission is controlled by cameras and face recognition, ostensibly to increase the safety of spectators and employees; whether this is legal will be decided in court. The software has, however, also been used to exclude lawyers involved in court cases against the venue's operator: they were identified at the entrance and turned away.
(emw)