Arrested woman says AI got it wrong, sues City of Detroit

Collaborator: Rachael Schuit
Published: 08/13/2023, 3:14 PM
Edited: 08/14/2023, 4:00 AM

Photo Courtesy: U.S. District Court Eastern District of Michigan

(DETROIT, Mich.) As artificial intelligence continues to gain traction within society, facial recognition technology has become a tool some police departments use to identify suspects in criminal cases. 

The City of Detroit is facing a lawsuit from a woman who says she was wrongfully arrested and imprisoned over the improper use of the technology. 

The incident happened back in February when Porsha Woodruff, a Detroit woman, was eight months pregnant. 

According to the lawsuit filed by Woodruff in the U.S. District Court for the Eastern District of Michigan, she was arrested at her home on the morning of February 16 as the suspect in a carjacking case. 

A man who told authorities he was carjacked identified Woodruff from a photo lineup of six women. 

Woodruff says her arrest happened in front of her two children and neighbors.

According to the lawsuit, the photo of Woodruff used in the lineup was from an arrest in 2015, and not her license photo from 2021. 

LaShauntia Oliver, the Detroit Police detective who requested the warrant for Woodruff’s arrest, is also named in the lawsuit, which alleges she did not check whether Woodruff was pregnant before arresting and detaining her. 

The lawsuit alleges that the Detroit Police Department has not trained its officers to use facial recognition properly. 

It also states, “Given the publicly known flaws of facial recognition technology are prone to misidentifying individuals DEFENDENT’s DETROIT Police Department violated her fourth amendment rights by failing to guard against foreseeable errors and their consequences.”

Detroit Police Chief James E. White addressed the incident in a press conference on Wednesday. 

“There have been many reports that the individual arrested was because of misidentification in facial recognition and that is factually incorrect. That is not the case. However, what is true is that the arrest emanated from unfortunately a poor investigation,” said White. 

White said the victim should not have been shown a photo from facial recognition technology in the lineup.

“I have no reason to conclude at this time that there has been any violations of the DPD facial recognition policy, however I have determined that there’s been a number of policy violations by the lead investigator in this case,” White said.

To prevent future incidents like this one, White announced the implementation of three immediate reforms for the department. 

“No member will be allowed to use facial recognition-derived images in a photographic lineup, period,” said White in the announcement. “You cannot put a facial recognition photo in a photo lineup because it’s going to generate, at the least, a look-alike.”

Additionally, when showing lineups of suspects to victims, DPD investigators will now have to place each of the six photos in its own envelope and show them to the victim one at a time. 

The victim will be asked to answer yes or no as to whether the photo they are shown is the suspect.

The investigator who knows the suspect’s identity will not be allowed to show the victim the photos. 

“An uninvolved detective who does not know who the suspect is will take those six photos and they will do the sequential draw,” said White. 

White added, “Prior to conducting a photographic lineup, a supervisor is to ensure that there is independent basis for believing that the suspect who is pictured in the photo lineup or in the sequential draw has the means, ability, and opportunity to commit the crime.”

This is not the first time the Detroit Police have come under fire for using facial recognition technology. 

In 2021, the American Civil Liberties Union (ACLU) was part of a lawsuit demanding that the Detroit Police Department make changes to its use of facial recognition technology. 

The lawsuit was filed after Robert Williams was arrested by the Detroit Police and held in a cell for 30 hours until law enforcement realized the facial recognition system had identified the wrong suspect. 

The ACLU has opposed law enforcement’s use of facial recognition technology, saying it perpetuates racial bias against people of color. 
