Milwaukee Police Propose Trading Millions of Mugshots for Free Facial Recognition Access


The proposal raises serious privacy and ethical questions.

From its development to its application, the facial recognition industry is rife with shady practices. But things can always get shadier. In Milwaukee, police are now considering an almost cartoonishly evil deal: trade 2.5 million mugshots to a private company in exchange for free access to facial recognition software. On Friday, the Milwaukee Journal Sentinel reported that police officials announced the potential deal at the city’s Fire and Police Commission meeting last week. Per the outlet, Milwaukee police have previously borrowed access to facial recognition technology from neighboring agencies.



With this deal, the department would receive two free search licenses from Biometrica, a software firm that already works with law enforcement agencies in the United States, in exchange for mugshots and jail records spanning decades. Although Biometrica’s plans for these mugshots aren’t confirmed, it will likely use them to train its software. Biometrica did not respond to Gizmodo’s request for comment.

After all, facial recognition is regularly trained on stolen or borrowed datasets. For example, Clearview AI scraped millions of photos from social media for the database it sells to police, and PimEyes stole pictures of dead people for its algorithm. The National Institute of Standards and Technology also maintains its own mugshot database, along with images of vulnerable people, for facial recognition testing.

In an email to Gizmodo, Milwaukee police confirmed that the department has not entered into any contract yet and plans to continue the discussion at future city meetings. A representative wrote that “being transparent with the community that we serve far outweighs the urgency to acquire.”

Even without a firm deal, the proposal alone rings alarm bells. Facial recognition’s inaccuracies at identifying dark-skinned people (especially if they are women or non-binary) are well-documented. Unsurprisingly, it has led to “multiple wrongful arrests…due to police reliance on incorrect face recognition results — and those are just the known cases,” David Gwidt, a spokesperson for the American Civil Liberties Union of Wisconsin, told Gizmodo via email.

“In nearly every one of those instances, the person wrongfully arrested was Black.” That’s not the only issue with this deal, though. As of now, the proposed agreement mentions nothing about informing individuals, receiving their consent, or allowing them to opt out.

Like most states, Wisconsin doesn’t have any specific biometric privacy laws. Of the few that exist, only Illinois expands its regulations beyond solely addressing commercial use. The only firm legislation to refer back to is on how mugshots are regulated.

Generally, mugshots are public records, and Wisconsin is an open records state, so arrest records, including mugshots, are available to the public with limited exceptions. Although all of this suggests that Milwaukee police aren’t legally required to notify individuals or obtain consent, it’s still sketchy. Set aside that many people simply don’t want their faces used to train surveillance technology.

Facial recognition companies aren’t immune to security issues like data breaches. Per Forbes, biometric breaches can expose people to identity theft or be used to bypass other security systems. It’s not like people can just change their face.

Which raises the question: Should the Milwaukee police be able to take this risk on someone else’s behalf? The United States has an established history of skirting ethics and exploiting marginalized communities, especially in the name of advancing technology. Hello, Tuskegee. This deal would simply continue that legacy in a digital context.

As Jeramie Scott, Senior Counsel at EPIC, told Gizmodo via email, “The irony here is that the Milwaukee police are considering offering millions of mugshots that most likely are disproportionately of people of color in order to train a surveillance technology that will likely be used disproportionately on people of color.” Furthermore, Scott noted that doing so would “exacerbat[e] the historical racial inequalities in the criminal justice system.” Comprehensive federal regulation on facial recognition is unlikely to come anytime soon.

Although Wisconsin’s capital, Madison, banned the technology in 2020, the state itself has no such law, and Milwaukee doesn’t regulate the police department’s existing surveillance technology either. In Scott’s eyes, “The safest thing to do would be to not go forward with this deal and for the Milwaukee police to refrain from using the technology, particularly when there are no laws in place to strictly limit its use and provide meaningful safeguards.” Last week, the local ACLU called on Milwaukee to place a two-year pause on any new surveillance technology.

It also asked that the city develop regulations for existing surveillance tools while providing opportunities for community members to weigh in. Although Milwaukee’s police department says it will craft a policy to ensure no one is arrested solely on the basis of a facial recognition match, there’s nothing to hold it accountable.