Tuesday, 12 March 2019

ACLU: Florida Is Using Facial Recognition to Convict People Without Giving Them a Chance to Challenge the Tech

The Constitution requires the government to give defendants the chance to test technology used against them for bugs and biases.

If the government uses an error-prone face recognition system to identify you as the perpetrator of a crime, you have a constitutional right to probe its accuracy before you are convicted based on its results. But amazingly, a Florida appeals court disagrees.

So on Monday, the ACLU, the Electronic Frontier Foundation, the Georgetown Center on Privacy and Technology, and the Innocence Project filed a friend-of-the-court brief urging the Florida Supreme Court to address this important issue.

What’s at stake is not just the outcome of this case, but the very meaning of due process in the face of rapidly evolving police technology.

In 2015, two undercover cops purchased $50 of crack cocaine from a Black man on a Jacksonville street. Instead of arresting him on the spot, one officer used an old phone to snap several photos of him. Trying to be discreet, the officer took the photos while holding the phone to his ear and pretending to be on a call. Needless to say, the resulting photos were not headshot quality.

Later, after failing to identify the man themselves, the officers emailed the photos to a crime analyst, who used a statewide face recognition system to see if one of the photos looked like any mugshots in the county’s database. The program spit out several possible matches, the first of which was Willie Lynch, who was soon arrested and charged with the drug sale.
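To make concrete what the analyst's tool was doing: a typical face recognition search converts the probe photo into a numeric "embedding," compares it against the embeddings of every mugshot in the database, and returns the closest candidates ranked by similarity. Below is a minimal sketch in Python, assuming unit-length embeddings compared by cosine similarity; the function and parameter names are illustrative assumptions, not details of the system used in Lynch's case.

    import numpy as np

    def candidate_matches(probe_embedding, gallery, top_k=5):
        """Return the top_k gallery mugshots most similar to the probe.

        `gallery` maps a booking-photo ID to a precomputed face embedding
        (assumed here to be a unit-length numpy vector from some face
        recognition model). Illustrative sketch only.
        """
        scores = {
            photo_id: float(np.dot(probe_embedding, emb))  # cosine similarity
            for photo_id, emb in gallery.items()
        }
        # Rank every mugshot in the database by similarity. The analyst
        # sees only the top few, but the full ranked list always exists.
        ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
        return ranked[:top_k]

The point that mattered for Lynch's defense is visible in the sketch: a system like this always produces a ranked list of candidates, never a single answer, so the runner-up photos he later requested necessarily existed.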

Face surveillance tools are dangerous whether they are accurate or not. But that danger is magnified because many such algorithms produce biased and inaccurate results. These problems are well-documented in commonly used face recognition algorithms. Not only are the systems known to be less accurate when two photographs contain different lighting, shadows, backgrounds, poses, or expressions, but the systems have been shown to perform less accurately on photos of Black people, women, and younger people. This is partly because the data sets used to train the algorithms have historically been composed of face images that are not representative of the full diversity of the population.
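Disparities like these are typically measured by running a system against a labeled test set and computing its error rates separately for each demographic group. Here is a hedged sketch of that bookkeeping, assuming a hypothetical evaluation-data format rather than any particular vendor's tooling.

    from collections import defaultdict

    def false_match_rate_by_group(trials):
        """False match rate per demographic group on a labeled test set.

        `trials` is a list of (group, system_said_match, truly_same_person)
        tuples, a hypothetical format used here for illustration. A false
        match is an impostor pair (two different people) that the system
        nevertheless declared a match.
        """
        impostor_pairs = defaultdict(int)
        false_matches = defaultdict(int)
        for group, said_match, same_person in trials:
            if not same_person:
                impostor_pairs[group] += 1
                if said_match:
                    false_matches[group] += 1
        return {group: false_matches[group] / impostor_pairs[group]
                for group in impostor_pairs}

When a system's training data underrepresents a group, this per-group false match rate is where the gap shows up.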

Last year, the ACLU of Northern California highlighted how this bias plays out in Amazon's face recognition system, Rekognition, which the company has been aggressively marketing to law enforcement. The program falsely matched photographs of 28 members of Congress with mugshot photos. While people of color made up approximately 20 percent of Congress, they constituted nearly 40 percent of the false matches returned by the algorithm.
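The arithmetic behind that disparity is simple. Taking the reported figures, 28 false matches of which roughly 11 (the count consistent with "nearly 40 percent") were people of color, against a roughly 20 percent share of Congress:

    # Back-of-envelope check on the figures above; both inputs are
    # approximations from the ACLU test, not exact counts.
    false_matches = 28
    false_matches_people_of_color = 11   # consistent with "nearly 40 percent"
    share_of_congress = 0.20             # approximate share who are people of color

    share_of_false_matches = false_matches_people_of_color / false_matches
    print(f"{share_of_false_matches:.0%} of false matches")                      # 39%
    print(f"{share_of_false_matches / share_of_congress:.1f}x their share")      # 2.0x

In other words, the algorithm's errors fell on people of color at roughly twice their rate of representation.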

All of these factors converge in Lynch's case. Lynch is a Black man, and the cop's photos weren't fit for Instagram, much less for taking away a man's liberty.

Moreover, in Lynch's case, the algorithm expressed only one star of confidence that it had generated the correct match. (The system rates the likelihood of a match by assigning stars to the resulting photos.) When Lynch's defense attorney questioned the analyst who used the system before trial, she explained that she did not understand how the face recognition algorithm worked. Yet, even without that understanding, a mere one-star confidence rating was enough for the analyst to send Lynch's photo — along with his prior criminal history — on to the officers, asserting that he was the likely perpetrator. The officers accepted her suggestive recommendation without further investigation.
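A star rating like this is just a coarse bucketing of some underlying similarity score. Something like the following hypothetical sketch, with thresholds invented for illustration, since the real system's scoring rules were never disclosed:

    def star_rating(similarity, thresholds=(0.30, 0.45, 0.60, 0.75, 0.90)):
        """Map a raw similarity score onto a 1-to-5 star rating.

        The thresholds are invented for illustration; the actual system's
        scoring rules were never disclosed to Lynch's defense.
        """
        stars = sum(1 for t in thresholds if similarity >= t)
        return max(stars, 1)  # any returned candidate shows at least one star

On a scale like this, one star means only that the score cleared the lowest cutoff; without knowing the thresholds, neither the analyst nor the defense could say how weak the match actually was.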

At trial, the entire basis of the state’s case against Lynch was the officers’ testimony that they recognized him as the man who sold them drugs. Lynch’s defense was that they had the wrong guy, and so he needed to be able to test the reliability of the officers’ claim.

Because the officers identified him based only on the results of the face recognition program, examining how the algorithm functioned and whether errors or irregularities in the procedure tainted the officers' identification was critical to his case. But when Lynch asked for the other booking photos the program returned as possible matches, both the government and the court refused. Their refusal violated the Constitution, which requires prosecutors to disclose information favorable to a defendant.

And that requirement makes sense. Imagine you were a juror in Lynch’s trial, and instead of face recognition technology, the government relied on a witness who saw the drug sale occur. If the witness testified that she was “one-star confident” that Lynch was the drug seller, wouldn’t you want to know if she had thought any other individuals, besides Lynch, were equally likely to be the guy? And wouldn’t you want to know what exactly she meant by “one-star confident”?

The U.S. Supreme Court decided long ago that due process requires prosecutors to give jurors and defendants full access to this sort of information to ensure that defendants can make their case and so that judges and jurors can make fully informed decisions on grave questions of guilt and innocence. Getting an algorithm involved doesn’t change these fundamental principles.

And these concerns aren’t limited to this case. In Florida alone, more than 240 law enforcement agencies use the face recognition algorithm at issue in this case, with more than 5,000 users conducting up to 8,000 searches per month. You might expect that this would have generated a flood of cases where defense attorneys challenged the accuracy of the system. But as far as we can tell, Lynch’s case is the first of its kind.

Prosecutorial misconduct and police adoption of face recognition technology are both dangerous, and the ACLU has been pushing to halt them. Until that happens, prosecutors in places where face recognition technology has already been deployed must give defendants full access to information about the algorithms used against them. This includes the underlying model, training data, computer code, explanatory documentation, and any other results from which the final, reported result was chosen. Any validation studies should also be disclosed, along with the opportunity to question the people who built and operate the system.

Only then can defendants identified by face recognition truly get a fair trial — a right that prosecutors are obligated to facilitate. We hope that the Florida Supreme Court recognizes how important this is for the criminal justice system.



Published March 13, 2019 at 02:45AM
via ACLU https://ift.tt/2T1Z1Ps
