The mental picture of police dusting for fingerprints may be the most iconic image of 20th-century law enforcement. Gone are the days when investigators would rely on such quaint procedures, as biometrics – the use of each individual’s unique biological features for identification – is advancing at breakneck pace. Lawmakers have yet to catch up to the tech companies developing tools like facial recognition software, but that hasn’t prevented law enforcement from deciding it needs the technology – in the most covert way possible.
While taking fingerprints and other personal information has been subject to strict scrutiny and oversight, no such restrictions have been placed on artificial intelligence software that can identify and collect data on members of the public from their facial features – whether they are criminals or not.
London resident Chris Johnson was stopped by police because he covered his face after learning that facial recognition technology was active in the area; eventually he was fined around $120 for swearing at an officer who confronted him. While Johnson’s case made it to the newspapers, the reality is that most people captured by this technology have no idea that it has even happened – especially in the U.S.
“I’m Not a Criminal”
Since 2016, London’s Metropolitan Police have undertaken a series of trials using facial recognition stations placed around the city and elsewhere in England. Privacy advocate groups Liberty and Big Brother Watch have objected to the trials on the grounds that they violate residents’ privacy and threaten freedom of speech. “It’s not a trial because people aren’t consenting to being involved,” said Hannah Couchman of the group Liberty. “It’s just a deployment of a technology that hasn’t been consented to, hasn’t been debated in parliament, there’s no law, there’s no framework. It’s an enormous risk to human rights.”
While the police have repeatedly insisted they have made the monitored zones known to the public by handing out leaflets and posting signs, passers-by surveyed by the Independent newspaper said they didn’t realize they had just been scanned. “One of Liberty’s key concerns is that this is supposed to be a trial, but by the time you’re informed that it’s happening you’re probably already on camera,” said Couchman.
One man who saw signs saying that the area was under facial recognition surveillance decided to pull his shirt up over his face but was then targeted by police. Johnson, who was fined for public disorder after voicing objections to his face being analyzed, said, “I don’t want my face on anything – I’m not a criminal. If I want to cover my face, I’ll cover my face, but the police officer, because I swore, gave me a fine – all because I decided to walk down the street and I didn’t want them looking at my face, which I think I’ve got a right to do.”
A statement previously released by the Met Police said, “While anyone who declines to be scanned will not necessarily be viewed as suspicious, officers will use their judgement to identify any potential suspicious behavior.” According to Silkie Carlo, director of Big Brother Watch, however, “Some of the officers here have told us that if a person turns away from the van because they don’t want to be scanned, which is their right, that they will quite likely be stopped and potentially ID’d.” Couchman also claimed that police officers had informed her that “people looking nervous and walking away, or covering their faces,” could prompt further action.
Do You Trust the FBI?
But London police look like amateurs compared to the FBI, which, according to the Government Accountability Office, has access to a whopping 411 million facial images of Americans – most of them law-abiding citizens. What’s more, the Bureau didn’t feel the need to inform the public about this, or the fact that it was using the photos to identify persons of interest.
In 2010, the FBI began developing its $1.2 billion Next Generation Identification (NGI) database, which seeks to use biometric tools, including facial recognition software. Far from keeping the public informed about its investigative procedures, the agency used facial recognition technology for five years without submitting the legally required privacy impact assessment, a failure that prompted the House Committee on Oversight and Reform to conduct a hearing in 2017 to examine the implications of the technology. At that hearing, Jason Chaffetz (R-UT), then the committee chairman, grilled Kimberly Del Greco, deputy assistant director of the FBI’s Criminal Justice Information Services Division; the full proceedings can be viewed on the committee’s official channel.
While the FBI officially keeps only mugshot photos, states have provided it with access to the facial data of millions of Americans in the form of driver’s license photos. As of 2017, the FBI had agreements with 16 states, which would provide the agency with photo information of residents – with multiple states in the process of negotiating similar arrangements.
“I have zero confidence in the FBI and the [Justice Department], frankly, to keep this in check,” said Rep. Stephen Lynch (D-MA) at the hearing. “This is really Nazi Germany here, what we’re talking about,” he added, “and I see little difference in the way people are being tracked under this, just getting one wide net and getting information on all American citizens.” He recommended that at the very least, law enforcement should be required to obtain warrants to conduct facial recognition searches.
Naturally, this slap on the wrist didn’t curtail the FBI’s trajectory, and in January 2019, it was revealed that the agency has also been piloting Amazon’s artificial intelligence “Rekognition” software, supposedly because its investigations into terrorist figures like Las Vegas shooter Stephen Paddock are too slow without the technology.
Mini Police States
On the local level, one-quarter of state and local police departments have access to facial recognition databases, which they search without a warrant or even reasonable suspicion of a crime, according to a report by the Center on Privacy and Technology at Georgetown Law. Real-time scanning is also in the works, and the ACLU discovered in 2018 that various regions, including Orlando, FL, have started testing Rekognition with real-time street cameras to detect suspects – reportedly, Amazon is practically giving away the system to police departments for $6-$12 per month. The sheriff’s department in Washington County, OR, has been using the software in an app that officers can use to take photos of residents’ faces, as has San Diego. Detroit and Massachusetts are known to have purchased similar software from other companies.
All this, without addressing the problem that the software isn’t always accurate – research shows accuracy is considerably lower when scanning dark-skinned and female faces, raising concerns not only over privacy and freedom but also over gender and racial discrimination. That finding could be offered as proof of institutional racism and patriarchy, perhaps a hollow “victory” for the intersectional crowd, and the ACLU has condemned the technology and launched a petition to stop the government from purchasing the software. Still, it’s hard to decide whether this technology is more dangerous when it’s accurate or when it’s inaccurate.
“It kinda seems like if you’re not committing a crime, then you don’t have anything to worry about,” one resident told NBC San Diego upon hearing that the local police have access to portable facial recognition software and are considering installing it in body-cams. But who has the power to decide what is a crime and who the criminals are? If the government machine holds the power to track and identify anybody it chooses in real time, a full-blown police state cannot be far behind.