July 5, 2022

The Atlanta Journal-Constitution | June 30, 2022

County Commission chairwoman: more details needed before vote on three-year agreement with Clearview AI.

The Cobb County Police Department wants to take a step forward with its use of face recognition technology.

The department is currently using the controversial technology on a trial basis. But it has asked the County Commission to approve a three-year contract with Clearview AI — a company that has come under fire for collecting images on social media and gathering biometric data without consent.

Like hundreds of other police departments in the U.S., Cobb County police officers use face recognition powered by artificial intelligence to help identify suspects in investigations. An officer can submit an image to software that scans it and finds matching faces in a database, creating new leads and, advocates say, helping solve crimes much more quickly in some cases.

Cobb police officials declined interview requests for this story but issued a statement saying face recognition has helped solve a murder case that had been unresolved for several years. The statement also said officers “only use facial recognition for official law enforcement business, and each use must have a legitimate reason for the search in the incident report as well as the database.”

Cobb police use the technology “to enhance public safety while continuing to act with regard to the basic privacy of the public we serve,” the statement says.

But some lawmakers and privacy activists have concerns over how the images are obtained by Clearview AI, how the images are used by law enforcement, how the software can misidentify people of color and the lack of restrictions to curb the potential for misuse.

Use of biometric information is becoming more common in daily life — from face scanners in the airport to fingerprint unlocking features on smartphones. Face prints, fingerprints, eye scans and voice recognition are some forms of biometric data that are unique to each individual and can be used as identifiers.

Clearview AI has collected over 10 billion public images from the internet, including social media networks, into a database that law enforcement agencies can pay to access. The company’s artificial intelligence software creates a unique print for each individual based on face structure. The face prints can be matched against other images to identify individuals, sometimes in seconds.
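At a technical level, that matching step typically comes down to comparing numerical "face print" vectors. The sketch below is a minimal, hypothetical Python illustration of the idea, using randomly generated vectors and a simple cosine-similarity comparison; the record IDs, threshold and data are invented for illustration and do not represent Clearview AI's actual software or any police system described in this story.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face-print vectors (1.0 means same direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_matches(query_print, database, threshold=0.9):
    """Return database records whose stored face prints resemble the query.

    `database` maps a record ID (for example, a booking-photo number) to the
    face-print vector previously computed for that image. A match is only a
    lead, not proof of identity.
    """
    scores = [(record_id, cosine_similarity(query_print, stored))
              for record_id, stored in database.items()]
    matches = [(record_id, score) for record_id, score in scores if score >= threshold]
    return sorted(matches, key=lambda pair: pair[1], reverse=True)

# Hypothetical data: 1,000 records with 128-dimensional random "face prints".
rng = np.random.default_rng(0)
database = {f"booking-{i:04d}": rng.normal(size=128) for i in range(1000)}

# A query that is a slightly noisy re-capture of one stored record.
query = database["booking-0042"] + rng.normal(scale=0.05, size=128)

print(find_matches(query, database)[:3])  # strongest candidate leads first
```

In a real system, the face prints would come from a trained neural network rather than random numbers, and a high similarity score would still only suggest an investigative lead, which is consistent with how Cobb police officials describe treating the results.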

Christopher Bruce, an attorney with the American Civil Liberties Union of Georgia, said police should not have access to that much information.

“Police officers should not just be able to have this treasure trove of your data, your whereabouts, who you are, especially without you having consented to it,” he said.


In a public discussion posted online in 2021, Cobb County Police Chief Stuart VanHoozer said the department opts out of utilizing social media images and has a database of only “book-in photos.”

“Some systems do sift social media. We elect not to do that. We don’t do that, and our system doesn’t do that,” he said.

The department’s four-page policy on face recognition technology does not address which images officers have access to, but it does specify that they need more than a face match to make an arrest.

“We don’t make cases on a match,” VanHoozer said in the presentation. “All that tells us is a starting point.”

ACLU sues Clearview AI

The ACLU sued Clearview AI in 2020 under the Illinois Biometric Information Privacy Act, which prohibits companies from collecting biometric data without consent. The ACLU argued the company had violated the law by creating face prints from images without consent.

The settlement, reached in May, prohibits Clearview AI from allowing private companies nationwide to access its faceprint database, limiting access to law enforcement, government agencies and, in some cases, government contractors.

Nate Freed Wessler, a deputy director of the ACLU’s Speech, Privacy, and Technology Project, worked on the lawsuit against Clearview AI. He said even if the technology were 100% accurate, it would still raise concerns.

“Face recognition technology is both dangerous when it works and dangerous when it doesn’t work,” Wessler said. “It really gives the government an incredible, unprecedented power to instantaneously identify any of us.”

According to data collected by the Electronic Frontier Foundation's Atlas of Surveillance Project, police and sheriff's departments in 24 states employ facial recognition technology.

Some states have passed laws regulating the use of face recognition technology and other biometric data. This year alone, 11 states have introduced 19 bills on face recognition technology, and four have been enacted, according to a database created by the National Conference of State Legislatures (NCSL), a nonpartisan association that tracks such laws.

Georgia has not passed any regulations addressing the issue.

“Facial technology has become more accessible,” said AJ Wagner, a policy analyst for the NCSL. “It’s still seeing a lot of restrictions, but they’re not outright prohibitions.”

Cupid wants more information

Cobb Commission Chairwoman Lisa Cupid said in an interview that she asked the police chief to withdraw the contract request until he had spoken more with the commissioners about how the technology is used.

Cupid said the contract will not be voted on until police leadership can show “how this will work, and how it will not result in any disparate impacts to our citizens,” particularly people of color.


The technology can be particularly fallible when matching people of color, which can lead to wrongful arrests, Bruce said.

Experts also said that while false arrests are concerning, the technology will someday be nearly perfect, and that accuracy is just as dangerous.

“Inaccuracy is a problem, but inaccuracy alone isn’t the reason to beware of facial recognition technology,” said Deven Desai, a professor at Georgia Tech with a focus on law, ethics and technology. “The bigger problem is the way it can be abused and becomes an avenue to surveillance without due process.”

The Cobb County Police Department’s policy says officers must undergo training on the use of the technology; conduct peer reviews of each case using face recognition software; gather additional evidence; and receive supervisor approval before proceeding with charges.

The policy also says the technology cannot be used to conduct surveillance.

The Atlanta Police Department is the only other confirmed police agency in Georgia that uses face recognition. Its policy requires officers to go through two levels of approval before searching the database, to determine whether the technology is the appropriate tool for the investigation and whether the procedures have been followed.

But Wessler, the ACLU lawyer, said more safeguards are needed.

“If they’re going to use this technology at all, it needs to be highly constrained to only the most serious types of crimes, only with a search warrant issued by a judge based on probable cause,” he said.