Clearview AI, the facial recognition software maker, on Monday settled a lawsuit brought by the American Civil Liberties Union and agreed to limit its face database in the United States primarily to government agencies and not allow most American companies to have access to it.
Under the settlement, which was filed with an Illinois state court, Clearview will not sell its database of what it said were more than 20 billion facial photos to most private individuals and businesses in the country. But the company can largely still sell that database to federal and state agencies.
The agreement is the latest blow to the New York-based start-up, which built its facial recognition software by scraping photos from the web and popular sites, such as Facebook, LinkedIn and Instagram. Clearview then sold its software to local police departments and government agencies, including the FBI and Immigration and Customs Enforcement.
But its technology has been deemed illegal in Canada, Australia and parts of Europe for violating privacy laws. Clearview also faces a provisional $22.6 million fine in Britain, as well as a 20 million euro fine from Italy’s data protection agency.
“Clearview can no longer treat people’s unique biometric identifiers as an unrestricted source of profits,” said Nathan Freed Wessler, a deputy director with the ACLU’s Speech, Privacy, and Technology Project, in a statement about the settlement. “Other companies would be wise to take note, and other states should follow Illinois’ lead in enacting strong biometric privacy laws.”
Floyd Abrams, a First Amendment expert hired by Clearview to defend the company’s right to gather publicly available information and make it searchable, said the company was “pleased to put this litigation behind it.”
“To avoid a protracted, costly and distracting legal dispute with the ACLU and others, Clearview AI has agreed to continue to not provide its services to law enforcement agencies in Illinois for a period of time,” he said.
The ACLU filed its lawsuit in May 2020 on behalf of groups representing victims of domestic violence, undocumented immigrants and sex workers. The group accused Clearview of violating Illinois’s Biometric Information Privacy Act, or BIPA, a state law that prohibits private entities from using citizens’ bodily identifiers, including algorithmic maps of their faces, without consent.
“This is a huge win for the most vulnerable people in Illinois,” said Linda Xóchitl Tortolero, a plaintiff in the case and head of Mujeres Latinas en Acción, an advocacy group for survivors of sexual assault and domestic violence. “For a lot of Latinas, many who are undocumented and have low levels of IT or social media literacy, not understanding how technology can be used against you is a huge challenge.”
One of Clearview’s sales methods was to offer free trials to potential customers, including private businesses, government employees and police officers. Under the settlement, the company will have a more formal process around trial accounts, ensuring that individual police officers have permission from their employers to use the facial recognition app.
Clearview is also prohibited from selling to any Illinois-based entity, private or public, for five years as part of the agreement. After that, it can resume doing business with local or state law enforcement agencies in the state, Mr. Wessler said.
In a key exception, Clearview will still be able to provide its database to U.S. banks and financial institutions under a carve-out in BIPA.
The settlement does not mean that Clearview cannot sell any product to corporations. It will still be able to sell its facial recognition algorithm, without the database of 20 billion images, to companies. Its algorithm helps match people’s faces to any database that a customer provides.
As part of the settlement, Clearview did not admit any liability and agreed to pay $250,000 in attorneys’ fees to the plaintiffs. The settlement is subject to approval by an Illinois state judge.