
Clearview AI to stop selling facial recognition tool to private firms


A facial recognition company whose massive database was used to identify Russian soldiers killed in Ukraine has agreed to halt sales to private companies in the United States, in a landmark settlement restricting a technology that critics say threatens Americans’ right to privacy.

The settlement, filed Monday in federal court in Illinois, marks the most significant legal action to date against Clearview AI, a company known for downloading billions of photos of people from social media and other websites to build a face-search database sold to law enforcement.

It also highlights how a single state’s privacy law can have nationwide ramifications for protecting Americans’ civil rights. The lawsuit, filed by the American Civil Liberties Union in 2020, accused Clearview of violating an Illinois law that prohibits companies from sharing people’s facial photos, fingerprints and other biometric information without their consent.

“This is a real vindication of states’ ability to protect people from the worst forms of abusive corporate surveillance,” said Nathan Freed Wessler, an attorney and deputy director of the ACLU’s Speech, Privacy and Technology Project.


In a statement, Clearview’s attorney, the prominent First Amendment lawyer Floyd Abrams, welcomed the end of the litigation and said the settlement would not force the company to change its current business model. But he also acknowledged that the lawsuit had changed the way the company does business, saying Clearview agreed not to work with agencies in Illinois “to avoid a protracted, costly and distracting legal dispute.”

New York-based Clearview argued in court that the Illinois law restricted the company’s ability to collect and analyze publicly available information and therefore violated its right to free speech under the First Amendment.

Clearview chief executive Hoan Ton-That said in a statement that the company intends to sell its facial recognition algorithm, without its face database, to commercial customers on a “consent basis.”


Many such algorithms are already on the market and require customers to supply their own databases, as when a company wants only its employees to be able to unlock secure doors with a face scan. Clearview, however, has long promoted its face database as its distinguishing feature, and the settlement could significantly limit its future prospects.

The Illinois law, passed in 2008, led to several major technology privacy settlements, including a $650 million settlement from Facebook related to its use of facial recognition.


The United States has no federal law on facial recognition, even though the technology has been used by thousands of local, state and federal law enforcement agencies, including to charge Americans with crimes.

The United States “doesn’t have a comprehensive privacy law, even one that protects those most sensitive and immutable identifiers,” like people’s faces, Wessler said. “Congress should act — and, until they can, more states should pick up the slack.”

As part of the settlement, which will become final once approved by the court, Clearview has agreed to stop selling or offering free access to its facial recognition database to most businesses and other private entities nationwide.

The company also agreed to stop working with any law enforcement or government agencies in Illinois for five years and to continue trying to filter out photos that were taken in or uploaded from the state.

Clearview has created an opt-out form that Illinois residents can use to request that their photos not appear in its search results. The company said it would spend $50,000 on online ads publicizing the form. It offers a similar request form for California residents covered by that state’s Consumer Privacy Act.

Clearview also said it would stop offering free trial accounts to police officers without their supervisors’ approval. Those accounts had allowed individual officers to run searches outside their agencies’ investigative protocols and chain of command, which had become what Wessler called “a veritable recipe for abuse.”



The Government Accountability Office, a federal watchdog, said last year that 13 federal agencies did not know which facial recognition systems their own employees were using, meaning the agencies had “therefore not fully assessed” the potential privacy and system accuracy risks.

The ACLU sued Clearview on behalf of groups representing immigrants, sex workers and survivors of domestic violence, arguing that they suffered extraordinary harm as a result of the police identification tool.

Illinois’ biometric information privacy law offers the strongest protections in the nation for people’s biometric data, and no other state has passed a law that goes as far. The Health Insurance Portability and Accountability Act, or HIPAA, limits how hospitals and other “covered entities” share people’s health information, but it does not cover the collection or sharing of biometric data by tech companies.

Facebook agreed in 2020 to pay $650 million to settle a class-action lawsuit accusing it of violating the Illinois law, and last year it said it would shut down its widely used facial recognition system and delete the facial data of more than a billion people, citing “growing concerns about the use of this technology as a whole.”

The settlement comes as Clearview races to woo investors and raise tens of millions of dollars to expand its business around the world. In an investor presentation from December, first reported by The Washington Post, the company said it hoped to increase sales to private companies in financial services, real estate, the “gig economy” and other industries, and that it was working to expand its face database to 100 billion photos so that “almost everyone in the world will be identifiable.”


Clearview had for months offered its search tool to stores and other private businesses, but it has since limited the tool to police and government agencies. The company has offered to build other facial recognition products for private businesses, including the kinds of identity-verification systems used to unlock doors or access bank accounts, and has said those tools would not rely on its main law enforcement database.

Clearview’s database now includes more than 20 billion photos taken from the Internet, and its search tool lets users submit a photo and receive links to the websites or social media accounts where matching images originally appeared.

The system has been used by police in the United States to identify protesters and criminal suspects, including rioters at the U.S. Capitol on January 6, 2021. In Ukraine, officials have used it to identify dead Russian soldiers, find their social media accounts and contact their families.

Facebook, Google and other big tech companies have sent Clearview cease-and-desist letters demanding that it stop collecting images from their sites, but Clearview refused. “I don’t think we want to live in a world where any big tech company can send out a cease and desist and then control, you know, the public square,” Ton-That told The Post in a live interview last month.

“Clearview built a product that no other company wanted to build, because of its dangers, and this settlement vindicates the decision” by Google, Amazon and other companies to suspend or end plans to sell facial recognition systems for business or police use, Wessler said. “Other companies should take note. Violating people’s information privacy rights is not free. They will end up being held liable, at great financial and reputational cost.”


