From introducing the PSTI Bill to strengthen the security of IoT devices to collaborating with other countries to boost cybersecurity, the U.K. government is in full action to thwart growing security incidents in the country. Recently, the Information Commissioner’s Office (ICO) in the U.K. announced a potential fine of £17 million ($22.6 million) on Clearview AI Inc for violating data protection laws. The agency also issued a provisional notice ordering Clearview to stop processing U.K. users’ personal data and to delete any such data in its possession.
Clearview AI is a facial recognition platform that provides software to companies, law enforcement, universities, and individuals.
The penalty comes after a joint investigation by the ICO and the Office of the Australian Information Commissioner (OAIC), which revealed that Clearview had violated privacy laws by harvesting users’ sensitive information without their consent and by unfair means. The investigation found that Clearview scraped users’ images from the internet and derived biometric details from them for its facial recognition platform.
Customers of Clearview provide an image to the company, which carries out biometric searches, including facial recognition searches, on their behalf, matching the image against a database of over 10 billion facial images to identify relevant results.
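At a high level, this kind of search typically compares a numeric "embedding" of the probe image against precomputed embeddings of the database images. The sketch below is a minimal, hypothetical illustration of that matching step using cosine similarity; the vectors, names, and threshold are invented for the example and do not reflect Clearview’s actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def search(probe: np.ndarray, database: dict,
           threshold: float = 0.8) -> list:
    """Return database entries whose embeddings match the probe's
    embedding above the similarity threshold, best matches first."""
    hits = [(name, cosine_similarity(probe, vec))
            for name, vec in database.items()]
    return sorted([h for h in hits if h[1] >= threshold],
                  key=lambda h: h[1], reverse=True)

# Toy database of precomputed embeddings; in a real system these would
# come from a face-embedding model applied to scraped photos.
db = {
    "image_a": np.array([0.9, 0.1, 0.0]),
    "image_b": np.array([0.0, 1.0, 0.0]),
}
probe = np.array([1.0, 0.0, 0.05])
print(search(probe, db))  # only "image_a" clears the threshold
```

Real deployments replace the linear scan with an approximate nearest-neighbor index, since scanning billions of vectors one by one would be far too slow.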
“The images in Clearview AI Inc’s database are likely to include the data of a substantial number of people from the U.K. and may have been gathered without people’s knowledge from publicly available information online, including social media platforms. The ICO also understands that the service provided by Clearview AI Inc was used on a free trial basis by a number of U.K. law enforcement agencies, but that this trial was discontinued and Clearview AI Inc’s services are no longer being offered in the U.K.,” the ICO said in a statement.
According to the ICO, Clearview has failed to comply with the U.K. data protection laws in several ways, including:
- Failing to process the information of people in the U.K. in a way they are likely to expect or that is fair
- Failing to have a process in place to stop the data from being retained indefinitely
- Failing to have a lawful reason for collecting the information
- Failing to meet the higher data protection standards required for biometric data (classed as ‘special category data’ under the GDPR and U.K. GDPR)
- Failing to inform people in the U.K. about what is happening to their data
- Asking for additional personal information, including photos, which may have acted as a disincentive to individuals who wish to object to their data being processed
Clearview Triggered Identity Threats: OAIC
Earlier, Australia’s privacy watchdog, the OAIC, stated that Clearview had not taken any steps to stop collecting scraped images of Australians, generating image vectors from those images, and disclosing matched images of Australians to its registered users. The agency stated that Clearview’s intrusive practices exposed Australians to identity threats and raised security concerns among government officials.
Commenting on the proposed fine, the U.K. Information Commissioner, Elizabeth Denham, said, “I have significant concerns that personal data was processed in a way that nobody in the U.K. will have expected. It is, therefore, only right that the ICO alerts people to the scale of this potential breach and the proposed action we’re taking. U.K. data protection legislation does not stop the effective use of technology to fight crime but to enjoy public trust and confidence in their products, technology providers must ensure people’s legal protections are respected and complied with.
“Clearview AI Inc’s services are no longer being offered in the U.K. However, the evidence we’ve gathered and analyzed suggests Clearview AI Inc was, and may continue to be, processing significant volumes of U.K. people’s information without their knowledge. We, therefore, want to assure the U.K. public that we are considering these alleged breaches and taking them very seriously.”