The headline of this story is that the UK’s Information Commissioner’s Office (ICO) has fined Clearview AI Inc £7,552,800 for using images of people collected from websites and social media. Those images could have been yours and you wouldn’t know. The images have been used to build a global online database for facial recognition, which is in use worldwide.
The 46-page decision is interesting in that the Commissioner makes it clear why he thinks the ICO has jurisdiction over a US company (registered in New York State) that, as far as we know, currently does no business in the UK. Clearview, which cooperated with the ICO throughout, disagreed.
The ICO has also issued an enforcement notice, requiring Clearview AI to stop obtaining and using the personal data of UK residents that is publicly available on the internet, and to delete the data of UK residents from its systems.
Clearview AI has trialled its services with various police forces in the UK, or at least I assume they are police forces:
“The Commissioner understands that at least 5 UK law enforcement organisations used the service during the UK Test Phase, and that some of them returned matches for individuals of interest to them.”
The returned matches were considered confirmation that the data of UK residents was held in Clearview’s databases.
The Commissioner also said he was satisfied, on the balance of probabilities, that Clearview is not currently offering services to customers established in the UK (whether in the law enforcement sector or otherwise).
It will be interesting to see what Clearview AI does next. Do they appeal, or do they refuse to pay the fine? And if they refuse, what does the ICO do then?
We shouldn’t underestimate what is described as the “dissuasive” impact of the decision, in a world where web scraping is becoming a matter of public policy and our photos are everywhere.