- cross-posted to:
- europe@feddit.de
"A company which enables its clients to search a database of billions of images scraped from the internet for matches to a particular face has won an appeal against the UK’s privacy watchdog.
Last year, Clearview AI was fined more than £7.5m by the Information Commissioner’s Office (ICO) for unlawfully storing facial images."
Privacy International (who helped bring the original case, I believe) responded to this on Mastodon:
"The first 33 pages of the judgment explain with great detail and clarity why Clearview falls squarely within the bounds of GDPR. Clearview’s activities are entirely “related to the monitoring of behaviour” of UK data subjects.
In essence, what Clearview does is large-scale processing of a highly intrusive nature. That, the Tribunal agreed.
BUT in the last 2 pages the Tribunal tells us that because Clearview only sells to foreign governments, it doesn’t fall under UK GDPR jurisdiction.
So Clearview would have been subject to GDPR if it sold its services to UK police or government authorities or commercial entities, but because it doesn’t, it can do whatever the hell it wants with UK people’s data - this is at best puzzling, at worst nonsensical."
Wouldn’t a UK court only concern itself with the activities of a company operating in the UK? If this company does not operate in the UK, I’m surprised it got far enough to need overturning.
Because it operates on the data of UK residents.
The internet has made everything really weird in terms of jurisdictions. You can have photos of UK citizens taken in the UK and stored on a UK server, and if a company from somewhere else scrapes the data without permission and moves it out of the UK, that doesn’t obviously mean it’s now fine to use for whatever.
Now of course the law has to have some jurisdictional limits, but it’s not surprising that there has been some disagreement about where they are.
It’s because it’s the Data Protection Act, which is the UK implementation of the GDPR.