
Clearview AI’s massive fine for GDPR violations — and what it means
“Move fast and break things.”
“It’s easier to ask for forgiveness than to get permission.”
“F*** around and find out.”
What inspired this somewhat circular set of clichés to start bouncing around in my head is the news from early September that Clearview AI, which provides facial-recognition software and services, has been fined €30.5 million (roughly $34 million) by the Dutch Data Protection Authority (DPA) for violating the EU’s General Data Protection Regulation (GDPR).
In case you’re not fully up to speed on GDPR, it is one of the most stringent data-protection regulations in the world, and it took effect in 2018. Its basic premise is that if you collect and store any personal data about any EU resident, you must have that person’s permission, and you must treat the data very carefully to ensure it stays secure.
GDPR also restricts transferring EU residents’ personal data outside the EU unless strict safeguards are in place; that’s one reason Barracuda now has considerably more data storage infrastructure in Europe than we did before GDPR took effect.
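To make the residency point concrete, here’s a minimal sketch, using AWS S3 via boto3 purely as a stand-in, of pinning a storage bucket to an EU region so data written to it stays there by default. The bucket name and region are hypothetical examples, not a description of Barracuda’s actual infrastructure.

```python
# Illustrative sketch only: pin customer data storage to an EU region.
# Bucket name and region are hypothetical, not Barracuda's real setup.
import boto3

REGION = "eu-central-1"
BUCKET = "example-eu-customer-data"  # hypothetical bucket name

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket in the EU region so objects written to it stay there by default.
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Sanity check: confirm where the bucket actually lives before storing EU personal data.
location = s3.get_bucket_location(Bucket=BUCKET)["LocationConstraint"]
assert location == REGION, f"Bucket is in {location}, expected {REGION}"
```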
Billions of faces
Clearview AI’s breathtaking fine was imposed because, according to the DPA, the company built an “illegal database with billions of photos of faces.” As the company’s own website states, “Our platform, powered by facial recognition technology, includes the largest known database of 50+ billion facial images sourced from public-only web sources, including news media, mugshot websites, public social media, and other open sources.”
This is Clearview AI’s business model: Build a huge database of faces with associated names, analyze the images to assign a unique biometric code to each face, and then deliver a service to police, military, and intelligence organizations that lets them scan new images — say, surveillance footage, or photos from a political protest — to identify the people in them.
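To make that pipeline a little more concrete, here is a minimal sketch of the matching step, assuming every face has already been reduced to a fixed-length embedding vector by some recognition model. The embedding size, similarity threshold, and data below are placeholder assumptions, not details of Clearview AI’s actual system.

```python
# Toy sketch of embedding-based face matching. All data here is synthetic, and the
# threshold and vector size are arbitrary assumptions for illustration.
import numpy as np

def cosine_similarity(query: np.ndarray, gallery: np.ndarray) -> np.ndarray:
    """Similarity between one query vector and each row of a gallery matrix."""
    query = query / np.linalg.norm(query)
    gallery = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    return gallery @ query

# Hypothetical enrolled database: N face embeddings with associated identities.
gallery_embeddings = np.random.rand(1000, 512).astype(np.float32)  # placeholder vectors
gallery_names = [f"person_{i}" for i in range(1000)]                # placeholder labels

def identify(query_embedding: np.ndarray, threshold: float = 0.6) -> str | None:
    """Return the best-matching identity, or None if nothing is close enough."""
    scores = cosine_similarity(query_embedding, gallery_embeddings)
    best = int(np.argmax(scores))
    return gallery_names[best] if scores[best] >= threshold else None
```

At the scale of tens of billions of images, this brute-force comparison would almost certainly be replaced by an approximate nearest-neighbor index, but the principle is the same: a new photo becomes a vector, and that vector is matched against everyone already in the database.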
Clearly this can have positive uses, such as helping to solve crimes. Just as clearly, it can be put to more sinister uses, such as helping a dictatorial regime identify and punish political opponents.
Flouting the law
The GDPR is very clear on this point: You cannot collect people’s data, including facial biometric data, without their knowledge and consent, and you must inform them fully about what the data will be used for. You also have to give individuals a way to access their data upon request.
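For teams building systems that touch personal data, those obligations turn into very concrete plumbing. Here is a toy sketch, with entirely hypothetical data structures and field names, of the two checks just described: verifying recorded consent for a stated purpose before processing, and answering a subject access request.

```python
# Toy sketch of consent checking and subject access requests.
# The data structures and field names are illustrative assumptions only.
from datetime import datetime, timezone

consent_records = {
    # subject_id -> the purposes the person agreed to, and when they agreed
    "subject-123": {"purposes": {"identity_verification"}, "granted_at": datetime.now(timezone.utc)},
}

personal_data_store = {
    "subject-123": {"name": "Example Person", "biometric_template": "<vector>"},
}

def may_process(subject_id: str, purpose: str) -> bool:
    """Only process data for a purpose the individual was informed of and agreed to."""
    record = consent_records.get(subject_id)
    return record is not None and purpose in record["purposes"]

def subject_access_request(subject_id: str) -> dict:
    """Return everything held about an individual so they can inspect it on request."""
    return personal_data_store.get(subject_id, {})
```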
For its part, Clearview AI claims that it is not subject to GDPR regulations because it does not have a place of business in the EU — which in my non-expert opinion is a completely disingenuous argument that doesn’t absolve them of anything.
According to Dutch DPA chairman Aleid Wolfsen, “We are now going to investigate if we can hold the management of the company personally liable and fine them for directing those violations. That liability already exists if directors know that the GDPR is being violated, have the authority to stop that, but omit to do so, and in this way consciously accept those violations.”
If you doubt that Clearview AI’s management knew they were violating GDPR and embraced it as a deliberate strategy, please contact me; I have a bridge to sell you.
No. Clearly this is an example of the first two aphorisms with which I opened this post. And if the company and its directors end up having to pay hefty fines, it also exemplifies the third one.
Like Uber moving aggressively into cities in obvious violation of their livery and limousine dispatch rules, Clearview AI likely hopes that by establishing itself as the only provider of its brand of service, and by building up a customer base of governments and their agencies, it can insulate itself from enforcement of privacy laws.
Wolfsen issued a warning: “Clearview breaks the law, and this makes using the services of Clearview illegal. Dutch organizations that use Clearview may therefore expect hefty fines from the Dutch DPA.” But just as regulators eventually had to accommodate riders once people got used to the convenience of ride-sharing, it’s hard to see police and intelligence agencies willingly giving up the power to put a name to any face in any photo.
Let’s get compliant
If nothing else, the €30.5 million fine imposed on Clearview AI demonstrates that European regulators are deadly serious about enforcing GDPR. So, for the sake of argument, let’s say that your organization would prefer to comply with GDPR and other data-privacy rules — and avoid the risk of being subjected to massive fines.
One important step is to make sure you know exactly where protected customer and other sensitive data is stored, and to remediate any potential exposures. Barracuda Data Inspector automatically scans your SharePoint and OneDrive data to identify sensitive data that’s improperly stored and to find and eliminate malware and other malicious files. It’s a great way to gain confidence that you’re staying compliant and not at risk of massive fines.
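To separate the idea from the product internals, here is a deliberately simplified sketch of that kind of discovery scan: walking a shared document store and flagging files that appear to contain protected personal data. The directory path and the patterns are illustrative assumptions, not how Data Inspector is implemented.

```python
# Simplified illustration of a sensitive-data discovery scan.
# Paths and regex patterns are examples only, not Data Inspector's implementation.
import re
from pathlib import Path

PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "iban": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_directory(root: str) -> list[tuple[str, str]]:
    """Return (file, pattern name) pairs for text files that appear to contain PII."""
    findings = []
    for path in Path(root).rglob("*.txt"):
        text = path.read_text(errors="ignore")
        for label, pattern in PII_PATTERNS.items():
            if pattern.search(text):
                findings.append((str(path), label))
    return findings

if __name__ == "__main__":
    for file, label in scan_directory("./shared-documents"):  # hypothetical local export
        print(f"Possible {label} found in {file}")
```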
