There have been a couple of late-breaking privacy news developments this morning, which both wholly obliterated my "wake-up-slow-and-sip-coffee" policy and are also worth discussing.
First, Facebook has decided to bar advertisers from targeting ads based on sensitive topic categories like religion, politics, or health conditions. For example, advertisers can no longer target terms like "chemotherapy," according to an Axios report. Which seems fair, no?
Second, Google won an appeal in a U.K. case over user privacy. As TechCrunch reports, the lawsuit alleged Google used a Safari workaround to override iPhone users' privacy settings in Apple's browser between 2011 and 2012. The case also sought class-action status. The headline here is that the court said, "Nay." Under U.K. data protection law, the judges said, each individual claimant must have suffered material damage or distress to be entitled to compensation. And that opens up a philosophical question we love to debate in privacy circles: When it comes to privacy, what does "harm" look like?
I'll admit to you here that I only get excited about privacy law and compliance here and there. The fact that Ohio's drafted a law that allows companies a safe harbor if they comply with NIST's framework? Somewhat cool and exciting. Or the idea that you have to notify people under China's new privacy law if you think it's even possible that authorized people gained access to their data? Woah. How does that even work?
But the philosophical aspects to privacy law, that stuff lights me up. Here's the thing: We don't have societal expectations for "privacy harm" yet. Not in the digital age. And, as Danielle Citron and Dan Solove write, "Countless privacy violations are left unremedied not because they are unworthy of being addressed but because of the failure to recognize harm."
The term "harm" is a heavy focus in academic and advocacy circles. Proving harm, sometimes called "injury in fact," is often the threshold plaintiffs can't get over. And part of the problem is that we don't share a standard definition. What feels like harm to me might not feel like harm to you.
Let's say there was a data breach. You and I both got notified that our data, including Social Security numbers, were accessed by an unauthorized entity. There's no evidence that someone has since used the data to steal our identity. And to you, that might mean no harm's been done. But I have a lot of anxiety. The thought alone that at any point in the future, someone could assume my identity and destroy my credit, among other damages, certainly would feel like tangible harm. I can't show you my anxiety over it, no. But maybe I'm losing sleep some nights and don't perform well at work the next day. Is that a harm?
The bottom line as confirmed by @UKSupremeCourt is that the right to compensation under UK data protection law can only be exercised by proving material damage or distress to each individual concerned. — Eduardo Ustaran (@EUstaran) November 10, 2021
So far, courts have had a hard time recognizing those types of intangibles. They are reluctant to base judicial decisions on something that merely could happen. Business owners: Before you get upset with me, understand that I'm not promoting allowing people to sue every time their feelings are hurt. The main goal should be holding bad actors responsible for bad actions -- not ambulance-chasing. But I do think the ideas we have about what's harmful in a data-driven world will evolve in the next five or 10 years. At least judicially.
It's an exciting space to watch. It's sure to evolve as courts and legislative bodies become savvier on privacy in a world where companies have stockpiles of data waiting to be exposed. After all, as the saying goes, "It's not if you're breached, it's when." And if there's no harm done when that data falls into the wrong hands, why even have privacy laws at all?
There's more on this at the links below, where I've rounded up the top privacy news stories. Enjoy reading, and I'll see you next week!
China’s new privacy law: How to comply
On Nov. 1, the brand-new Personal Information Protection Law (PIPL) came into force in the People’s Republic of China. It is a comprehensive, modern, globally consistent law on par with the EU’s General Data Protection Regulation, and you must understand it if you’re collecting the personal information of people residing within China’s borders. “The law is not, however, particularly difficult to understand, nor is it overly complex,” writes Sam Pfeifle in this primer.
Google wins appeal in UK lawsuit alleging privacy harms
Google has won an appeal against a lawsuit at the U.K. Supreme Court over user privacy. In the 2017 lawsuit, plaintiffs alleged Google used a Safari workaround to override iPhone users’ privacy settings in Apple’s browser between 2011 and 2012, TechCrunch reports. The case sought up to $4 billion in damages and aimed to establish a class-action style lawsuit for data protection violations, “despite the lack of a generic class action regime in U.K. law.”
Advertising industry group says Belgium to deem consent framework illegal
The European arm of the Interactive Advertising Bureau (IAB) released a statement last week indicating it expects Belgium’s data protection authority (DPA) to find it in violation of the EU General Data Protection Regulation. IAB Europe’s Transparency and Consent Framework aimed to provide a standardized way to obtain user consent to the processing of personal data or the placement of cookies at every link of the digital advertising chain. But IAB Europe expects the DPA to deem the framework illegal in a draft decision to come within two weeks.
Facebook’s new parent company blocks ad targeting using religion, health, politics
Facebook’s parent company, Meta, has announced that starting on Jan. 19, it will no longer allow advertisers to select terms for ad targeting that are related to sensitive identifiers, Axios reports. The ban will apply to Facebook, Instagram and Messenger and will forbid using traits including ethnicity, political affiliation, religion or sexual orientation. Meta will also prohibit ads targeting health conditions by using terms like “chemotherapy.”
Privacy-focused startups capitalize on Big Tech’s missteps
Reuters reports that major tech companies’ privacy missteps and the regulatory action that often follows are pushing users toward privacy-focused startups. Signal, the encrypted messaging app, saw “unprecedented growth” this year. Search engine DuckDuckGo, which doesn’t track users, said it saw a 38 percent increase in average daily searches. Even so, privacy-focused startups still feel the pain of trying to compete with the tech giants and are “optimistic about the potential impact of regulatory initiatives that have been launched around the world,” the report states.
Companies report revenue losses following Apple privacy update
Reuters reports on the impact Apple’s privacy update in April has had on social media platforms. The update allowed users to opt out of advertiser tracking. Facebook has said the change will crush small businesses, which would no longer be able to find local customers to target with ads cost-effectively. Meta Platforms, Snap, and Peloton Interactive said they have missed revenue targets since the change.