
Privacy Insider: September 7, 2021

  • by Angelique Carson
  • last updated September 7, 2021
  • 5 min read

The big news since I wrote you last is the WhatsApp fine, of course. It's always very sexy news when a company gets whacked with a massive fine, and it's especially newsy given widespread criticism over the enforcement of the EU General Data Protection Regulation since it came into force in 2018. 

But I want to talk about something I find even sexier: The power of a collective movement. For those of us working in privacy and data protection, it can sometimes feel lonely and unsatisfying, I think. Tech giants and start-ups will always move faster than the emerging laws meant to keep the "move-fast-and-break-things" ethos in check. You probably don't spend as much time on Twitter as I do, and that's a good thing. But it often feels like those of us who care about data privacy are screaming into a void. Facebook finally took a hit, but it's not the only one making privacy missteps. It's just the one paying for it at the moment. 

That's why this week's news that Apple is pausing its plans to roll out a feature aimed at detecting child sexual abuse material (CSAM) struck me. To recap: Apple planned to do on-device scanning of users' iCloud images to hunt for CSAM. Bear with me for the next three sentences while I explain how it'd work, if you're unfamiliar. Apple's iOS would compare a database of known child sexual abuse material with a user's photos. The system would use what's called a "NeuralHash," a process that analyzes an image and assigns it a unique number. Photos would be flagged as a "match" if the scan identified one of the specific numbers assigned to known child sexual abuse material. While many platforms already scan their databases for CSAM, Apple planned to do the scan on the device itself. And that's what had critics incredulous.
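For the technically curious, the hash-matching idea above can be sketched in a few lines. This is only a toy illustration: Apple's actual NeuralHash is a perceptual hash produced by a neural network, so visually similar images map to the same value, whereas the cryptographic hash used here only matches byte-identical files. The database entries and function names below are made up for the example.

```python
import hashlib

# Illustrative placeholder entries, not real values from any database.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Toy stand-in for NeuralHash: derive a fixed-length fingerprint.

    The real system assigns perceptually similar images the same number;
    SHA-256, used here for simplicity, does not have that property.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    # A photo is flagged only when its fingerprint appears in the database.
    return image_fingerprint(image_bytes) in KNOWN_HASHES

print(is_flagged(b"ordinary holiday photo"))  # prints False
```

The privacy debate is not about this matching step, which many cloud platforms already perform server-side, but about running it on the user's own device.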

You're reaching into my phone? 

The backlash was severe and immediate. World-renowned privacy experts took to Twitter to condemn the plan. And consumer privacy groups, including the Electronic Frontier Foundation (EFF), launched petitions to stop it. Last week, EFF announced it had gathered 25,000 signatures. A coalition of more than 90 organizations joined EFF, urging Apple to cancel its plans, citing privacy and security concerns. 

And a remarkable thing happened: Apple listened. It will now halt the rollout of its scanning system to gather input from critics and others and revise the game plan. 

I live in Washington, D.C. There's a protest almost daily about one cause or another, some of them smaller than others. I've marched with thousands of people demanding change on whatever the issue may be. We make signs, we walk for hours, we come home exhausted. And I have to be honest: Rarely does anyone listen. It can feel defeating and futile, like you're never going to have enough power to influence the people with money and power making decisions behind closed doors. 


That's not to say protest never works. Hey, I learned about the Civil Rights Movement, too. I watched Selma. The fact that we're still talking about those marches is indicative of their significance. 

But these little wins — that Apple heard the critics' chorus and pivoted — should inspire. Maybe you're the only one at your company who's evangelizing privacy. Maybe you're the only person in the boardroom who raises your hand to say, "I don't think this is a good idea," when a product rollout could impact users. But there are thousands more like you out in the wild; many of them likely signed the petition against Apple's CSAM plans.

Not every example is a headline-maker. But Apple's shift reminds those of us working on user privacy rights that we're fighting the good fight, even if we don't have a loudspeaker or 25,000 signatures behind us. Protests matter. So keep speaking up.

Enjoy reading, and I'll see you next week! 


UK data privacy chief wants tech firms to ditch cookie banners 

The U.K. privacy chief has asked G7 authorities to pressure tech giants to create alternative consent mechanisms for users other than cookie banners, which became pervasive after the EU's sweeping privacy law took effect in 2018. Information Commissioner Elizabeth Denham met with her G7 peers this week and asked that they work together to develop "privacy-oriented solutions," citing user fatigue in dealing with cookie pop-ups at each website. Critics said the ICO should do its job and enforce the existing law, BBC News reports.
Read Story 

Apple pauses plans to search iOS devices for child abuse images

Since announcing its plans to combat child exploitation online last month, Apple has faced significant criticism. The company said it would begin searching iPhones and iCloud for images identified as child abuse. Since then, privacy advocates have launched petitions and expressed outrage on Twitter, calling the policy a threat to democracy itself. Based on that feedback, Apple now says it will pause those plans to take public input and improve the features before implementing them, CNN reports. 
Read Story

Irish regulator fines WhatsApp 225 million euros


Last week, the Irish Data Protection Authority fined Facebook's messaging service, WhatsApp, nearly $270 million, The New York Times reports. The fine, enforced under the EU's sweeping privacy law, cites WhatsApp's failure to comply with transparency provisions. The Irish regulator said it didn't disclose clearly to users how their data would be shared and used by Facebook. 
Read Story

App faces consolidated lawsuit for alleged unlawful data sharing

Reuters reports that fertility-tracking app Flo Health faces a consolidated class-action lawsuit accusing it of sharing users' sensitive health information with third parties without users' knowledge. The lawsuit also names as defendants the third parties — Facebook, Google and two data analytics companies — alleged to be involved. The U.S. Federal Trade Commission announced a settlement with Flo Health over the unlawful data sharing allegations earlier this year. 
Read Story

What should a privacy law look like in the US? 

In this primer on U.S. privacy law (or the lack thereof), The New York Times reports on the data economy underpinning technology products and services, often invisible to the consumer. While the U.S. waits on Congress to cobble together a federal privacy law, states like Colorado, Virginia and California are passing their own. The privacy experts who spoke with the NYT said any federal law should include, at a minimum, rules on data collection and sharing, opt-in consent to data collection and nondiscrimination clauses for consumers who choose to opt out.  
Read Story

Critics: UK plans to 'fix the GDPR' a bad deal for consumers

"The UK has left the European Union. It's left the single market. And now it wants to leave behind the rules that require companies protect your personal data." That's according to this WIRED report on the U.K.'s post-Brexit plans to do away with the EU privacy law's "pointless bureaucracy," "box ticking" and red tape, as the UK privacy regulator said. U.K. plans to pass a revised law to fix the GDPR are still vague, but critics say it's a bad deal for consumers and that the U.K. wants to "have its cake and eat it too."  
Read Story

About The Author · Angelique Carson

Angelique Carson is the Director of Content at Osano, a B-corp privacy platform that makes compliance with privacy laws easy for companies of all sizes. She is a professional writer and editor who has worked in journalism and publishing for more than ten years. Previously Angelique was an editor at the International Association of Privacy Professionals and the host of The Privacy Advisor Podcast. She lives in Washington, D.C., with her puppy Miles.