For most of my life, even working in privacy, I had never really considered how technology design has the power to influence decisions I make online. In my former job, I hosted a privacy podcast. To prepare for an interview with Professor Woodrow Hartzog, I read his book, "Privacy's Blueprint: The Battle to Control the Design of New Technologies." Its central message is: "Our consent is pre-ordained." And it hit me between the eyes.
At a talk on the book, Hartzog explained, "Every single design decision is meant to affect the world in some particular way." Sometimes those design choices are simple: A friendly font and a clever message, for example, signal that the text I'm reading is harmless, playful even. Being asked to click a button reading "Are you cool with our data collection?😍 Click yasssss to continue" seems a far less severe decision than one reading "Clicking 'yes' means we sell your personal data to the highest bidder."
Hartzog calls the way technologies convince us that we're making autonomous choices about our privacy a "DDOS attack on our brains." Making those real-time decisions constantly is overwhelming and time-consuming. How many times are we asked, "Do you want this app to know your location all the time? Just when you're using the app? Never?"
While such questions ostensibly offer users control over what's happening with their data, they're often designed in a way that encourages us to say "yes." That's what Hartzog means when he says our consent is pre-ordained. No one has the time or energy to dissect the implications of agreeing to a company's terms and conditions when they're just trying to get to the service. If I download the navigation app Waze for driving directions, whatever I have to click to start using the service is what I'm going to click. "Take my data. Whatever. I'm late for the doctor, and I'll do whatever you're asking me to do to get there."
We also see these tactics in "dark patterns," subtle techniques that aim to influence users online. An example might be an ad that says, "Do you want a discount on Ted's Tacos?" And you love Ted's Tacos. That sweet potato burrito is gold! You've got two options: a button that says "yes," which collects some data from you when you click it, and a button that says, "no thanks, I'd rather spend a ton of money," which lets you go on your way.
It's a subtle, playful tug at most consumers' attempts to be frugal, but it aims to influence. Certainly, there are far more nefarious examples. We all encounter these choices every day as participants in the digital economy.
As Hartzog says, "Design should be loyal to the user. It shouldn't elevate the interests of the platform in unreasonable ways over the interest of the data subject."
Loyalty. Now that's a novel idea.
Hartzog's research is especially relevant after California's move to prohibit dark patterns under the California Consumer Privacy Act. Could this be the beginning of tech design that's loyal to the user? Hartzog and I, at the very least, sure hope so.
Enjoy reading the news we've rounded up for you, and I'll see you next week!
1. Dark patterns now illegal under California privacy law
On March 15, California Attorney General Xavier Becerra announced new regulations under California's Consumer Privacy Act that ban the use of dark patterns. The rules aim to prevent tricks websites or apps use to encourage users to make decisions online that they wouldn't normally. The California Attorney General's office describes dark patterns as "confusing language or unnecessary steps such as forced clicking or scrolling through multiple screens or listening to why you shouldn't opt-out of their data sale." The tactics are "more widespread than you'd imagine," Gizmodo reports.
2. Lawmaker reintroduces federal privacy bill
U.S. Rep. Suzan DelBene, D-Wash., has reintroduced a federal privacy bill, Bank Info Security reports. The Information Transparency and Personal Data Control Act does not include a private right of action as a consumer recourse. That could give it bipartisan strength given Republicans' aversion to such a provision in previous bills. It would also allocate $350 million to the Federal Trade Commission to hire 500 additional employees to focus on privacy and security enforcement. DelBene has introduced versions of the bill three times previously.
3. Chinese tech companies testing system to bypass Apple privacy rules
Tech giants in China are working to bypass Apple's new privacy rules and track users without their consent, 9To5Mac reports. The Financial Times broke the story, noting the China Advertising Association has launched a new way to track iPhone users called CAID. The association referred to CAID in an app developers' guide, and TikTok's parent company, ByteDance, is currently testing the system, the report states.
4. Digital currency on the rise, privacy concerns linger
Digital currency systems are on the rise globally, including for government functions, but solutions to their potential privacy impacts haven't kept pace. When distributing stimulus checks to help Americans through the COVID-19 fallout, U.S. government authorities considered using "digital dollars." But Rohan Grey, a law professor at Willamette University, asked, "... what if the end result is a surveilled bank account system? Suddenly now you're talking about building a monetary system where every transaction could be stored as data and create a robust social graph of the United States."
5. Negotiations on a new Privacy Shield could take years
For companies anxiously awaiting Privacy Shield's replacement, some news: It doesn't look good. On March 9, The Wall Street Journal reported that negotiations could take "years rather than months." That puts businesses needing to transfer data from the EU to the U.S. in a pinch. Alternate mechanisms exist, but they can be costly and time-consuming. This Osano report discusses the current state of affairs and what to expect.
6. Opinion: Apple's privacy labels aren't a sure bet
In an opinion piece for Computerworld, Jonny Evans writes that despite Apple's push to limit apps' ability to track users, the company "doesn't apply the same rules to itself." Apple's Privacy Nutrition Labels aim to help users understand how apps use their data. Still, Apple's site notes in small print at the bottom, "This information has not been verified by Apple," meaning the company "isn't deeply policing what developers claim."