July 30, 2021
Here at Osano, we name our major product feature releases after privacy heroes. Here's why our July feature release, Text Customization for Consent Manager, is named after Helen Nissenbaum.
One of the reasons it can be difficult for people to understand their privacy rights or for companies to make intelligent decisions is that privacy isn't easily defined. That's why Helen Nissenbaum's contribution to privacy theory is so significant.
Nissenbaum developed one of the most critical theoretical underpinnings for regulating data uses. While the U.S. framework currently relies on giving users "notice" and "choice" when companies seek to collect their data, Nissenbaum's work says that instead, the focus should be on "contextual integrity."
A real-life example might be helpful here. Think about Facebook's News Feed feature, which debuted in 2006 and "fed" users real-time updates on what their friends were posting: whether they'd broken up with their boyfriends, added any new connections, and so on. Not everyone liked the intrusion. One million people joined a "Facebook News Feed protest group."
The crux of the problem was that users didn't expect Facebook to peer so closely into their lives and display what it saw to their online connections. Offline, we understand how much privacy to expect based on context. If I close the door to my office, everyone understands I want space and privacy. If I'm sitting at a cafe, close to a friend and chatting quietly, no one would assume it'd be fine to stand nearby and cock an ear our way. But online, that context is difficult to perceive. So when Facebook's News Feed broadcasts that I broke up with my boyfriend, it suddenly feels invasive. Just because I signed up for Facebook and clicked "yes" to whatever I had to so I could join doesn't mean I expect my personal details to be treated like a news scroll on cable TV.
In their article, “Contextual gaps: Privacy issues on Facebook,” Gordon Hull, Celine Latulipe and Heather Richter Lipford write, "Offline, privacy is mediated by highly granular social contexts. Online contexts, including social networking sites, lack much of this granularity. These contextual gaps are at the root of many of the sites' privacy issues."
And that's what Nissenbaum would have advised Facebook to avoid out of respect for consumer privacy. (In the end, Facebook ignored the protests, and the feature lives on.) In an age when companies often seem more interested in dodging liability than gaining customer trust, Nissenbaum's theory asks companies to do better. A contextual integrity framework looks at data use holistically. It requires data collectors to take an ethical approach that considers societal norms, which will always evolve. It asks data collectors to look at the context of the situation to determine what that "norm" is. Is the data subject in this context a patient? An online shopper? An existing customer? Is the data collector an online retailer? A doctor? What kind of information is being collected, and how will it be used? Once you take all of those factors into account, you can start to apply what the societal "norm" would be in the context at hand.
If Facebook looked at its plans through a contextual integrity lens, it likely would have decided the News Feed didn't align with users' expectations in joining the site and forking over their data.
In addition to her ethical contributions to the field, Nissenbaum, a professor of information science at Cornell Tech in New York City, is also director of the Digital Life Initiative. Launched in 2017, it aims to study the "societal tensions arising from existing and emerging digital technologies," taking into consideration ethics, policy, politics and quality of life.
Among other accolades, Nissenbaum received the International Association for Computing and Philosophy's 2021 Covey Award for her contributions to computing, ethics and philosophy.
Her books include Obfuscation: A User's Guide to Privacy and Protest; Values at Play in Digital Games; and Privacy in Context: Technology, Policy and the Integrity of Social Life.
The Osano staff is a diverse team of free thinkers who enjoy working as part of a distributed team with the common goal of working to make a more transparent internet. Occasionally, the team writes under the pen name of our mascot, “Penny, the Privacy Pro.”