Who is Helen Nissenbaum?

Written by Osano Staff | July 30, 2021

Here at Osano, we name our major product feature releases after privacy heroes. Here's why our July feature release, Text Customization for Consent Manager, is named after Helen Nissenbaum. 

One of the reasons it can be difficult for people to understand their privacy rights or for companies to make intelligent decisions is that privacy isn't easily defined. That's why Helen Nissenbaum's contribution to privacy theory is so significant. 

Nissenbaum developed one of the most critical theoretical underpinnings for regulating data uses. While the U.S. framework currently relies on giving users "notice" and "choice" when companies seek to collect their data, Nissenbaum's work says that instead, the focus should be on "contextual integrity." 

Using contextual integrity means companies should evaluate whether the personal data they collect will be used in ways beyond the initial purpose for which it was collected. Defining a customer's "reasonable" expectation of how their data will be treated can get slippery. Say you collect data from a customer to provide them a service, and then you want to sell that data to a company serving personalized ads. Contextual integrity theory says you need to ask whether the customer would reasonably expect the data to be used for this purpose in light of the overall context in which they provided it. Disclosing the practice in a privacy policy may help, but it may not be enough on its own: we know customers don't generally read privacy policies and wouldn't likely expect their data to be used this way.

A real-life example might be helpful here. Think about Facebook's News Feed feature, which debuted in 2006 and "fed" users real-time updates on what their friends were posting: whether they'd broken up with their boyfriends, whether they'd added new connections, and other notifications. Not everyone liked the intrusion. One million people joined a "Facebook News Feed protest group."

The crux of the problem was that users didn't expect Facebook to peer so closely into their lives and display what it saw to their online connections. Offline, we understand how much privacy we expect based on context. If I close the door to my office, everyone understands I want space and privacy. If I'm sitting at a cafe, close to a friend and chatting quietly, no one would assume it'd be fine to stand nearby and cock an ear our way. But online, that context is difficult to perceive. So when Facebook's News Feed displays broadly that I broke up with my boyfriend, it suddenly feels invasive. Just because I signed up for Facebook and clicked "yes" to whatever I had to so I could join doesn't mean I'm expecting my personal details to be treated like a news scroll on cable TV.

In their article, "Contextual gaps: Privacy issues on Facebook," Gordon Hull, Celine Latulipe and Heather Richter Lipford write, "Offline, privacy is mediated by highly granular social contexts. Online contexts, including social networking sites, lack much of this granularity. These contextual gaps are at the root of many of the sites' privacy issues."

And that's what Nissenbaum would have advised Facebook to avoid out of respect for consumer privacy. (In the end, Facebook ignored the protests and the feature lives on.) In an age when companies often seem more interested in dodging liability than gaining customer trust, Nissenbaum's theory asks companies to do better. A contextual integrity framework looks at the data use holistically. It requires data collectors to take an ethical approach that considers societal norms, which will always evolve. It asks data collectors to look at the context of the situation to determine what that "norm" is. Is the data subject in this context a patient? An online shopper? An existing customer? Is the data collector an online retailer? A doctor? What kind of information is being collected, and how will it be used? Once you take all of those factors into account, you can start to apply what the societal "norm" would be in the context at hand.

If Facebook looked at its plans through a contextual integrity lens, it likely would have decided the News Feed didn't align with users' expectations in joining the site and forking over their data. 
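Contextual integrity is an ethical framework, not an algorithm, but the factors it asks about can be sketched as a checklist a product team might walk through before a new data use ships. The sketch below is purely illustrative: the roles, information types, and "expected purposes" table are hypothetical stand-ins for the societal norms a real evaluation would have to investigate.

```python
# Illustrative sketch only: contextual integrity is an ethical framework,
# not an algorithm. The contexts, roles, and norms below are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class DataFlow:
    context: str         # e.g. "healthcare", "social_networking"
    subject_role: str    # e.g. "patient", "user"
    collector_role: str  # e.g. "doctor", "platform"
    info_type: str       # e.g. "diagnosis", "relationship_status"
    purpose: str         # how the data will actually be used


# Hypothetical norms: purposes a data subject would reasonably expect
# for a given (context, information type) pair.
EXPECTED_PURPOSES = {
    ("healthcare", "diagnosis"): {"treatment", "billing"},
    ("social_networking", "relationship_status"): {"profile_display"},
}


def respects_contextual_integrity(flow: DataFlow) -> bool:
    """True if the intended purpose matches the norms of the context."""
    norms = EXPECTED_PURPOSES.get((flow.context, flow.info_type), set())
    return flow.purpose in norms


# The News Feed case: broadcasting a breakup to a user's whole network
# was not an expected use of relationship-status data.
news_feed = DataFlow(
    context="social_networking",
    subject_role="user",
    collector_role="platform",
    info_type="relationship_status",
    purpose="broadcast_to_network",
)
print(respects_contextual_integrity(news_feed))  # False
```

The point of the structure is that "consent" never appears as a field: the check turns on whether the purpose fits the context, not on whether a user once clicked "yes."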

In addition to her ethical contributions to the field, Nissenbaum, a professor of information science at Cornell Tech in New York City, is director of the Digital Life Initiative. Launched in 2017, it aims to study the "societal tensions arising from existing and emerging digital technologies." It takes into consideration ethics, policy, politics and quality of life.

Among other accolades, Nissenbaum received the International Association for Computing and Philosophy's 2021 Covey Award for her contributions to computing, ethics and philosophy. 

Her books include Obfuscation: A User's Guide to Privacy and Protest; Values at Play in Digital Games; and Privacy in Context: Technology, Policy and the Integrity of Social Life.