In 2010, Mark Zuckerberg famously said that "the age of privacy is over" (Kirkpatrick, 2010). He's walked back that statement in the ensuing years. However, there is a sense that, with the proliferation of social media and the extent to which we share information online, we've evolved into a society that doesn't value privacy as much as past generations did.
At the same time, we also proclaim that privacy is important and should be protected. Regulators are looking to privacy legislation such as the European Union's General Data Protection Regulation (GDPR) or Canada's proposed Bill C-11, the Digital Charter Implementation Act, as a means of governing the digital sphere. Much of the focus on enhanced privacy regulation is being driven by concerns over the vastly increased gathering and processing of data to fuel artificial intelligence. This level of data use is expected to grow exponentially as we move towards an Internet of Things and deploy AI across a wide range of spheres.
When it comes to our data, our words and actions seem out of sync. Do we actually value privacy?
Published in 2010, Helen Nissenbaum's Privacy in Context: Technology, Policy, and the Integrity of Social Life attempts to address this issue by presenting a case for privacy that breaks with the traditional view of access and control. Instead, she lays out the concept of contextual integrity, a way of thinking about privacy that involves social contexts and the norms which govern appropriate information flow within those contexts. Nissenbaum explains that our sense of privacy is violated when there is a break with these normative values.
The book is presented in three parts. The first part focuses on technology and maps out various socio-technical systems. A wide range of technologies are touched on, from CCTV to RFID to online monitoring to sensors. Nissenbaum connects these surveillance technologies that gather information to databases that have the capacity to aggregate and store vast amounts of information. Aggregated data can then be amplified or widely shared via the internet and social media, both of which also play a further role in surveillance. One might imagine this process as an iterative loop, rather than a linear process. This system of data gathering, aggregation and amplification reshapes our relationship with information in ways that challenge privacy.
Part two turns to privacy, first establishing why privacy is important, not only for individuals but also for society, and how it intersects with other values such as autonomy, justice and equality. Nissenbaum then explains why the dominant mode of thinking about privacy, in terms of accessing information or controlling access to information, is insufficient: it applies a binary filter that lacks finer-grained socio-cultural detail. Instead, she seeks to find "the sources for a right to privacy in the framework of moral and political values" (Nissenbaum, 2010, p. 72). Thus, while policy makers spend time defining and arguing over the details of access and control, Nissenbaum feels we are missing the bigger point: we do not need to choose between access and control, because there is a place for both (Nissenbaum, 2010, p. 147).
Finally, in part three, Nissenbaum unpacks the framework of contextual integrity. The framework uses the "key parameters of context, actors, attributes and transmission principles" to assess whether or not "context-relative informational norms" have been violated (Nissenbaum, 2010, pp. 148-149). We have a right to live in a world where a "reasonable expectation of privacy" is upheld and expectations around the flow of personal information are met (Nissenbaum, 2010, p. 233). These expectations are governed by a set of "context-relative informational norms", tying privacy to a situation (Nissenbaum, 2010, p. 232). This process is dynamic, evolving and negotiated over time rather than static and fixed. The theory by itself seems abstract, but concrete examples help ground the concepts in familiar territory.
For example, in a health care scenario, let's say we are experiencing a health issue (context) and we pay a visit to our doctor (actor); we have expectations around that interaction with respect to how information will be communicated, stored or shared. Pre-COVID, this may have been an in-person visit, but now it may take place online.
Nissenbaum would have us examine the attributes that are affected by this change. For example, are new actors introduced into the relationship, such as the managers of the technology used to facilitate the online transaction? This can broaden the circle of who may have access to information. Is new information being gathered as a result of the move to an online platform? Online visits may gather metadata about the patient's location, the device used to connect, the length of the visit and other details that would not necessarily form part of an in-person visit. All of this may be stored, aggregated and potentially shared with additional new actors, for purposes beyond resolving the health issue at hand, which may signal a change of context.
If these "new practices generate changes in actors, attributes or transmission principles, the practice is flagged" as a potential violation (Nissenbaum, 2010, p. 150). From there, the situation would need to be negotiated against existing societal norms (expectations) to assess whether society will change its thinking given the new context. Conversely, we may feel that traditional norms should prevail. In the case of virtual care, the wider context of a global pandemic has shaped societal norms. This kind of social background information is pertinent to the contextual analysis and negotiation process.
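One rough way to picture the flagging step (a hypothetical sketch for illustration, not Nissenbaum's own formalism, and all field names are my own): describe an information flow by its contextual-integrity parameters, then compare a new practice against the entrenched norm. Any parameter that deviates flags the practice for further scrutiny; whether it is an actual violation still requires the normative negotiation described above, which code cannot capture.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """An information flow described by contextual-integrity parameters.
    Field names are illustrative, loosely following Nissenbaum's
    context / actors / attributes / transmission-principle vocabulary."""
    context: str
    sender: str
    recipient: str
    attributes: frozenset
    transmission_principle: str

def flag_changes(norm: Flow, practice: Flow) -> list:
    """Return the parameters where a new practice departs from the
    entrenched norm. A non-empty result flags only a *potential*
    violation, to be negotiated against societal expectations."""
    diffs = []
    for field in ("context", "sender", "recipient",
                  "attributes", "transmission_principle"):
        if getattr(norm, field) != getattr(practice, field):
            diffs.append(field)
    return diffs

# Entrenched norm: an in-person visit, confidential, limited attributes.
norm = Flow("health care", "patient", "doctor",
            frozenset({"symptoms"}), "confidentiality")

# New practice: a video visit routed through a platform operator that
# also collects metadata (the hypothetical scenario from the text).
telehealth = Flow("health care", "patient", "platform operator",
                  frozenset({"symptoms", "location", "device metadata"}),
                  "shared with third parties")

print(flag_changes(norm, telehealth))
# recipient, attributes and transmission principle are flagged for review
```

The comparison is deliberately crude: real contextual analysis weighs the flagged changes against the values the context serves, rather than treating any difference as a breach.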
Nissenbaum's work resonates with me. Reading it a decade after it was written, I'm curious about how we might progress in applying contextual integrity. When I think about the GDPR after reading this book, it feels like a blunt instrument being wielded to manage the flow of information access and control; Nissenbaum calls this an omnibus approach. Contextual integrity presents a more nuanced, layered evaluation of privacy, however, that seems hard to implement, practically speaking. Perhaps this is why contextual integrity doesn't seem to have gained much traction beyond academic circles, yet.
As the COVID-19 pandemic rolls on and our reliance on technology in every sphere increases, including the growing deployment of artificial intelligence enabled technology, we are being pushed to address these tough questions. When it comes to privacy, what values do we wish to protect and uphold in a particular situation, and what will we let go of to embrace a new set of normative informational behaviours? Nissenbaum's theory of contextual integrity gives us a framework to do the work. However, the process will take time and, as Nissenbaum acknowledges, the process may itself evolve in practice. This may be the kind of work that is measured in decades, with results not clearly understood until we look back on it.
By Katrina Ingram, CEO, Ethically Aligned AI
Ethically Aligned AI is a social enterprise aimed at helping organizations make better choices about designing and deploying technology. Find out more at ethicallyalignedai.com. © 2021 Ethically Aligned AI Inc. All rights reserved.
#DataPrivacy #SurveillanceCapitalism #COVID
Kirkpatrick, M. (2010, January 10). Facebook's Zuckerberg Says The Age of Privacy Is Over. The New York Times. Retrieved from https://archive.nytimes.com/www.nytimes.com/external/readwriteweb/2010/01/10/10readwriteweb-facebooks-zuckerberg-says-the-age-of-privac-82963.html
Nissenbaum, H. (2010). Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford, CA: Stanford University Press.