Event Report: Privacy and Civil Liberties Oversight Board

By Katelyn Anders

On November 12, 2014, the Privacy and Civil Liberties Oversight Board (PCLOB) held a public meeting with four panel discussions.  The invited panelists and board members covered the topic of “defining privacy.”  In the wake of the Snowden leaks, department store data breaches, and changes in cell phone data tracking practices, the event brought together professionals from a wide range of fields.  Privacy and technology melded easily as topics, but one question emerged that highlights the complexity of their tandem relationship:  how does technology affect our view and conceptualization of privacy?

Throughout the event, the panels kept gravitating toward this question, as the line between technology and privacy has become blurred at best.

Defining Privacy

One of the biggest challenges that all disciplines face is carving out a clear definition of privacy.  This has become a two-fold task: it is not just a matter of creating a definition that people can agree on, but also of deciding what type of definition it should be, one that defines privacy by what it includes or one that clarifies itself by what it excludes.  Most people see privacy as an inherent right, citing the Constitution or various human rights declarations.  This was not the universal view on the first panel, however.  Paul Rosenzweig of Red Branch Consulting PLLC was the only panelist who discussed privacy from a social standpoint, touching on its intangible value.  He said, “Privacy is invaluable because it advances other values.”  He illustrated this by explaining that privacy can foster democracy and honesty (for example, in the voting process).  He also presented a balanced picture by reminding us that the idea of privacy involves shame as well.  His view was the most abstract, as he asked the audience to probe deeper and uncover what other values are being protected when we feel that our privacy is under siege.

Dan Solove, George Washington University Law School, explained his stance on defining privacy by pushing back on a common argument in any privacy discussion: that people shouldn’t worry if they have nothing to hide.  Privacy, he argued, is not that narrow.  It now speaks to an entire realm of cyberspace and its related components, encompassing securing data, keeping accurate records, and ensuring the responsible use of that data.  On a larger scale, privacy is a public interest, not just an individual right, because it does not affect only the individual; it helps to shape society as well.  Alvaro Bedoya connected the dots laid by Solove by adding that in today’s world, privacy is about the act of taking, not sharing.  Whichever viewpoint was presented, the panelists agreed that defining privacy also requires us to create frameworks and limitations for data collection and use.

Privacy Versus Technology

As technology advances, the world is starting to feel like one giant outlet, able to plug into anything and connect to a limitless space of information.  Ironically, this evolution has not produced a corresponding gain in privacy.  Instead, new technology has the potential to act as a master key that can defeat most privacy measures.  At times, “privacy” and “technology” may feel like synonyms, but as technology continues its upward trajectory they function more like antonyms.

Advancing technology allows us to collect data in volumes that would not otherwise have been possible, and that capability cuts both ways.  Privacy and technology form the perfect metaphoric double-edged sword: we can now use metadata for a public service, such as finding criminals, but metadata also opens a Pandora’s box in the privacy world.  In this sense, technology can be thought of as a bridge that carries us past what most people are comfortable sharing about their lives (echoing Bedoya’s idea of “taking vs. sharing”).  If this is going to be part of our new norm, Annie Anton, Georgia Tech, encouraged a push for stronger encryption and wider use of de-identification, especially if government agencies are still being told to “collect now and sort later.”  The debate on privacy is now a question of “privacy in practice.”

The Mosaic Effect & Transparency

The Mosaic Effect was mentioned multiple times during the panel discussions.  Ed Felten, Princeton University, explained that it occurs when “seemingly unrelated points come together to portray a vivid picture.”  The collection of metadata makes this phenomenon a reality.  Many of the panelists also discussed transparency.  It was generally agreed that transparency should be enhanced in both the public and private sectors in order to build citizen and consumer trust.  Michael Hintze of Microsoft noted that a lack of trust in the US government has caused a shift for US companies as well.

The panel of government privacy officials offered insight into how certain agencies are addressing privacy concerns.  Alex Joel, Office of the Director of National Intelligence, reminded attendees of a simple yet imperative fact:  the intelligence community was not built for transparency.  Its purpose is exactly the opposite; there is a reason why we must protect secrets.  To function effectively, the government must be able to identify information flows and where they originate.

The last panel covered the private sector.  The panelists varied in their support of the Fair Information Practice Principles (FIPPs).  Harley Geiger of the Center for Democracy & Technology said that CDT considers the FIPPs a very important framework, urging that we cannot pick and choose which ones to follow.  Fred Cate, Indiana University School of Law, argued that the FIPPs lead us in the wrong direction: we must rethink why data is being collected in the first place; otherwise, why collect it at all?  Transparency acts as a system of checks and balances in the ongoing battle over privacy.  However, we are still trying to balance the privacy of the individual against the greater societal good while also exploring the gray area of “taking vs. sharing” information.

This discussion will be an ongoing one, changing just as rapidly as technology itself.  This piece will be expanded into a paper addressing the same overarching question of how technology affects our notion of privacy, using Omer Tene and Jules Polonetsky’s A Theory of Creepy:  Technology, Privacy and Shifting Social Norms to examine the current debate at a more in-depth, interdisciplinary level.

Katelyn Anders is the Coordinator for the Cyber Security Policy and Research Institute (CSPRI) at the George Washington University.  She holds a BA in Sociology from Christopher Newport University and an MPA from American University.  She is interested in the intersection between the social sciences and cybersecurity, focusing on how the cultural values of social media affect views on privacy and security.