‘The issue formerly known as privacy’

Control over data profiles is about power, not privacy

Editor's note: This is the eighth installment of the Living With Data series exploring how our online data is tracked, collected and used. Do you have questions about how your personal data is being used? Curious to learn more about your daily encounters with algorithms? Email the Decoder at thedecoder@aljazeera.net or submit your question via the form here. Screen shots and links are helpful clues! 

"For someone with an interest in privacy, there's certainly a lot about you online."

Someone once said that to me, and I laughed because I never said my research was about privacy. It’s a common assumption that because I’m writing about data and algorithms, I'm working on privacy.

I often share personal details in my stories to get my point across: when Netflix thinks I have children, when fitness trackers don't match my personal fitness needs, or when Facebook asks me about my fiancé. I understand the apparent dissonance of sharing these details in an article when the concern seems to be privacy. I rarely use the word in my work or when introducing myself, yet people still categorize the concerns I raise as falling under the umbrella term "privacy."

As more of our lives are made legible as data and more of our experiences are processed by algorithms, I think privacy is an inexact term and doesn't fully encapsulate the range of our concerns. So if not "privacy," what could we call our concerns over data instead?

Privacy means a lot of things in a lot of contexts. For the most part, it comes out of a legal heritage. It's everything from Samuel Warren and Louis Brandeis' 1890 concept of the "right to be let alone," to the ability to act autonomously, to control over the personal space of the home or the body, to control over information in different contexts. In the Information Age and now in the realm of Big Data, it often concerns personally identifiable or sensitive information.

Aside from these legal contexts, I think the concept of privacy makes more sense when applied to relationships among humans than as a description of the concerns that surface in sociotechnical systems.

Julia Angwin agrees that the term "privacy" isn't cutting it anymore. First with The Wall Street Journal's What They Know series and now at ProPublica, she has investigated the business and technology of data and the Internet.

Her recent book "Dragnet Nation" recounts the extreme steps she took to avoid being tracked. She made deliberate choices to keep her traces out of databases and, in doing so, illustrated the futility of the exercise for ordinary people. As she describes, putting her cellphone in a signal-blocking security case isn't exactly practical.

At a recent conference, Angwin reframed the subject as "the issue formerly known as privacy." Ultimately, the idea that one could completely block one's unique identity from the system is absurd. She realized that privacy concerns are no longer about keeping information from others, a Sisyphean task in the Digital Age; they are about lacking assurances that your information won't be used against you.

Angwin compared it to the way she thinks about getting in a car. Cars are dangerous technologies, yet she feels confident driving one because she knows they have to meet certain safety standards, and if they don't meet those standards, there is a process for holding manufacturers accountable for their safety failures.

As consumers, we have few means of holding data platforms and brokers accountable for the uses of our data. Privacy policies protect against disclosures of data to other parties and address the security of these systems against bad actors, but they don’t address how the data is being used by those with legitimate access.

The challenge we face is the conflation of these concerns. We talk about the social practice of sharing check-ins and Instagram photos with other people in the same breath as worrying about how Google reads our email to target ads. That conflation leads to paradoxical headlines about why "We want privacy but can't stop sharing." Everything becomes a privacy issue. We're only now starting to untangle these concerns from one another.

The questions are still often legal: What redress do we have as consumers when our data is used in discriminatory or unsavory ways?

Angwin admits that she doesn't have an alternative name for this set of concerns, and I have struggled with this for a long time too. I'm not as concerned with protecting information about myself from individuals because I trust that they will be able to understand context and make empathetic judgments about my story. I am in a position of power to tell my own story, and sharing these details in context grounds and personalizes my experience for others.

I'm more concerned with what Big Data algorithms infer about me by correlating those data points, and how those inferences shape and alter my digitally mediated experience. As I wrote when Facebook started asking me about my relationships, I took issue with the fact that an algorithm was trying to understand me, make sense of me and put me into its predetermined boxes. As Michael Keller and Josh Neufeld's graphic novella describes, the concern is about who gets to tell our story for us.

This distinction between human privacy and machine privacy is one of control and autonomy.

The concern comes down to worrying about how my data profile acts autonomously on my behalf even though I don't have a complete understanding of what that profile looks like or how it is being used. I don't know which version of my data doppelganger is being used in any given context, or what information is falsely inferred or assumed from my behavior elsewhere.

I'm less worried about what people think of me and more concerned about assessments and judgments made behind the scenes, processed by algorithms and potentially affecting my experience online. That's why I get frustrated when writers conflate hand-curated personal dossiers on journalists with the automated marketing profiles assembled by data brokers. I expect that humans will appropriately use what I've put out there and that if they use it in a social context, I can respond in kind. I trust human judgment. What I don't trust right now is machine judgment, particularly machine judgment that I have no way of knowing is going on. These concerns are different, and "privacy" is inadequate to describe their subtleties.

It's about autonomy and accountability for practices that make use of our data.

