Stop Saying Privacy Is Dead

Our lives are still rich in personal privacy — and we should fight to keep it that way

Privacy protections are at risk yet again. The Five Eyes security alliance, which spans the governments of the United States, United Kingdom, Australia, Canada, and New Zealand, has issued a “Statement of Principles on Access to Evidence and Encryption,” which suggests that tech companies will face strong opposition if they don’t provide law enforcement with backdoors for encrypted communication. Executives from big tech companies are being hauled in front of Congress to answer for our jeopardized personal data. Facial recognition is running amok. Privacy keeps being assaulted, with seemingly no end in sight.

The landscape looks so bleak that it may feel like every shred of privacy we’ve ever had is gone or is guaranteed to go by the wayside. “Privacy doesn’t exist in a post-Facebook crisis era,” Cambridge Analytica whistleblower Brittany Kaiser declared, suggesting that as a consolation prize we should at least be able to make money selling data currently being taken from us. Media studies professor Ian Bogost says our chances of opting out of surveillance capitalism are so poor that he declares “the age of privacy nihilism is here.” Almost likening privacy invasions to a state of perpetual war, he predicts endless agony: “Everything you have done has been recorded, munged, and spat back at you to benefit sellers, advertisers, and the brokers who service them. It has been for a long time, and it’s not going to stop.”

Privacy fatalism might feel like a digital-age malady, but the underlying sentiment is actually old hat. Back in 1890, Louis Brandeis and Samuel Warren were so worried about early Kodak cameras that they wrote their famous law review article, “The Right to Privacy.” By the 1940s, critics had begun to ask whether privacy was dead, and by the 1950s and ’60s the question was anxiously repeated until it turned into a meme. It’s tragicomic how often folks announce privacy’s time of death and suggest that everything good about it is already lost.

Privacy’s pallbearers have reasons for being so grim, but they’re burdened by the weight of an empty coffin. Privacy may be ailing, but it’s not dead yet.

In our own work, we’ve identified and criticized many different and dangerous ways that industry and government are collecting, using, and sharing our personal information, and this is largely due to infrastructure that has been created for the purpose of exploiting our data. But enough of our privacy still exists that, with dedication, we could both protect and embolden it.

Don’t Be Despondent — Your Privacy Matters

As privacy scholar Josh Fairfield says, while some dismiss privacy concerns by saying they have nothing to hide, we shouldn’t accept that argument from anyone wearing clothes. Or anyone who closes the bathroom door, locks her home or car, or uses password-protected accounts. Or anyone who benefits from rules and norms that protect secrecy and confidentiality, prohibit government overreach, and give us recourse if others intrude upon our seclusion, publicly disclose embarrassing private facts, depict us in a false light, or appropriate our image or likeness. We still trust our doctors. We still bare everything for our intimate partners. It is still hard to pick us out of a crowd and identify us in many different kinds of data sets.

These examples and countless others suggest that, first, we haven’t yet entered an age of complete, ubiquitous, and irreversible transparency. Second, the very question of whether privacy is dead is a poorly worded query that trades upon an inappropriate binary metaphor to create an emotionally powerful rhetorical effect. To ask if privacy died is to wonder whether something that once existed in the world, a felt and shared presence, has gone away and can only be memorialized by pining for its absence. But privacy has never been a single thing, a static object that can be lost.

To view privacy as a thing is to make what philosophers call a “category mistake.” It’s an error in reasoning that uses the wrong language to describe the situation. It’s also an ontological mistake that attributes the wrong properties to what is being described and evaluated. What scholars call “informational privacy,” the ability to influence how our personal information is collected, used, and disclosed, is a verb, not a noun. When you hear someone say “privacy” in this context, you should think “negotiate.” That is, you should think of an action word that emphasizes process.

Informational privacy is a tenuous, revisable, ongoing discussion expressed in debates between individuals, groups, and institutions about how to set and enforce norms. Privacy negotiations are hard and sometimes unfair, because they often don’t involve all the people who should be entitled to take part; facial recognition is flourishing under frameworks created without the input of advocacy groups or underrepresented and vulnerable populations. Representation makes a profound difference in how privacy policies get set, when they get enforced, and when exceptions get made.

Even design itself is a tool of power. Technology companies have weaponized user-experience design, exploiting our cognitive limitations and emotional drives to manipulate us into disclosing sensitive personal information — all while being able to say, in a court of law, that we are voluntarily making these choices and agreeing to explicitly presented user agreements.

To give another example, because slippery slopes are real, simply adding surveillance infrastructure (more cameras, sensors, and software) can tilt the negotiating process in favor of expanding the reach of corporate and government surveillance. We’re the last people to say that, ordinarily, privacy negotiation is conducted in a respectful fashion among equals. Negotiation can be a downright abusive process that leaves too many of us feeling vulnerable, if not paranoid, that our every move is being watched.

Yet even if personal control is being eroded, as long as you are able to create and maintain relationships of trust, privacy does exist. In some U.S. states, you can’t usually record people’s conversations without asking for their permission, and a few states have similar rules for the use of facial recognition.

As long as you have some meaningful say over when you are watched and can exert agency over how your data is processed, you will have some modicum of privacy.

We’re emphasizing the importance of privacy as a series of living negotiations because we’re worried that low expectations will lead to folks prematurely giving up on battles they should be ready to fight. Even if the phrase “privacy is dead” is meant figuratively rather than literally and is used to express justifiable frustration, anger, and sadness about how unevenly power is distributed, it’s still dangerous terminology. As a framing device, it nudges us to abdicate agency. We need to see privacy as having a future worth fighting for.

Privacy considerations should be woven into everything — including appropriate rules for how data is collected, stored, processed, and disseminated, and for how technologies are designed. Privacy law, policy, and norms should continue to revolve around preserving traditional limitations on when we can be watched and what data can be collected — including in cutting-edge contexts involving robotics and artificial intelligence. But they should also evolve to meet new challenges. We need to fight for rules that prevent companies from tricking us into thinking we have control over our data; despite copious toggle switches and privacy policies, few of us understand Facebook’s fatally opaque data ecosystem. And we should also demand fair and accountable automated decisions where appropriate.

Law professor Margot Kaminski even suggests that recent events give us reason to be cautiously optimistic, citing Europe’s General Data Protection Regulation (GDPR), California’s Consumer Privacy Act, and the U.S. Supreme Court ruling in Carpenter v. United States, which extended Fourth Amendment protections to mobile phone location data.

“There is a growing transatlantic consensus emerging on privacy in the digital age,” Kaminski writes. “Sharing data no longer obviates privacy. Privacy protections now increasingly travel with personal information, even if that information is something a company has inferred rather than collected. Both legal systems also increasingly recognize that privacy is, perhaps counterintuitively, deeply linked to transparency: people cannot exert control or request remedies if they do not know where their information is going.

“Perhaps most significantly, both legal regimes now exhibit a growing awareness of how linked privacy is to other well-recognised legal harms such as chilling effects on free expression or discrimination against individuals… The age of free data and diminishing data privacy looks to be rapidly ending.”

Why Obscurity Is So Critical to Privacy

For the time being, we actually enjoy an abundance of privacy in often surprising and overlooked forms, including what we say in public, where we live, and what we throw away. The key is understanding a conceptualization of privacy that we might intuitively recognize but have trouble articulating: We live and flourish in huge patches of obscurity.

Obscurity has a specific meaning within privacy theory. In our definition, it exists on a spectrum and revolves around the accessibility and interpretation of our personal information.

The easier it is to find information, the less obscure it is, and the harder it is to locate information, the more obscure it is. Information only has value, however, if it can be properly interpreted. If you overhear someone saying, “It’s positive,” but don’t know they are talking about a pregnancy test, the information doesn’t reveal anything. So the easier it is to make information comprehensible, the less obscure it is, and vice versa.

In short, when knowledge about us is costly, people are able to enjoy a sense of privacy.

Traditionally, neither individuals nor businesses had the motivation or resources to dig into our private lives. A few people may have known your daily whereabouts, but unless you were the subject of a criminal investigation, no one was likely to follow you around all day. Today, transaction costs for acquiring and aggregating all kinds of information are dropping like crazy, and people are voluntarily disclosing everything from the profound to the banal across social media. Search engines are such powerful information-retrieval tools that people get annoyed if you ask them a question you could have Googled instead. Indeed, the search engine is so good at locating the information you’re looking for that autocomplete can feel like telepathy, uncannily reading your mind.

Even as technology erodes our obscurity, barriers remain. Transaction costs and practical considerations still protect us. You may not realize it, but we all rely on obscurity even now to adjust our risk calculus as we decide how to live our lives. It’s time we better appreciated the obscurity we have, how it is critical to human flourishing, and the forces chipping away at it.

What We Look Like

Yes, your face is one of the most public things about you. But in the language of obscurity, it is still a bit of a mystery, and that’s a good thing.

Over the summer, debates about facial recognition technology heated up with Microsoft president Brad Smith calling for the government to regulate it. While facial recognition technology is far from perfect, social technology researcher Judith Donath predicts that in 10 years, it will play a major role in changing social norms. As Donath sees it, a perfect storm consisting of fear of dangerous people, the desire for frictionless social connection, and consumer demand for convenience will help facial recognition technology get to the point where “when you walk down the street or you sit in a restaurant or you’re at a party, [it] will give you the ability to identify the people around you.”

If technology enables people to immediately identify those around them by face, obscurity will take a dramatic hit. “We will be in a situation where if we had our facial recognition abilities taken away from us, it would be creepy to be out in public with everyone as a total stranger whom you know nothing about,” Donath says. A likely consequence of society being reengineered this way, she suggests, is that an environment of pervasive chill will emerge that is conducive to both mass conformity and paranoia.

We need to take action, Donath says, to prevent facial recognition technology from becoming normalized as a tool that is appropriate to use everywhere and anywhere. An ideal place to start is to oppose schools adopting facial recognition technology in their efforts to improve security.

In Niagara County, New York, the Lockport City School District has spent millions from a state “Smart Schools” grant to roll out facial recognition technology in its safety systems, including as a guard against gun violence. As The Intercept reports, the approach seems doomed to be “inefficient and expensive.” It seems unlikely the system could identify individuals who hadn’t offended before or predict which students might be at risk of losing control.

According to Toni Smith-Thompson, an organizer at the New York Civil Liberties Union, schools that embrace facial recognition technology for surveillance purposes will see the essential character of the scholastic environment change. “Normalizing mechanisms of surveillance and control catalyzes the criminalization of the school environment and could make school hallways feel more like jails. It facilitates the tracking of everyone’s movements and social interactions and reinforces the school-to-prison pipeline.”

If society writ large wants to protect the obscurity of faces, a key task is to figure out how to solve urgent public problems without embracing the false panacea of ubiquitous facial recognition technology.

What We Say

Few things help people bond more quickly than gossip. We close our doors, speak in hushed tones, and check over our shoulder to make sure what we’re about to say is unlikely to be overheard or understood by others. We gossip over dinner, at sporting events, and in a host of other contexts in plain view and within earshot, yet we still expect a certain amount of obscurity.

We’re able to intimately gossip in public and receive the pro-social benefits from doing so because we know people can pay only so much attention to what’s going on around them and that we’re probably not interesting enough for others to spend their limited bandwidth deciphering us.

Let’s do a quick obscurity test. Try to recall the last time you ate at a restaurant. The setting is open and airy, full of bustle and chatter — all signals of what we think of as being “in public.” Now try to remember what the people sitting closest to you were talking about. Can you even remember what they looked like? Their faces and conversation are likely as obscure to you as you were to them.

We interact within zones of obscurity that make our lives bearable, crossing into each other’s spaces without actually “seeing” or “knowing” each other. This is an invaluable everyday privacy that we too often take for granted — and another reason we should resist the spread of facial recognition technology.

For now, even in the world of iPhones and social media, we still have mostly analog, face-to-face conversations. Even some technologically aided communications, like talking on the phone or using video conferencing, have ephemeral qualities. They only imperfectly reside in the memories of a few people, making recall costly. Much of this protection is the result of biology. Obscurity is not just desirable — it’s biologically inevitable.

Evolutionary psychologist Robin Dunbar argues that this is because our brains can handle only limited cognitive loads, so we tend to ignore or fail to store large amounts of information. In other words, we naturally make other people and things obscure to us. To prevent the overburdening of memory, our brains automatically limit our cognitive groups to a manageable size. (Dunbar’s theory is that people can maintain stable social relationships with, at most, about 150 people — a limit colloquially known as our “Dunbar number.”) Accordingly, most interactions outside our cognitive groups are unlikely to be recalled again, unless they’re technologically preserved and made searchable and accessible. Obscurity is our natural and default state of privacy.

What We Buy

Perhaps one of the more underappreciated aspects of cash is that it doesn’t leave a trail. Your purchasing habits can reveal a great deal about you. What we buy at the pharmacy can reveal our health conditions. The books we buy can reveal our tastes, even our fetishes.

It’s one thing for online shopping platforms like Amazon or for credit card companies to keep data about our shopping. When these companies remain trustworthy — that is, discreet, honest, protective, and loyal — our purchase data stays relatively safe and obscure.

But it’s another thing entirely for companies to broadcast our purchases. When the PayPal-owned payment service Venmo designed its app, it was optimized to be like a social network, expecting users to find it “fun” to share their financial information with friends and family. User transactions were public by default, a design that engineered a significant loss of obscurity by announcing how users spend their money. Researcher Hang Do Thi Duc analyzed more than 200 million public Venmo transactions made in 2017. On her project website, “Public by Default,” she “honed in on five individual users, including a man who sells cannabis in Santa Barbara and a pair of lovers who pass money between each other accompanied by flirting, arguing, apologies, and threats.”

The default settings of software are notoriously sticky. Even though the cost of changing them is minimal, doing so requires enough time, forethought, and effort that people often leave things as they are. In this way, design picks data winners and obscurity losers.

Thankfully, most of our financial transactions are not as public as Venmo’s defaults made them; they remain obscure, entrusted to companies we expect to be discreet. Unfortunately, there’s also a growing wave of stores refusing to take cash. Don’t even get us started about the madness that is “paying with your face.” If we value obscurity, we should better cherish the privacy virtues of the dollar, and choosing to pay in cash can give new meaning to conscientious capitalism.

Who We Know

Who we interact with — what the tech industry calls our “social graph” — reveals so much about us. It’s the key to seeing how data flows between people and accounts and why the data social media companies own is so valuable. Parts of our social graph are filled in when we chat with people online, “follow” them, or “friend” them. Indeed, social media is dramatically reducing the transaction costs for other people and companies to know who our friends are.

Journalist Kashmir Hill has written several stories investigating Facebook’s notorious “people you may know” feature, software that suggests “friends” for users based on opaque sources of data. The feature is mining every dark corner of Facebook’s data troves, and it’s constantly seeking more ways to learn more about you to fill in the remaining gaps.

Thankfully, our full, real social graph is naturally obscure. Without technical intervention, people can only paint in the corners of our friend-and-acquaintance mural. Only we know the whole picture — for now, at least. That’s what makes Facebook’s attempt to map our network so noxious. It removes our agency and bypasses the practical protection that we traditionally rely upon to protect something as intimate and conceptual as who we “know.” Social interaction is hard enough without having to worry about our casual relationships being quantified and mapped — and the last thing we need is another excuse to avoid meaningfully engaging with people.

Who We Are

One of the most frightening aspects of facial recognition technology is that it can remotely capture a biometric in bulk. DNA is slower to acquire, but it can be an immensely revealing source of information — so much so that it was the DNA of family members that helped police identify the culprit in the highly publicized case of the Golden State Killer.

Consumer genetic testing kits are relatively cheap, easy to purchase, easy to use, and marketed in consumer-friendly terms that avoid mentioning these kinds of ramifications. Consequently, these DNA databases are expanding significantly, and the overall obscurity of DNA is diminishing accordingly.

It’s not surprising that people freaked out when GlaxoSmithKline Plc bought a $300 million stake in 23andMe — a deal that led customers to wonder who, ultimately, would end up with access to their genetic information. While this situation would have been a perfect catalyst for dystopian surveillance if privacy truly were dead, some progress in the right direction was made as a result.

The Future of Privacy Forum (an organization we are both affiliated with), along with a number of powerful genomic testing companies, including 23andMe, released “Privacy Best Practices for Consumer Genetic Testing Services.” These guidelines emphasized protections such as transparency, consent, security, marketing restrictions, and “access, correction, and deletion rights.” Of course, these guidelines aren’t a comprehensive solution — just a first step toward a commitment to legal protections.

Critical questions still need to be resolved in the consumer genetic testing space, including how information classified as research data will be anonymized and whether voluntary self-regulation can be accepted in the diverse direct-to-consumer genetic testing industry. We’re highly skeptical of leaving this to self-regulation, though at least the relatively high cost of deciphering DNA means this data is still fairly obscure. But meaningful efforts in the right direction are worth taking when we can get them, while continuing to aspire for more.

It’s easy to read the news and feel hopeless about the future of privacy. We must admit that even we have felt a little nihilistic about it at times. Are companies and governments generally doing everything they can to collect and quantify every single bit of information about us? Yes, they are, because they can’t resist the allure of knowing everything.

That by itself doesn’t mean privacy is dead. Surveillance and data processing efforts may be extreme, but they’re not perfect. The relentless fixation on acquiring our information demonstrates what’s at stake: the invaluable barriers and costs that make information hard or unlikely to be found or understood. These costs still exist in spades. You can recognize them every time you have to reach into your pocket to use your phone to take a picture, ask someone the name of a person who walked into the room, or fail to accurately recall a conversation you were sure you’d never forget.

If we fully appreciate the immense value these costs bring, we just might find our privacy salvation.
