Privacy and Policy

A simple analysis

Robin Hanson

Ordinary privacy is going away; can online privacy replace it?
Typical levels and types of privacy may change a great deal over the coming decades. Many people say we'll have much more privacy. Some say this is good, and some say this is bad. Many others say we'll have much less privacy, with some saying this is good and some saying this is bad. Who should we believe?

For example, law enforcement officials have asked for strict limitations on new technologies for encrypting phone and other electronic communications, since such encryption threatens police abilities to tap the conversations of suspected criminals. (I have elsewhere argued that this probably isn't a big loss; in 1993 only $52 million was spent on legal U.S. police wiretaps.) Others complain that copyright law, speech prohibitions, and income taxes will become unenforceable. Still others agree, but celebrate such changes.

Going the other direction, David Brin predicts and argues for a transparent society. His summary argument is that since cameras will soon be everywhere except in the police stations, our only choice is whether we want cameras there as well. And he thinks shining light on police operations is good, since it makes them more open to criticism.

If Brin were just talking about police stations, I'd be inclined to agree. But to Brin, the police are just one example of the rich and powerful, all of whom need criticism. In his book, Brin explains that he wants to shine light on all powerful people, in industry, entertainment, academia, the military, and anywhere else that matters. And to make sure he gets them all, he'd rather all our lives were mostly transparent. So, for example, he leans toward strong limits on technology which enables privacy, rather than risk letting the powerful keep secrets.

If we instead hold constant some level of privacy for the powerful, and ask how much privacy we want for peons, the privacy question looks very different. Which just emphasizes how crucial it is to think carefully about both what exactly are the likely consequences of current trends, and what exactly our choices might be. Then perhaps we can think carefully about what direction we might want to push change on the margin.

I'm no expert on this topic, but since I haven't seen anything similar, I offer the following simple glib qualitative analysis of the basic issues. I really hope someone can do a more careful analysis.

Baseline Projections

Let's first consider some baseline scenarios of how privacy is likely to change over the next century. I find a useful starting point for such analysis to be Mark Miller's concept of thresholding privacy, the essence of which I take to be increased variance in privacy levels. As the technologies of privacy and of privacy-invasion become more sophisticated, we'll have a lot more privacy in some areas, and a lot less privacy in others. In particular, we're likely to have less privacy in our ordinary physical lives, and more privacy when exchanging bits in ways that are uncorrelated with our ordinary lives.

Ordinary Physical Privacy

In the ordinary physical world, straightforward projections suggest we are likely to get much less privacy. As cameras, microphones, and other detectors proliferate in our public spaces, most behavior in such spaces will become a matter of public record. Your car will be tracked as it moves on the streets, your body will be tracked as you walk on sidewalks and through stores, your clothing and style of walking will be noted, and your conversations may even be overheard. There will likely be cameras on each building, lamppost, car, and even on each person. With enough independent sources documenting everything, it should be easy to find some source willing to cheaply tell you what happened, and it should be hard to lie about such things.

Private spaces may have rules prohibiting such cameras. But it's not clear how enforceable such rules will be, especially if people come to rely on having electronic aids close at hand. Nor is it clear how willing people will be to leave their recorders outside or to turn them off if they fear other recorders will slip through security. It may be that most casual gatherings of four or more people will be documented, with conversation transcripts cheaply available. (One way to buy them cheaply is to pay only the first source to reveal anonymously, while committing to slowly raise the offered price over time to very high levels.)
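
To make that parenthetical pricing scheme concrete, here is a toy simulation in Python. The reservation prices and the rising-bounty parameters are invented for illustration; the only point is that when several independent recorders hold the same transcript, a slowly rising bounty paid to the first anonymous revealer ends up costing roughly the lowest reservation price among them.

    import random

    def rising_price_bounty(reservation_prices, start=1.0, step=1.0, max_price=1000.0):
        """Pay a bounty that rises over time to the FIRST recorder willing to
        anonymously reveal the transcript.  Each holder sells as soon as the
        bounty reaches their private reservation price, so the buyer pays about
        the lowest reservation price, not the committed maximum."""
        price = start
        while price <= max_price:
            if any(r <= price for r in reservation_prices):
                return price      # first willing seller takes the bounty
            price += step
        return None               # nobody sold; the gathering stays private

    random.seed(0)
    recorders = [random.uniform(5, 500) for _ in range(5)]   # invented prices
    print("reservation prices:", [round(r, 2) for r in recorders])
    print("price actually paid:", rising_price_bounty(recorders))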

One expects information about a person to be aggregated into reliable published summaries describing each person's home, place of work, likely income, observed physical items owned, and their favored styles of clothing, recreation, conversation, and travel. There should also be summaries of who has been seen in public with whom, allowing one to infer friends, business associates, and lovers. Records should also indicate one's tendencies for loud arguments or fights.
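
As a rough illustration of how such summaries might be compiled, the sketch below counts how often pairs of people show up together in a wholly invented log of public sightings; the frequent pairs are the candidate friends, associates, and lovers.

    from collections import Counter
    from itertools import combinations

    # Invented sighting log: (place, time slot, people seen there together).
    sightings = [
        ("cafe",   "mon-9am",  {"alice", "bob"}),
        ("gym",    "mon-6pm",  {"alice", "carol"}),
        ("cafe",   "tue-9am",  {"alice", "bob"}),
        ("office", "tue-10am", {"bob", "carol", "dave"}),
        ("cafe",   "wed-9am",  {"alice", "bob"}),
    ]

    def co_occurrence(log):
        """Count how often each pair of people appears at the same place and time."""
        pairs = Counter()
        for _place, _slot, people in log:
            for a, b in combinations(sorted(people), 2):
                pairs[(a, b)] += 1
        return pairs

    for pair, count in co_occurrence(sightings).most_common():
        print(pair, count)        # ('alice', 'bob') 3  -- likely associates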

The ease of posting information about yourself and associates also creates the possibility of social pressure to post such information. If most people posted their school grades, and you didn't, people might presume your grades were bad. Similarly, if you didn't post a favorable review of a long-time acquaintance, people might presume you didn't think much of that person. In the extreme, people might post all verifiable information about themselves, and evaluate everyone and everything they come into substantial contact with.

Having highly transparent ordinary physical lives raises concerns about stronger social pressures for various forms of conformity. Imagine, to take a silly example, that most people gave in to social pressures to wear hats, and that the people who resisted and didn't wear hats tended to be "nut cases" that most other people didn't want to associate with for other reasons. In this case wide conformity to hat wearing can be a stable equilibrium even if most people privately thought that wearing hats was stupid. (The wearing of business suits now may be a similar situation.) Such conformity traps are less likely, however, when there are many smart attractive people who enjoy bucking conventions.
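
The logic of such a conformity trap can be shown with a toy payoff calculation. The numbers below are purely illustrative: everyone privately dislikes hats a little, but being the lone bare head carries a larger social cost, so universal hat wearing is stable even though everyone would prefer the no-hat world.

    def payoff(wears_hat, others_wear_hats, private_dislike=1.0, conformity_cost=3.0):
        """Toy payoffs: hats cost a little in private annoyance, but going
        bare-headed when everyone else wears one marks you as a 'nut case'
        and costs more in lost associations.  Numbers are illustrative only."""
        u = 0.0
        if wears_hat:
            u -= private_dislike
        elif others_wear_hats:
            u -= conformity_cost
        return u

    print("conform while others conform:", payoff(True, True))        # -1.0
    print("deviate while others conform:", payoff(False, True))       # -3.0
    print("the no-hat world everyone prefers:", payoff(False, False)) #  0.0
    # Since -1.0 beats -3.0, no individual wants to be the first to drop the hat.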

Privacy on the Net

On the net, straightforward projections suggest that people may have much more privacy than they do now, at least if they act in certain ways. Strong cryptography seems likely to let two people sitting in their special secure private spaces talk with each other privately, and hand each other untraceable cash. The main risks of lost privacy would then come from physical bugs in either private space, or from what either side might share with third parties about the conversation. With such privacy, people could thus trade bits for money, even in violation of laws, and worry mainly about the trustworthiness of the other party.
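
A minimal sketch of such a private channel, using the third-party Python cryptography package (the package choice, names, and message are mine, not anything specific proposed here): the two parties agree on a session key with a Diffie-Hellman exchange and then swap authenticated, encrypted messages.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each party generates a keypair and sends only the public half over the net.
    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    def session_key(my_priv, their_pub):
        """Derive a shared symmetric key from the Diffie-Hellman secret."""
        shared = my_priv.exchange(their_pub)
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"private-channel-sketch").derive(shared)

    key_a = session_key(alice_priv, bob_priv.public_key())
    key_b = session_key(bob_priv, alice_priv.public_key())
    assert key_a == key_b          # both ends now hold the same key

    # Alice encrypts; only Bob (or a bug in either room) can read the result.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key_a).encrypt(nonce, b"meet me on the net at nine", None)
    print(AESGCM(key_b).decrypt(nonce, ciphertext, None))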

It also seems possible, though more difficult, for people to maintain distinct electronic identities, or "nyms." If it were very hard to match such nyms with the well-documented "true names" behind them, people might set up businesses on the net to buy and sell bits, without even needing to trust the nyms they traded with. Such nyms might then comfortably ignore government regulations and taxes without fear of personal reprisal. They might post slander and political heresy, resell copyrighted works for cheap, offer their services as hit men, or avoid income taxes on their services as software engineers or virtual doctors.
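
At its simplest, a nym is nothing more than a signing keypair whose reputation attaches to the public key rather than to a true name. The sketch below (again using the cryptography package; the post text is invented) shows how readers can verify that two messages came from the same nym without learning who is behind it.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    nym_key = Ed25519PrivateKey.generate()   # kept secret; this IS the nym
    nym_id = nym_key.public_key()            # published as the nym's "name"

    post = b"Contract compiler work, rates negotiable, payment in digital cash."
    signature = nym_key.sign(post)

    # A reader checks that this post comes from the same nym as earlier posts.
    try:
        nym_id.verify(signature, post)
        print("signed by this nym")
    except InvalidSignature:
        print("forgery")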

The "true name" behind a nym could be revealed if one could see where the bits sent by that nym physically came from. To prevent this, messages can be sent through many "mixes." Each mix takes a bunch of incoming messages, reformats them, and sends them out again so that an observer cannot tell which incoming message became which outgoing one. Privacy is maintained if at least one of the many mixes used works. It is not clear, however, whether many good mixes will actually be allowed to function.
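
The layering idea can be sketched in a few lines. Real mixes use public-key cryptography, batching, padding, and delays; the toy version below (pre-shared AES keys, the cryptography package, an invented three-mix route) only shows how the sender wraps one layer per mix, so that each mix can strip its own layer and nothing more.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    route = ["mix1", "mix2", "mix3"]
    mix_keys = {hop: AESGCM.generate_key(bit_length=128) for hop in route}

    def wrap(message, route):
        """Encrypt with the last mix's key first, then wrap each earlier layer
        around it, so the first mix sees only an opaque blob for the second."""
        packet = message
        for hop in reversed(route):
            nonce = os.urandom(12)
            packet = nonce + AESGCM(mix_keys[hop]).encrypt(nonce, packet, None)
        return packet

    def mix_process(hop, packet):
        """One mix strips exactly its own layer; it sees the plaintext only if
        it happens to be the final hop."""
        nonce, ciphertext = packet[:12], packet[12:]
        return AESGCM(mix_keys[hop]).decrypt(nonce, ciphertext, None)

    onion = wrap(b"untraceable hello", route)
    for hop in route:                  # each mix peels one layer in turn
        onion = mix_process(hop, onion)
    print(onion)                       # b'untraceable hello'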

Another difficulty is that true names and nyms can be matched if there is any substantial correlation between their actions. You can't hope to get away with a tax-free business as a virtual reality doctor if there is any substantial correlation between your real and nym face, style of speech, walking gait, preferred vocabulary, or jokes and anecdotes told. You can't even let there be any substantial correlation between when you appear physically in public and when you don't appear at your virtual business. Sick-days, vacations, and net-down times would all be troubling. And you must never meet business colleagues in person.
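
As a toy illustration of this kind of correlation attack (all the data is invented): record, hour by hour, when each candidate true name was seen in public and when the nym's virtual practice was open. The person whose public appearances exactly fill the nym's quiet hours stands out at once, and every sick day or net outage narrows the field further.

    nym_active = [1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0]    # hours the nym was online

    seen_in_public = {                                    # invented sighting data
        "alice": [0, 0, 1, 1, 0, 0, 0, 1, 1, 0, 0, 1],    # never overlaps the nym
        "bob":   [1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1],
        "carol": [0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0],
    }

    def suspicion(seen, active):
        """Fraction of hours in which being seen in public and the nym being
        online are mutually exclusive; 1.0 means the schedules never collide."""
        clashes = sum(1 for s, a in zip(seen, active) if s and a)
        return 1 - clashes / len(seen)

    for name, seen in seen_in_public.items():
        print(name, round(suspicion(seen, nym_active), 2))
    # alice scores 1.0 while the others score about 0.67 -- she is the suspect.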

Correlations are minimized with short-lived nyms, who communicate via low-res standard forms with a substantial time delay. The loss from having a nym revealed may also be smaller for short-lived nyms. And high-spending people who spend most of their waking hours in their personal secure net room may reasonably be suspected of income tax evasion. Thus simple, standardized, arm's-length, low-res, short-lived, part-time nym relations are preferred. So software engineers doing well-defined month-long part-time go-away-and-do-it tasks may work well, but high-res full-time virtual-reality doctors who build up many close working relationships with colleagues over many years may not.

Overall Privacy

It seems to me that to most people, for most purposes, the added costs of interacting via uncorrelated nyms are just not worth the benefits of increased privacy, reduced taxes, and freedom from regulation. Humans like having long-term rich physical interactions with other people, on the job and off, and they find it very hard to consistently manage rich uncorrelated personas. (Good acting is very hard, after all.) Even now, people seem to be willing to give up substantial privacy in exchange for better targeted product marketing.

For example, what's the point of buying a book via a nym, and then not being able to talk about the book with friends and coworkers? If you post a book review to the web, you want people's favorable opinion of that review to add to your true name's reputation, and you want your true name's reputation to entice people to read your review. You'd like the freedom to mention other books you've read in your review, and even to relate the book to your personal life. You want people who liked your review to be able to learn more about you. And letting marketers know you liked a book helps them suggest other books you might like.

Some people have suggested that nyms can manage these sorts of cross-connections via various specialized credentials, such as an "is a good book reviewer" credential. I find this quite implausible. Consider the example of job market resumes. You might try to maintain privacy about your previous education and work by using "went to a good school" and "was a good software engineer" credentials, but there is so much variation in what such things could mean that most employers will insist on seeing details about what school you went to, and what kind of software you engineered.
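
For concreteness, here is roughly what such a coarse credential might amount to (the certifier, claim, and key handling are all invented; real anonymous-credential schemes are far more elaborate): some certifying body signs a bare claim over the nym's public key, and an employer can verify the signature without learning anything more. The trouble noted above is precisely that the bare claim carries so little information.

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    certifier = Ed25519PrivateKey.generate()   # a hypothetical certifying body
    nym = Ed25519PrivateKey.generate()

    nym_pub = nym.public_key().public_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PublicFormat.Raw)

    claim = b"was a good software engineer:" + nym_pub
    credential = certifier.sign(claim)

    # An employer can check that the credential is genuine...
    certifier.public_key().verify(credential, claim)
    print("credential verifies")
    # ...but "was a good software engineer" says far less than a real resume,
    # which is the objection raised above.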

The bottom line is that it seems to me quite likely that the overall trend is toward a lot less privacy for most people. Yes, there may be important minorities who manage to avoid taxes, and most people may regularly use nyms for some limited purposes, such as avoiding political censure. But most people will likely spend most of their time relating to each other as very well documented true names.

Policy Implications

Reduced overall privacy suggests that we be less concerned about potential problems with increased privacy in some areas. Yes, it may be easier to anonymously make an extortion threat and to collect the cash, but it should be a lot harder to actually carry out the threat undetected. Yes, people will try to resell copyrighted works, but if sellers refused to sell to nyms, illicit buyers would have to enjoy these works in private, without sharing them with others as a true name. And cheats must worry about hidden tags that identify who legitimately bought a particular copy.
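
The "hidden tags" idea can be sketched with nothing but the standard library: the seller stamps each legitimate copy with a tag derived from the buyer's identity, so a copy that later surfaces on the net points back to whoever bought it. Real schemes hide the tag steganographically and try to make it hard to strip; this toy version just appends it, to show the bookkeeping.

    import hashlib
    import hmac

    SELLER_SECRET = b"hypothetical-seller-key"

    def tag_for(buyer_id):
        """A buyer-specific tag only the seller can compute or recognize."""
        return hmac.new(SELLER_SECRET, buyer_id.encode(), hashlib.sha256).hexdigest().encode()

    def sell_copy(work, buyer_id):
        """Stamp the sold copy with the buyer's tag (real schemes would hide it)."""
        return work + b"\n[tag:" + tag_for(buyer_id) + b"]"

    def trace_leak(leaked_copy, known_buyers):
        """Match the tag in a leaked copy against every known legitimate buyer."""
        for buyer in known_buyers:
            if tag_for(buyer) in leaked_copy:
                return buyer
        return None

    copy_sold_to_bob = sell_copy(b"Chapter 1 ...", "bob")
    print(trace_leak(copy_sold_to_bob, ["alice", "bob", "carol"]))   # -> bob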

Reduced overall privacy also suggests that we be less concerned with the benefits of increased openness than with the costs of reduced privacy. Yes, openness can subject the powerful to needed criticism, and information economists like myself understand how secrets are actually the main cause of economic inefficiency and social injustice in the world. Increased openness may be an incredible boon to humanity. But we may still be reasonably concerned that certain losses of privacy will cause more harm than good. Do we really want to risk losing all privacy?

To the extent that we want to work to change this baseline scenario on the margin, our ignorance of how this would all work out should make us cautious, preferring mild incremental interventions which are the least likely to preclude clear direct benefits or further experimentation. For example, we might try to save copyright by just increasing the legal penalties against copyright violation, and the bounties offered to those who catch violators. (A death penalty and million-dollar bounty would probably be enough; let's hope it doesn't come to that.) And if wiretaps become infeasible, we can substitute other methods of law enforcement.

On the other side, before we consider crude, hard-to-enforce rules prohibiting the collection or use of information, let's see how far we can get by freeing people to invent better ways to protect their privacy. Why not eliminate rules against private cryptography? Why not allow cars without license plates and with highly tinted windows? Why not allow private spaces the maximum freedom in experimenting with rules and penalties regarding on-site surveillance? Maybe we should give people stronger means to commit not to reveal information about themselves. We might even promote the development of cheap physical package mixes, to make it harder to see who sent what physical package. And maybe we should develop better ways to directly identify the "nut cases" that people want to avoid, to reduce the potential for social conformity traps.

Privacy is a very important issue, but we are vastly ignorant about its costs and benefits, and about the specific privacy consequences of new technologies. So let's try to think more carefully about this, but let's also try not to overreact to each new change. Humans are remarkably adaptable, if they are given the room to experiment.