Dear Jeff and Declan:
I much enjoyed both of your posts, and I must say you both paint tempting pictures. Jeff, the way you describe it, it might seem to some (though your book is often far more cautious than this) that much privacy-destroying investigation is actually useless to the investigators themselves. Had Starr not probed so deeply into Lewinsky’s life, he wouldn’t have been misled by his “confus[ing information] with knowledge.” Tearing away our masks isn’t just cruel but also doesn’t really show anyone our “true sel[ves].” So we can have privacy and better decision-making, too–such a deal!
But I’m not sure I can be so sanguine. Life is a long string of decisions based on necessarily imperfect information. Do we trust someone in business? Should we send our child to a particular preschool? Is this witness telling the truth? Should we support this candidate or this judicial nominee?
Virtually none of these decisions can ever be based on a full understanding of people “in all of their complicated dimensions.” Some of the decisions will be rich in some kinds of context, but of course all will always omit something. We don’t fully know ourselves, much less anyone else!
And yet decide we must; and deciding more intelligently often requires more information. More information won’t always help us, and will sometimes mislead us. But I’m skeptical that you or I or Louis Brandeis can really reliably decide on others’ behalf (even on Ken Starr’s behalf!) which information is likely to delude and which to enlighten.
Privacy, then, comes at a cost to others’ accurate and sensible decision-making, and, as I’ll mention shortly, sometimes even to their free speech. Some privacy rules (for instance, the Fourth Amendment and trespass and wiretap laws) are certainly good, but not because they prevent misjudgment–they’re good even though they sometimes interfere with others’ judgment.
Declan, your technological solution also seems quite appealing: If we can have the perfect technological privacy shield, why worry about often-misguided legal and social rules? But there, too, I’m not sure I can be that sanguine. I think Jeff is right that many privacy technologies, at least today, are too cumbersome or at least too little known to 95 percent of the public, and may require quite substantial reorganization of their users’ lives.
Today, if I want to buy something online, I’ll probably have to give the seller my name and credit card number. If I want to have normal, easy e-mail exchanges with most of my friends, the messages will probably be stored somewhere in clear text. I could try to reorganize my life to minimize this exposure, but it won’t be easy. Maybe there’s not much of a problem to be solved–maybe we shouldn’t worry that people know things about us (hey, we’d often like to know more about them)–but those who do worry are right not to be entirely comforted by the techno-fixes.
It seems to me, then, that this leaves us with the inevitability that 1) many who care a lot about privacy will demand coercive, legal solutions and not just technological protections, and 2) these solutions will have significant costs to liberty, safety, and privacy itself. And this brings me to a few questions addressed mainly to Jeff.
Jeff, despite your strong belief in free speech, and despite your general skepticism toward governmental solutions, even you suggest that “maybe we need new Brandeis torts”–I take it that means new restrictions on what people may say about other people–“for the 21st century.” It’s no surprise that many others who lack your free-speech libertarianism are likewise calling for such restrictions. And this is especially so because our legal system has recently become much more concerned (partly correctly but often excessively) with preventing supposed indignities and punishing supposed misjudgments.
Are you worried about the possible scope of such torts? More broadly, are you worried that we might in the name of privacy begin to lose our broader liberty–our necessary liberty to make educated guesses about people, to act based on limited evidence, to share snippets of information, to talk about people we’ve met or done business with, to risk misjudging? What would you do to make sure that privacy law doesn’t overreach the same way that harassment law or discovery law has?