Thanks to the real estate website Zillow, it’s now super easy to profit from your neighbor’s suffering. With a few easy clicks, you can find out “if a homeowner has defaulted on the mortgage and by how much, whether a house has been taken back by the lender, and what a house might sell for in foreclosure,” as the Los Angeles Times recently reported. After using the service, you can stop by the Johnsons’ to make them a low-ball offer, perhaps sweetening the exploitation with a plate of cookies.
Maybe that’s not fair. Zillow doesn’t let people opt out, but the company omits borrowers’ names, has a process for correcting mistakes, and uploads only legal information that was previously—albeit inconveniently—available.
Zillow is cutting the cord of time-consuming real estate bureaucracy, but it’s just part of a larger story presciently described in a 2007 SMU Law Review article by University of Colorado Law School professor Harry Surden. According to Surden, big data networks persistently chip away at privacy interests and expand the surveillance society’s reach—and we’re about to see a lot more of it. Surden argues that privacy is safeguarded by barriers that make it hard for others to thwart our interest in limiting access to information. Bring down these walls—which Surden calls constraints—and prying eyes can capitalize on newfound vulnerability. Accordingly, we need to reassess how we think about our privacy rights, and what personal information should be included in that class.
There are several different kinds of constraints, and they are effective to varying degrees. The law, for instance, can be a powerful tool, giving explicit guidance and instilling relatively widespread compliance. What it can’t do is guarantee our information is 100 percent protected. Even if everyone knows that breaking into a house and stealing a diary is illegal, some people still will do it. Physical constraints, too, can prevent private information from falling into unwanted hands. As Surden notes, a tall fence can enforce property rights, keeping unwanted visitors, like the diary thief, away. But even electric fences don’t offer absolute protection.
Privacy also can be safeguarded by technology specifically intended to provide security. Let’s say a woman with a jealous boyfriend keeps an electronic diary encrypted by state-of-the-art algorithms. If her boyfriend doesn’t hang with the hackers, an online journal might be safer than a pen-and-paper notebook stored in a house. Still, this option isn’t absolutely foolproof. He might hire a code breaker from Craigslist.
Technology can protect privacy in another way, too. Sometimes technological constraints make certain behaviors too expensive to engage in on a regular basis and thus protect our interests by default. (Surden calls these “latent structural constraints.”) When the cost of accessing, aggregating, or analyzing data—say, analyzing a human genome—is extremely high, society can preserve privacy without needing to actually impose corresponding laws.
This sheds new light on what Zillow does. Surden’s theory suggests we should ask whether technological advances reduce transaction costs so much that previously rare, unwanted behavior (say, checking up on other people’s mortgage status) can become commonplace. If so, this change can deprive citizens of rights they once had. Zillow is correct that these rights weren’t sanctioned by legal authority. They were implicitly protected by technology—there was no need to enact legislation. If new laws get created to protect diminished privacy, then, according to Surden, they essentially preserve the existing rights of homeowners, rather than create new ones.
Emerging technology will continue to lower transaction costs, resulting in society confronting ever more eroded rights that initially were propped up by structural supports. One area that Surden is anxious about is future uses of radio frequency identification (RFID) technology, which might reveal information that previously was so hard to come by that citizens assumed their right to it was protected.
Privacy concerns already have been raised over stores using “smart” RFID tags (electronic product codes that can identify items within a few feet of a scanner) to keep track of inventory and deter employee theft. Privacy advocates worry that if the tags are active when items get thrown away, “a criminal or marketer could scan your garbage” to see what you purchased. But it could be worse. As a thought experiment, let’s imagine that at some point in the future it becomes common practice for stores to embed smart RFID tags in most items. As Surden notes, doing so could speed up checkout time because it would no longer be necessary to scan each item’s UPC barcode by hand with a laser. Instead, purchases could stay in carts, and fast lanes would be created, much like cars passing through RFID-enabled electronic toll systems like E-ZPass.
Let’s further imagine that it becomes commonplace for people to own RFID readers. If this happens, the transaction costs of going through someone’s curbside trash would decrease considerably. Under current technological constraints, dumpster-diving is not for the faint of heart or sensitive of nose. Trash can be smelly, messy, and time-consuming to sort through. Moreover, because garbage inspection is an unusual practice—typically reserved for law enforcement, private detectives, paparazzi, and recycling inspectors—it can easily draw unwanted attention. But RFID technology could allow the process to be done discreetly, quickly, and at a safe distance.
In that case, people who never considered rummaging through garbage before might give it a try. Let’s say you’re a boss who is worried that an employee’s productivity has decreased because the person has become an alcoholic. Why not drive past his house and see just how many bottles he consumed that week? If you’re hell-bent on having a gotcha moment, you might even draw the wrong conclusions about what you find—not realizing, for example, that the empties belong to the employee’s college-age kids who were visiting home during a break.
Should things develop this way, several different scenarios could play out. After hearing about negative impacts, consumers could push back against retailers and demand they carry non-RFID stock. But given the time saved at the checkout, is that likely? Or perhaps people would become vigilant about removing RFID tags once they leave the store. But if the tags are ubiquitous and take time to remove, won’t slipups and procrastination undermine resolve? Industry might try to solve the problem by creating tags that can’t be read by home RFID readers. Of course, they might not want to invest in this pursuit, and customers could balk if they have to absorb some of the cost. Eventually, the government could be called upon to create new laws prohibiting electronic access to curbside garbage. The question is how much damage would accrue before this happens.
Ultimately, Surden’s theory is just an idea about privacy. Unless people and institutions are moved by it, it won’t constrain anything. Instead, we’ll look at outed mortgage defaulters, stigmatized disposers, and a host of other successors and shrug our shoulders at the collateral damage innovation and capitalism inevitably seem to create.
This article arises from Future Tense, a collaboration among Arizona State University, the New America Foundation, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.