Drones fly above us. Some of them look an awful lot like airplanes. They share space with airplanes and other aircraft. And, at least for now, drones are largely regulated like other aircraft. While drones have a particular set of flight rules, their regulation emerges from the history of aviation regulation and is the responsibility of the same regulators.
But occupying airspace is not what makes drones feel creepy.
And their occupation of airspace should not be the sole focus for the legal system as it tries to understand and deal with drones. The law, and the people it governs, need to see drones as more than just another kind of aircraft.
Many of the stories we read about interpersonal drone conflicts underscore an unsettling feeling of drones flying overhead for surveillance purposes. The thought of unmanned aerial vehicles potentially watching us from above makes them feel like something of a mobile panopticon. Drones are, as University of Washington professor Ryan Calo once described them, “the cold, technological embodiment of observation.” But the mere fact that drones can fly is not what makes them feel creepy. If that were the case, the same social reaction could be expected from the use of police, news, or traffic helicopters, or from any planes or aircraft that could be similarly equipped with surveillance technologies.
Unlike manned aircraft, drones also lend a sense of anonymity (and a corresponding lack of accountability) to their operators. They do this by flying at a distance from their human operator, or without any human operator at all. Part of what was so unsettling when a drone crashed on the White House lawn or when mysterious drones were spotted flying over Paris was that no one knew who was flying them or why. (The White House incident later turned out to be just a case of drunk droning).
This separation between the operator and the drone also allows the technology to access space that a person might otherwise be unable to enter. This might mean overcoming physical barriers—circumventing them while staying close to the restricted or dangerous terrain those barriers are meant to protect. It might mean overcoming security risks, where a person could not enter an area without a threat to their safety (e.g., military drones) or a fear of being identified (imagine the hummingbird and beetle drones in Eye in the Sky). Or it might mean overcoming social barriers—a person might not walk into a space because it would be awkward or uncomfortable, but may still fly a drone there. While flight plays an important role in easily and quickly accessing these spaces, it is certainly not the only characteristic supporting this technological ability to go somewhere unexpected. One could almost as easily imagine a snake- or ant-like drone gaining similar access and being perceived as creepy in much the same way. And while it is possible to use design features or a registration system to reduce some of the anonymity of drones, these responses aren’t a complete solution. They do not address the ways in which drones overcome boundaries and challenge our expectations.
The airborne nature of drones does have implications for air safety. It is important to ensure that drones don’t crash into each other, buildings, or other aircraft, and that they don’t fall on our heads. But given that drones have implications beyond just the space they occupy, is it time to re-examine how we, and regulators, conceptualize them? Could a broader perspective help us better understand and regulate drones so the technology can move beyond the “ick” factor?
Society, courts, and legislators often understand new technologies through analogy to familiar technologies. The first cars were thought of as horseless carriages, the telegraph (and, later, email) was analogized to a physical letter, and the internet has been the source of much debate about the appropriate use of metaphors in regulation and policymaking. Metaphor and analogy are natural ways to understand and relate to the world around us.
But when it comes to applying an existing legal doctrine to a new technology, choosing the right or wrong metaphor can have a significant effect on the use of that technology. “By emphasizing one aspect of a concept, a metaphor might blind us to other aspects that are inconsistent with the metaphor,” says professor Stephanie Gore. An analogy between an old and a new technology can cause someone to focus on the common features between the two technologies, at the cost of overlooking their (sometimes significantly) different functions.
Depending on your definition of a robot, and which type of drone you’re looking at, many drones (especially those projected for use in the foreseeable future) qualify as robots. And robots pose a particular risk for using the wrong metaphor, because the laws relating to existing technologies do not easily translate to machines that operate with any degree of autonomy. Washington University in St. Louis law professor Neil Richards and Oregon State University roboticist William Smart have argued, “if we get the metaphors wrong for robots … it could have potentially disastrous consequences.” This is why Richards and Smart caution against designing legislation “based on the form of a robot, not the function.” While one of their main concerns is with lawmakers and the public treating anthropomorphic robots too much like intelligent mechanical humans, the “form over function” issue is also relevant to how the law treats drones, a question the courts will soon be asked to deliberate.
Consider the case of Kentucky’s William Meredith, who was criminally charged for shooting down a drone that flew over his yard. The charges were ultimately dropped. The District Court judge found that Meredith was just preventing what he perceived as an invasion of his privacy. The drone operator, David Boggs, subsequently filed a federal complaint for declaratory judgment, asking the court to clarify the law as it applies to drones. In particular, Boggs asked the court to draw an analogy between drones and the planes and helicopters that can currently be used to observe private property from above.
The U.S. Supreme Court has said that viewing the area surrounding a home from a manned aircraft flying in public airspace does not invade any reasonable expectation of privacy. If the analogy between drone and plane is adopted, then the same could be said for drone surveillance (undermining some of Meredith’s justification for shooting Boggs’ drone).
It’s tempting to treat drones as we treat manned aircraft—they share airspace and a similar form. Yet this analogy risks obscuring important nuances. I’ve already mentioned how drones can operate anonymously and enter spaces where entry by a person (or manned aircraft) might otherwise be unexpected. Add to this the fact that drones are relatively cheap and widely available. Drones can be hacked. They pose risks for data collection, retention, and protection. (Imagine drones equipped with license plate readers, which can collect and process information about a driver’s compliance with the law, as well as her physical location, and even identity. This raises different privacy concerns depending on who controls the drone, why that information is collected, and how it is protected and stored.) And they are becoming far more autonomous than manned aircraft. When it comes to the legal and social treatment of drone technology, there is a lot more to consider than how drones operate in airspace. If we take a step back from the obvious analogy, we may start thinking more broadly about how, and by whom, drones should be regulated. (For more on who can and should regulate, watch the Drones as Disruption conference video stream, which kicked off with this very question of what “is” a drone?) And by prompting more comprehensive laws addressing privacy and the other concerns this technology raises, drones might start to feel less creepy.
What are the right legal analogues for drones? It may be that different legal categories are useful, depending on the context. Treating drones like aircraft might continue to be appropriate in the air safety setting for some time, but computers might be a better analogue when we think about data protection, flying cameras when we consider privacy, and delivery trucks in future contractual disputes. And remembering that many drones are robots will be important when we think about broad policy issues. Ultimately, we should be careful not to get too settled into any one metaphor, especially one that risks obscuring so much of what drones really do. This is particularly true given that this technology is just getting started and we don’t yet know what its future will hold.
This article is part of the drones installment of Futurography, a series in which Future Tense introduces readers to the technologies that will define tomorrow. Each month from January through June 2016, we’ll choose a new technology and break it down. Read more from Futurography on drones:
- “Do Drones Have to Be Creepy?”
- “Your Cheat-Sheet Guide to the Key Players and Debates for Drones”
- “The Rise of Nonviolent Drones”
- “The Six Biggest Misconceptions About Drones”
- “What Can Consumer Drones Actually See?”
- “Drone Privacy Is About Much More Than Protecting Sunbathing Teenagers”