Imagine sitting in your backyard and seeing a drone fly overhead. It hovers. The camera mounted underneath adjusts and seems to be looking at you. As quickly as it arrived, it flies away and disappears. You are left wondering who was operating it and for what purpose — and what you can do to prevent it from happening again. This is a scenario that has arisen time and time again in recent years in Canada.

On a number of occasions, similar encounters with unknown drones have led to visceral, and sometimes violent, reactions from the person feeling observed. A now infamous example from the United States involved William Merideth, the self-proclaimed “drone slayer,” who shot down a drone that flew over his residential Kentucky property where his young children were playing. A Kentucky court dismissed charges of first-degree endangerment and criminal mischief against Merideth, saying that he had a right to shoot down the drone to protect his privacy.

Drones have been shot down here in Canada as well.

Not long before Merideth’s encounter, a young woman in Connecticut had a similarly visceral, defensive reaction to a drone being flown over sunbathers on a public beach: she physically assaulted the drone operator. She was subsequently criminally prosecuted. The drone operator was told he could continue what he was doing, despite the strong reaction it inspired, because he was operating in a public place.

Drone technology is one of the few popular robotic technologies that are already widely available on the market. It is likely to be one of the first robotic technologies to gain widespread public deployment. It is also the only robotic technology that has been federally regulated in Canada. While personal-drone technology today requires a human to exercise some input and oversight to ensure safe operation, the autonomous capabilities of the technology will only expand as its underlying machine learning software becomes increasingly sophisticated.

Drone technology may ultimately serve as a test case for how we as a society address some of the social tensions that arise when increasingly autonomous technologies make new types of encounters possible. For this reason, it is imperative that we develop thoughtful legal and policy responses to this technology early on.

Personal, or recreational, drones are those used by individuals for noncommercial purposes. We certainly need to be cognizant of the privacy issues raised by wide-scale commercial and government uses of drones, for a myriad of reasons. However, the privacy impacts of personal drones are also important for several reasons, not least of which is their growing popularity. The US Federal Aviation Administration estimates that there were 1.9 million personal drones in the US in 2016, and expects that number to grow to 4.3 million by 2020. These numbers suggest that personal drones are nearly twice as popular as commercially operated drones in the US. The impact of personal drones on individuals in the spaces where they are flown cannot be overlooked.

Drones and privacy

One of the features of drone technology that raises new and difficult challenges is that it operates remotely – and increasingly autonomously – from the human pilot. Additionally, an aerial drone can move freely through space (unlike surveillance cameras fixed to a building or an individual with a handheld camera) and can enter otherwise hard-to-access spaces, like backyards and apartment balconies.

These aerial and remote capabilities can sever the connections we ordinarily expect to have with the people we encounter in public space, undermining our ability to assess context, to gauge trust, and to determine our best recourse in an encounter. Drones do this in at least two ways.

First, drone technology undermines transparency. Because the drone can operate at a distance from its pilot, it is not always immediately (or ever) clear who is operating it or why. Second, drone technology undermines accountability. In addition to the operator’s anonymity, it may be unclear what an individual can do to respond to a drone that follows her or hovers above or near her; she cannot communicate with the drone – as she could with a person – and may not have the ability to simply walk away from the encounter. These concerns will only grow deeper as the AI-driven automation of drone technology continues to improve.

Of particular significance is the way in which drones reduce the available limits on harassment — especially by allowing the operator to remain anonymous or discreet, and potentially to avoid confrontation. Public harassment, stalking, and surveillance already disproportionately affect some groups more than others, including women and women of colour in particular, with little effective legal remedy.

It is perhaps no surprise that drones elicit a variety of visceral reactions, given the loss of recourse individuals may experience in social interactions that are mediated by drones.

Drone regulation

Part of the privacy problem arising from personal drones in particular is that interpersonal privacy regulation in Canada is piecemeal and varies from province to province. One of the primary ways in which individuals can assert their right to privacy against other private individuals is through tort law. However, tort protections in the Anglo-American common law legal system do not clearly protect individuals in public spaces.

While courts have not said that tort protection can never exist in public, previous cases have suggested that when in public, one may not have any privacy protection. Generally speaking, outside of Quebec, the courts have implicitly (or explicitly) relied on a notion of “privacy as secrecy,” under which only the things we keep secret or secluded deserve privacy protection. This theory, however, fails to address most of the concerns raised by drone technology, which generally engages with individuals in public or publicly visible spaces. In public, our privacy interests are guided by our relationships, our trust of others, social norms, and context – not by secrecy. These are the very protections that drone technology can easily overcome.

While personal drones are coming under greater regulatory oversight in Canada, the focus of the regulations is predominantly on safety. New rules for recreational drone users limit flights over people and property. Transport Canada (Canada’s drone regulator) has also proposed new regulations that would bring further oversight to drone flights, still exclusively focused on safety. While this is a logical focus for a transportation safety agency, it leaves a gap in the legal oversight of the social implications of this increasingly popular technology.

Solutions

Law can help guide the trajectory of innovation by, among other things, fostering public acceptance of a new technology by tackling some of its negative social impacts. Two ways in which to approach the privacy issues raised by drones are through drone-specific regulations and through technology-neutral changes to privacy law.

First, drone-specific regulations can help to address transparency and accountability. The proposed new drone rules incorporate some mechanisms for doing this, including drone registration and licensing requirements. However, it is also time for drone regulations to move beyond an exclusive focus on protecting physical safety, to address the safety and security that we expect by having some control over our personal spaces and personal information. It is crucial that we address the social impact of this technology, particularly because more sophisticated forms of AI-enabled robots are on the horizon.

While Transport Canada’s rule-making process allows for public comment on proposed safety rules, it is also time for broader public dialogue about the social issues raised by drones. This discussion can be further supported through empirical social research on the beneficial and detrimental impacts of drone technology on individuals and communities. Additionally, drone regulations could set out guidelines for privacy training for both commercial and private operators. However, this training will not sufficiently address some of the privacy concerns raised by the technology until our privacy law is clear about how individuals are protected in public space.

Courts and lawmakers can also encourage greater recognition of the nuanced ways in which individuals expect privacy in public places. The Supreme Court of Canada has already laid the groundwork for a stronger recognition of privacy interests in public spaces. For instance, the Court has held that in certain contexts where individuals share information with one person they can still expect some privacy in that information relative to other persons; individuals who anonymously carry out publicly visible activities expect some privacy from their anonymity; and individuals moving through public space do not expect to be monitored for long periods of time. The Court has also recently affirmed that privacy protections between private individuals can merit broad and generous interpretation.

Guidance from lawmakers would also help. For example, commercial entities in Canada already have legislative guidance on informational privacy, and Europeans will soon be protected by the General Data Protection Regulation (GDPR), which includes regulation of privacy interests between private citizens.

The use of drones by powerful government agencies and large companies like Amazon certainly raises a quantitatively greater risk to the privacy of private individuals. And yet, in some cases, the intrusions by our neighbours, lovers, friends and enemies, which are made increasingly possible by robotic and AI-enabled technologies, may feel qualitatively worse. It would be a mistake to overlook the social impacts of personal drone technology in favour of focusing exclusively on physical safety. To do so places our privacy in public, and in publicly visible spaces, at risk.

This article is part of the Ethical and Social Dimensions of AI special feature.

Photo: Shutterstock/By marekuliasz



Kristen Thomasen
Kristen Thomasen is an assistant professor of law, robotics and society at the University of Windsor, and is currently completing her PhD in law on the topic of drones and privacy in public at the University of Ottawa.

You can republish this Policy Options article online or in a print periodical, under a Creative Commons Attribution licence.
