

Enough With the “Sunbathing Teenager” Gambit


Drone privacy is about much more than protecting girls in bikinis.
By MARGOT E. KAMINSKI
MAY 17, 2016 • 9:00 AM

Photo illustration by Sofya Levina. Images by Wavebreakmedia Ltd/Thinkstock and agnormark/Thinkstock.

Last July, a Kentucky father spotted a drone hovering over his backyard, where his two
daughters were purportedly sunbathing. He took out his shotgun and shot the drone down.
Later he ruminated that “[w]e don’t know if they’re pedophiles looking for kids, we don’t
know if they’re thieves. We don’t know if it’s ISIS.”

Drones embody surveillance. They provide a visual and sometimes physical target for
privacy fears. Drones have catalyzed state privacy lawmaking and prompted numerous
conversations about coming privacy concerns. Intriguingly, however, the driving drone
privacy narrative hasn’t been about location tracking, or pervasive government surveillance.
It’s been about sunbathing young women.

The Kentucky father protecting his daughters is in good company. A New Jersey man also
shot down a drone to protect his family’s privacy. A woman in Virginia Beach, Virginia, told a
drone operator that hovering over sunbathers on a private beach was “creepy.” A
Connecticut woman assaulted a drone hobbyist for taking pictures of people on the public
beach with his “helicopter plane.” A California man threw his T-shirt over a drone on a public
beach, explaining that “[w]e had like a peeping Tom.” A Florida resident alleged that drones
had been spying on her sunbathing teenage neighbor: “They’re recording children in bathing
suits or they’re recording the teenager across the street, who lays out in her front yard in
her bikini.” (By the way, that narrative about the Kentucky father protecting his daughters
has been challenged since his arrest. He won in state court, but a federal suit filed by the drone owner is pending.)

The sunbather narrative has made its way to the United Kingdom, where a Bristol woman
quickly covered up after seeing a drone overhead. And it has made its way into academic
and policy work. Drone expert Gregory McNeal talks about the sunbathing woman in his
Brookings Institution report “Drones and Aerial Surveillance,” writing, “While the police are
overhead photographing 123 Main Street, they look down and see a woman sunbathing in
the adjacent property at 125 Main Street. …” Arizona State University law professor Troy
Rule, in proposing a localized zoning system for drone use of airspace, discusses how
“[i]ndividuals flying camera-fitted drones above residential neighborhoods have disturbed
sunbathers in their private yards.” (ASU is a partner with Slate and New America in Future
Tense.)

With all we know about the complexities of information privacy, why is the female
sunbather the story that keeps capturing attention?

Maybe it’s because the sunbather narrative is easy; it’s concrete. A woman or girl who
otherwise wouldn’t expose herself in a bikini suddenly has a much wider audience than
intended. Maybe it’s because the sunbather narrative is actually happening at a greater
frequency than other privacy issues; people are perverts, and prurience is a great motivator.
Or maybe the sunbather narrative is just the latest spin on the old, old tale of Lady Godiva:
Peeping Tom takes a look at the nude woman and is consequently struck blind or dead.

The story of Lady Godiva is a myth filled with fascinating gender dynamics. Most scholars
believe that the ride didn’t actually happen. According to legend, Lady Godiva pleaded with
her husband, Count Leofric, to lower taxes. He told her he would do so the day she rode nude
through town at noon. When in protest she did just that, the people of the town of Coventry
stayed indoors out of respect, to preserve her modesty. But Peeping Tom ignored the social
contract, gazed at her out of lust, and was punished. According to professor Daniel
Donoghue, who has tracked the development of the Godiva myth, the story evolved over
time to include Peeping Tom. Donoghue explains, “Tom would become the scapegoat and
bear the symbolic guilt for people’s desire to look at this naked woman.” Peeping Tom
became a point of resolution for conflicting impulses over freedom, control, and lust.

The sunbather disrupted by drones is a Lady Godiva story, of sorts, without the tax policy. A
young woman expresses liberation by wearing a bikini in her backyard or on the beach.
Everyone generally follows social norms and refrains from staring for too long, or taking
photos or video. But the hovering drone breaks that agreement and must be punished, just
like Tom. Often it’s dad who does the punishing, but sometimes it’s just a Good Samaritan.
Law isn’t very helpful. Existing state Peeping Tom laws mostly do not cover these incidents,
because many require trespassing or peeping in through the windows of an actual house.

The problem with letting the sunbather narrative dominate drone privacy coverage is that it
provides a woefully incomplete account of the kinds of privacy concerns that drones raise. If
we legislate to protect the modesty of sunbathers, we risk letting significant issues fall by
the wayside. That’s leaving aside questions of whether privacy and modesty are equivalent
(they’re not), and whether the father-daughter dynamic that results in a shot-down drone
is a healthy one (take a guess).

The sunbather story fails us because it ignores issues of information privacy. Drones will
collect enormous amounts of information, and absent a federal omnibus data privacy law
(which the United States does not have), there is next to nothing to govern that data’s
processing or use. This includes combining data from one drone with data from other
devices, to create a near-complete portrait of somebody’s physical interactions. Retailers
and insurance companies, just as examples, could certainly be motivated to create these
kinds of data portraits of people. (As of publication, insurance companies had received 276
special permissions from the Federal Aviation Administration to use drones.) We are
already profiled online by data brokers; companies have every incentive to try to extend
that profiling to physical space. And they don’t want to have to ask for permission to get it.

Our current federal privacy regime, depending on enforcement actions by the Federal Trade
Commission, is premised on protecting consumers from broken promises and unfair actions
by the companies with which they transact. The problem with drones and other new
technologies is that a person who gets tracked by a drone usually won’t be the drone’s
owner. He or she thus won’t have the consumer relationship with the drone company that
triggers FTC protection. The FTC is ill-equipped to govern this, in the same way it is
ill-equipped to govern the “Internet of Other Peoples’ Things.”

The sunbather narrative fails us in other respects as well. For instance, it doesn’t address
facial recognition technology. Our inescapable biometric identifiers mean we can lose the
practical obscurity in which we usually operate in physical spaces. People out there in public
might not recognize or identify us—but drones will. This allows those in possession of drone
video to much more readily profile particular individuals. The sunbather story also doesn’t
address that many times, drones will be gathering information using superhuman senses,
like thermal imaging, that we aren’t accustomed to acknowledging and can’t practically
shield ourselves from. And the sunbather narrative fails to capture cybersecurity problems.
If you think drones are disruptive now, just wait until they’re hacked.

However, the sunbather narrative isn’t completely wrong. It resonates precisely because
drones, like an array of other new technologies, sit at the intersection of spatial and
information privacy. The sunbather story illustrates a spatial privacy problem: Once, fathers
thought their daughters were protected by the six-foot privacy fence. (The daughters
themselves may or may not have cared.) Now, drones make that fence irrelevant. Physical
architecture once constrained people from seeing into others’ backyards or upstairs
windows; now drones, like thermal-imaging technology, can discern information that was
otherwise obtained only at great cost. The question is whether or when the law should
intervene to impose legal costs where the physical and financial constraints have fallen.
Law can enable us to continue to manage our privacy using features of the real world that
we’ve grown up with—and grown dependent on.

As social actors, we regularly use cues from our physical and social environments to decide
how much we want to disclose in a particular setting. Technologies like drones disrupt
environments. They take down walls. They distance the human operator from the
enforcement of social norms in a particular setting (like the beach). They disrupt our ability
to calculate how much we’ve disclosed by potentially tracking behavior over time, at far
lower cost than a helicopter.

The Supreme Court is starting to understand these things, although it took some time. In
2001, the court found police use of a thermal-imaging device violated the Fourth
Amendment because it “might disclose, for example, at what hour each night the lady of the
house takes her daily sauna and bath.” (There’s Godiva again.) The court hinged its decision
about new information technology and spatial privacy on whether the new technology had
been widely and publicly adopted. This reasoning raised a host of concerns about a
downward ratchet in privacy law: Would we lose Fourth Amendment protection just by
widely adopting new technology?

By 2012, however, five justices understood that using GPS technology to persistently track
somebody’s location over nearly a month, even in public places, and even though GPS
technology is certainly in widespread public use, could violate a reasonable expectation of
privacy and thus the Fourth Amendment. These justices recognized that we calibrate
behavior based on the assumption that it’s just too hard or too expensive for someone to
follow us that consistently, over that amount of time. And persistent tracking over time and
space can disclose sensitive information, such as religious beliefs, or sexual or political
preferences.

The Fourth Amendment applies to law enforcement, not to private actors. But current
developments provide better ways of thinking about data-gathering technologies such as
drones. Like GPS, drones make it cheaper and easier for creepy neighbors to follow
someone over an extended period of time. Like thermal imaging (and sometimes using
thermal imaging), they make the physical barriers that we rely on less effective. Drones
pose a hybrid of information and spatial privacy problems. That hybrid of issues is
increasingly the problem of this age.

Currently, there’s no national regulatory regime in place to handle drone privacy. (There are
a number of state privacy laws, but most states have no privacy laws that would cover the
sunbather, or persistent tracking by drone, or drone data use.) The Federal Aviation
Administration has said it wants to stay out of privacy issues; a court just this month
refused to weigh in to compel the FAA to address privacy before its final rules have come
down. Last year, the president instructed the National Telecommunications and
Information Administration to host the development of industry best practices for drone
use. That process is ongoing, but many public interest groups have chosen not to participate
in it after the failure of the NTIA’s best practices for facial recognition.

In April, the Senate approved a bill reauthorizing the Federal Aviation Administration. The
bill would, among many other things, suggest (but not require) that drone operators “for
compensation or for hire, or in the furtherance of a business enterprise” create a privacy
policy enforceable by the FTC. The bill provides no substantive requirements for those
policies, allowing companies to set their own low standards—or not set them at all. Absent
consumer relationships with those tracked by drones, it’s unclear what would motivate
companies to put good privacy policies in place. The bill also instructs the NTIA to submit a
report to Congress on industry best practices that may serve as the basis of federal
legislation. Again, given the weaknesses of the NTIA process (in which I’ve been involved),
this is not a good source of substantive privacy recommendations. The Senate bill
preempts drone-specific state laws, which would foreclose local experimentation with drone
policy, including privacy. This is ill-advised. It remains to be seen what will happen in the
House, but its controversial Aviation Innovation, Reform, and Reauthorization Act similarly
leans heavily on the NTIA.

Drones have many, many positive uses, from safety inspections to environmental research
to monitoring police behavior. But when we’re discussing the privacy problems they raise,
it’s about time we got away from the bikinis.

This article is part of the drones installment of Futurography, a series in which Future Tense
introduces readers to the technologies that will define tomorrow. Each month from January
through June 2016, we’ll choose a new technology and break it down. Read more from
Futurography on drones:

“Do Drones Have to Be Creepy?”
“Your Cheat-Sheet Guide to the Key Players and Debates for Drones”
“The Rise of Nonviolent Drones”
“The Six Biggest Misconceptions About Drones”
“What Can Consumer Drones Actually See?”

Future Tense is a collaboration among Arizona State University, New America, and Slate. To
get the latest from Futurography in your inbox, sign up for the weekly Future Tense
newsletter.
