
Drones: An Interview with Ingrid Burrington

Amazon PrimeAir, 2013. Advertising. © Amazon.


I’ve been following Ingrid Burrington’s work for a while. Burrington is an artist, designer, and writer but, most importantly, a deep thinker. Her work often examines hidden power structures and is heavily researched and beautifully mapped. Recently, Burrington has been investigating the Internet and surveillance technologies, especially in New York City. She is a resident at Eyebeam, a center for arts and technology, and recently co-wrote a must-read piece on drones with Joanne McNeil.

Ingrid Burrington. Installation of the Center for Missed Connections, 2009–2011. Burrington maps loneliness in cities, using the “Missed Connections” section of Craigslist as her primary data source. Courtesy the artist.

Given Burrington’s investigative approach, I thought she would be ideal for this series of speculative interviews I’m conducting for this issue, asking artists to consider the future of technologies found in their work. Relying on a deep understanding of the present is the best way to make predictions about our future. I first proposed that we talk about the future of surveillance, to which she responded via email, “The future of surveillance is kind of a weird one insofar as it seems like it’s already here, but maybe that’s the point.” (Unless otherwise noted, all of Burrington’s quotes are from our phone interview on May 1, 2014.) Of course these technologies will continue to change, but what Burrington rightly points to is that over the past few years the presumed future of surveillance has become the present reality.

Considering this further, I suggested we narrow the conversation to focus on drones. While wiretapping, surveillance cameras, data collection, and general surveillance technologies have become dramatically more common in our post-9/11 world, they aren’t new. Drones, by contrast, represent a distinct shift in surveillance and war. While Burrington’s artwork doesn’t deal with drones, her research does, and she was excited to discuss their future, writing to me that drones are important as “a signifier of this moment in which we are letting mathematics take over decision making at increasingly larger magnitudes.”


Ben Valentine: In many ways, it feels like my imagined dystopian future of the surveillance state and its accompanying drones has already come to be. I want to believe that the work people like Glenn Greenwald, the Center for Investigative Reporting, James Bridle, and so many others have been doing might slow or possibly reverse the exponential adoption of drones taking place all over the world for combat and surveillance, but I can’t say the evidence is on my side. Briefly paint for me what you see as two possible futures for drones: the best possible scenario and the future you fear the most.

Ingrid Burrington: The best possible scenarios for how drone technologies will be used in the future center on the question of who owns them. It’s mainly proprietary technology, mostly in the hands of the military, if we are talking about the large, heavy-duty, and weaponized drones, while the smaller hobbyist and consumer-grade drones are still beyond the price point of the average consumer. The concept of using drones for good, while very well intentioned, still feels very much like it’s coming from this neoliberal, nonprofit-industrial-complex mentality, which weirds me out. So the potential for drones having positive social impacts has to do with drones becoming an available tool for those who could use them to establish equitable power relations. The problem is that drones are tools that by default operate with asymmetrical power relations: the operator can see lots of things that you can’t see. So improving the scenario becomes about allowing people to see what drones see.

It took us hundreds of years to consider cartography a potentially emancipatory, bottom-up organizing tool. I hope that the timeline for doing the same with drones is not as long. It requires making the tools, the results, and the means to analyze the results more immediately available, accessible, and legible to people.

As for the dystopian, worst-case scenario, there are two kinds. One is the continued concentration of the technology’s ownership and the inability of non-operators to see what is being monitored. If you want access to the resources—the vision that drones can give you—you have to have a lot of capital.

The other kind of worst-case scenario is when we can’t identify that drones are present. The hardware is getting smaller, and there is a ton of money being poured into making it smaller still. Drones can also fly increasingly long distances. So in this scenario the world continues to appear just as it is, yet people live with the understanding that it isn’t—which is, to a lesser extent, what already happens in places that are subject to drone strikes, like Pakistan and Yemen.

BV: I’m glad you note that this dystopian scenario is already occurring in certain places because I think we in the West forget that. I was shocked to learn that there was a surveillance plane circling over Compton in 2012, basically offering the Los Angeles Sheriff’s Department a real-time Google Earth view of the neighborhood. That this was done in secret with the help of a private firm, Persistent Surveillance Systems, has a lot of implications for the future of surveillance here in the US, a future that has already become commonplace in areas like Yemen, Pakistan, and Afghanistan.

IB: Right. A big part of people’s uneasiness with drones has to do with the lack of human presence and agency. If there are police helicopters flying over Compton all of the time, people don’t like that, but they know there is a pilot inside the helicopter and the organization behind it. When there is no announcement of surveillance or no person there at all, it is really unseemly.

James Bridle. Documentation of Drone Shadow 002 installation, 2012. Courtesy the artist. © James Bridle

BV: In Exposing the Invisible, James Bridle talks about the many layers of drones’ invisibility. They can fly at a height at which they can monitor people while remaining invisible to the naked eye. Unlike most troop movements, drones are invisible in the media because they don’t have embedded reporters. Drones can operate very far from mainstream media outlets, potentially leading to very biased information about their presence. The majority of information we have about drone strikes or their use comes from the companies or governments using them, even though deep investigative work often proves those reports inaccurate. Finally, Bridle mentions that Western soldiers never die in drone strikes, which makes the strikes politically invisible as well. If you were given a million dollars, how would you work to change the conversation around drones?

IB: One thing that could change the conversation is to find ways to use drones that interrogate their method of existing. This goes back to my discomfort about the drones-for-good universe, in which we continue to ignore or avoid even acknowledging the implications of the technology. So what would it mean to use a drone in a way that acknowledged that power and ambivalence, that simultaneous horror and wonder of the technology? The best example I’ve seen lately is the documentary Watermark, by Jennifer Baichwal and Edward Burtynsky, which is about water infrastructure. Using a drone, they got shots of massive things like dams being built in China and on the Colorado River, visuals that show in insane detail the incredibly clever things humans can build and the ways in which our cleverness can be so deeply destructive. People tend to build tools, including drones, to make hard decisions and hard tasks easier. Baichwal and Burtynsky basically used a drone to demonstrate that a lot of decisions just can’t be easy.

Something I heard at the drone panel at the “Theorizing the Web” conference that really stuck with me was the frustration that we don’t have the right words or metaphors for what the technology is or does. But I would advocate that we quit looking for them. People love coming up with tech metaphors: the Internet is a highway, a city, a gathering place. Ultimately all of those get flipped; people want cities to be more like the Internet and not the other way around. The Internet has become a metaphor for itself. Saying that one thing is like another thing that has existed is so hard in the case of drones because, except maybe in science fiction, we haven’t dealt with anything quite like this before.

The language is messy right now; people both for and against drones are trying to rebrand them. Perhaps being deliberate about the language used to explain what the technology is and does would be useful in shifting the debate. The Campaign to Stop Killer Robots is a great name for communicating this idea because it frames the issue in a very explicit way. It gets right to the point.

BV: I am really interested in how people are responding to or protesting the presence of drones on a personal level. I am noticing a proliferation of anti-surveillance tactics, especially for the face: Anonymous and its Guy Fawkes masks, the Black Bloc tactic, Adam Harvey’s CV Dazzle. I was thinking about what would happen if Harvey’s Privacy Gift Shop became commonplace, accessible fashion rather than artwork. I’m interested to hear what you think the future of this trend will look like.

IB: Kate Crawford has been working on some great stuff about this, referencing the opposite of dazzle tactics: normcore. It is basically what Occupy Wall Street referred to as the “civilians” tactic: blending in so you can do something you’re not supposed to do, like storm a bank lobby. But being able to blend in is a very privileged position; who is “everyone else” when you say you look like everyone else?

So yeah, normcore’s also this counterpoint to dazzle tactics and Anon masks. Adam Harvey is a very smart artist, and he’s walking this uneasy line—on the one hand, conveying the point about how privacy is a luxury in the age of mass identification and, on the other hand, making that problem very apparent by perpetuating it.

It’s really interesting, though, to see these seemingly polar-opposite aesthetics become popular at about the same time, for more or less the same reasons. I don’t think future trends will look just like one or the other, but I think the trends that stick are going to be the ones that are accessible to lots of people. I don’t imagine an anti-surveillance aesthetic coming from a mass-consumer source, mainly because it’s entirely in the interest of those companies to have their consumers remain legible. Trends toward illegibility have to be decentralized by design.
