Automating the Pain of Others: Susan Sontag and Facebook

Facebook screenshot. Courtesy of the author.

Susan Sontag passed away the same year Facebook came online: 2004. Her death closely followed the 2003 publication of her final book, Regarding the Pain of Others, in which she discussed how images of suffering and death function in contemporary culture—forecasting the interchangeability of calm and violence that Facebook and the internet at large would come to facilitate. She declared that “there isn’t going to be an ecology of images,” implying that no one would curate what is visible for public consumption, and that a messy glut would instead continue to proliferate. Facebook has accommodated a socially significant portion of this glut. Users upload hundreds of millions of photographs every day, all displayed in more or less the same way, regardless of content. Facebook’s blue block accents on fields of white, its “humanist” sans-serifs, its in-line emojis—these design features culminate in an atmosphere that is clean but not quite sleek, more cheerful than dispassionate. This cheer is appropriate for vacation photos, status jokes, and friendship selfies—perhaps less so for graphic images of violence and death.

Within a few years of Pain’s publication, a host of now-dynastic online platforms appeared alongside Facebook, creating even more space for the ever-growing surfeit of violent images Sontag predicted. These platforms included YouTube (2005) and Twitter (2006), as well as less mainstream sites like LiveLeak (2006), a website that specializes in user-shared videos of violence and violent death. In her recent book, Dying in Full Detail, Jennifer Malkowski discusses the relationship between websites that specialize in user-uploaded death videos and the pornography often advertised in their sidebars, “suggesting a profitable crossover between visitors to death porn sites and visitors to sexually pornographic sites.” This crossover powerfully bolsters Sontag’s observation that “most depictions of tormented, mutilated bodies do arouse a prurient interest.”

These sites’ prurient voyeurism is an extreme illustration of Sontag’s larger observation that “photographs objectify: they turn an event or person into something that can be possessed.” While sites like LiveLeak are comparatively niche, Facebook, Twitter, and YouTube also host user-uploaded videos of violence. On Facebook, a video that users report might subsequently be removed, or else covered with a protective blur screen; unreported videos remain online and fully visible. Some violent videos are never reported, and even removed videos can easily be uploaded again by other users. Many (likely most) viewers are upset and disturbed by these images, but some inevitably experience them as the “death porn” that Malkowski describes. When violence is executed by the state without consequence, its casual representation can become another iteration—even a powerful assertion—of the state’s impunity.

The disproportionate use of fatal police force against Black people in the United States has been documented again and again through digital video over the last eight years. These videos may have some power to provide enduring witness, and to prevent further violence. Facebook, however, displays these videos in a way that is literally small, generic, and, through repetition, disposable. In many cases, videos on Facebook begin playing when you roll over them, whether or not you click; this means, for example, that in 2014 I saw the first few seconds of Eric Garner’s murder over and over again, sometimes as small as my thumb and always without my consent. In a 2015 update, Facebook’s algorithm began tracking not just clicks and likes but linger patterns. I might linger an extra second on a video post because I find it disturbing. Even though I choose not to click for the same reason, my reflexive lingering means I’m likely to be shown that post again. Another person who scrolls past the same video quickly might never see it again, or might never see it in the first place, if it didn’t match their algorithmically determined interests. Today, anyone can respond to any video with hearts and smileys, regardless of its content.
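To make that feedback loop concrete, here is a minimal sketch, in Python, of how dwell-time weighting might work. This is not Facebook’s actual code; every name and number in it (FeedItem, dwell_seconds, rank_score, the scoring coefficients) is hypothetical, invented only to illustrate how passive lingering can count toward what a viewer is shown again.

```python
# Hypothetical sketch only: not Facebook's code. All names and weights
# (FeedItem, dwell_seconds, rank_score) are invented for illustration.
from dataclasses import dataclass

@dataclass
class FeedItem:
    post_id: str
    clicks: int = 0
    likes: int = 0
    dwell_seconds: float = 0.0  # how long the viewer lingered while scrolling

def rank_score(item: FeedItem) -> float:
    """Blend explicit signals (clicks, likes) with passive dwell time.

    A viewer who pauses on a post without clicking still raises its score,
    so a disturbing video they lingered on can resurface in their feed.
    """
    explicit = 2.0 * item.clicks + 1.0 * item.likes
    passive = 0.5 * item.dwell_seconds  # lingering counts even without a click
    return explicit + passive

feed = [
    FeedItem("vacation-photo", clicks=3, likes=12, dwell_seconds=1.0),
    FeedItem("violent-video", dwell_seconds=8.0),  # paused on, never clicked
]
for item in sorted(feed, key=rank_score, reverse=True):
    print(item.post_id, rank_score(item))
```

In a scheme like this, choosing not to click is not the same as choosing not to be counted: the pause itself is a signal.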

Screenshot of a Facebook search. Courtesy of the author.

In my Facebook feed, recorded death is made handheld, and handheld death is disproportionately Black. The violent videos that have appeared in my personal Facebook feed depict Black people almost exclusively; this reflects both a history of disproportionate violence and Facebook’s algorithmic curation. Philando Castile and Alton Sterling were killed just one day apart last summer. Both deaths were recorded on video; the aftermath of Castile’s shooting was live-streamed on Facebook as his girlfriend filmed it. This video has been uploaded many times, and only in the past couple of weeks have I noticed a warning screen on some of the uploads. Roxane Gay wrote in The New York Times that watching the video of Alton Sterling’s killing made her “complicit in the spectacle of black death.” Artist Dread Scott’s response to Philando Castile’s murder succinctly illustrates the legacy of this spectacle: he recreated a protest flag from the 1920s, adding two words to it: “A Man Was Lynched by Police Yesterday.” In a sense, Facebook makes this lynching pocket-sized. Philando Castile’s killer was acquitted last month; acquittal and non-indictment are the norm in these kinds of killings, whether or not there is video evidence. Casual images of Black death result from and reiterate a system that selectively permits murder without consequence.

The formal and curatorial “choices” made by algorithms on Facebook are powerful; they affect the content consumption of a billion-plus users every day. The exact nature of these algorithms changes frequently, so their impact is a moving target. The excess of content posted to these sites is unlikely to slow, and some of it will, inevitably, be violent. The pragmatic choice to moderate only reported posts precludes a potentially pernicious level of censorship (one more pervasive than the algorithm’s slippery auto-pandering). Facebook’s Community Standards aim to protect users from upsetting content while also permitting images that “raise awareness” or serve another non-sadistic purpose. Because videos are moderated only upon report, they might remain fully visible for a long time. According to a Facebook spokesperson, unmoderated copies continue to pop up because other users can use special tools to download and re-upload them.
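A minimal sketch of that report-only flow, again with entirely hypothetical names and logic rather than Facebook’s actual systems, shows why copies keep surfacing: each re-upload is a fresh object that starts out visible and unreported.

```python
# Hypothetical sketch of report-triggered moderation; all names are invented,
# not Facebook's actual API or Community Standards logic.
from enum import Enum, auto

class Visibility(Enum):
    VISIBLE = auto()   # default: nothing is reviewed until someone reports it
    BLURRED = auto()   # graphic but "raises awareness": covered by a warning screen
    REMOVED = auto()   # violates standards outright

def moderate(is_graphic: bool, raises_awareness: bool) -> Visibility:
    """Runs only after a report; unreported posts are never reviewed."""
    if not is_graphic:
        return Visibility.VISIBLE
    return Visibility.BLURRED if raises_awareness else Visibility.REMOVED

class Post:
    def __init__(self, post_id: str, is_graphic: bool, raises_awareness: bool):
        self.post_id = post_id
        self.is_graphic = is_graphic
        self.raises_awareness = raises_awareness
        self.visibility = Visibility.VISIBLE  # fully visible until reported

    def report(self) -> None:
        self.visibility = moderate(self.is_graphic, self.raises_awareness)

# A reported video gets a warning screen, but a re-upload is a brand-new Post
# that starts VISIBLE again and must be reported from scratch.
original = Post("video-1", is_graphic=True, raises_awareness=True)
original.report()
reupload = Post("video-1-copy", is_graphic=True, raises_awareness=True)
print(original.visibility, reupload.visibility)  # BLURRED, VISIBLE
```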

This favoring of immediacy over filtration concerns what is displayed online; it has little to do with the way it is displayed. Sontag notes that “one can feel obliged to look at photographs that record great cruelties and crimes,” adding that “one should feel obliged to think about what it means to look at them” (my emphasis). Today, I would add that we should think about what it means to display images of crimes so casually, and how the means of display changes the answer to Sontag’s question of our capacity to “assimilate what they show.”
