Are online users ethically responsible for the memes that they view and circulate? What if those memes have their origins in a leaked forensic image of a dead body? What if that same meme is used to terrorize the victim’s family? “Porsche Girl,” the morbid meme based on the photographs of a teenage girl following a deadly car crash, demonstrates the vulnerability of “memeified” subjects as well as the ethical risks involved when sensitive content is disseminated online. Using the model of distributed responsibility, this case study suggests that viewers are in fact ethically responsible (at least in part) for the online content with which they engage, but that platform regulations and overall legislation also play a crucial role. When users perpetuate the circulation of a sensitive image, they also perpetuate its harmful effects.
Keywords: digital death, bodies, memes, online abuse, Nikki Catsouras
Author disclosure(s): Portions of this case study are excerpted from chapter 2 of N. de Vries, Digital Corpses: Creation, Appropriation, and Reappropriation (University of Amsterdam, 2020).
Identify the complexity of ownership over individual images once they are uploaded or circulated online
Describe how the meaning of a given image changes once it is memeified in an online community, due to a diminished sense of individual responsibility (“distributed responsibility”)
Describe the shift in agency that takes place when the image of a deceased person becomes a meme, both with regards to the person depicted in the image and for that person’s surviving family (the human corpse becomes an “unwilling avatar”)
Identify the sociopolitical factors that play a role in the “memeifiability” of a human corpse (such as gender, race, and age)
Identify legal factors that affect the degree to which images of a human corpse can be reproduced online (such as the right to be forgotten)
Over the past twenty years, it has become commonplace for Internet users to have access to representations of other people’s bodies. With the rise of social media platforms such as Instagram and Snapchat, public intimacy has gradually become normalized so that pictures of private moments, ranging from weddings and hospital visits to banalities such as breakfast, are now standardized elements of social media’s visual jargon. As with any social shift that involves a (relatively) new medium, the act of online image-sharing poses new ethical challenges—challenges that, while not altogether absent in the offline world, are intensified in online spaces due to the massive scale and speed that characterize the Internet and its communities.
The main ethical challenges on which I focus in this case study concern agency, responsibility, and ownership. Users who share pictures of themselves online, or who have their pictures shared by others, lose a degree of control over these images due to the large-scale shareability and malleability—and therefore vulnerability—of online content. These factors have contributed to Internet-specific trends such as “memes,” which are popular exactly because they can easily be shared and altered. The same factors have accelerated new forms of abuse, such as “doxing” (the act of publishing an individual’s private details online) and “revenge porn,” in which intimate images and videos created by an (often) ex-lover are shared online as a way of objectifying and ridiculing a person.
In this case study I analyze an example that combines elements from both of these forms of abuse, in that it both impinges on a subject’s privacy and ridicules the subject’s resulting vulnerability. A key detail of this case study, however, is that the image in question, though harmful, was not originally uploaded with the intention of abuse. What this case study thus demonstrates is that, no matter the intention behind an online image (be it education, activism, advertisement, or some other goal), the meaning of the image is determined through the way in which it is reproduced and shared. The case study in question concerns Nikki Catsouras, an eighteen-year-old woman from California who died in a car crash in 2006, and whose police case file was leaked online, enabling images of her dead body to become public-shock fodder.
On October 31, 2006, Nikki Catsouras, the eighteen-year-old daughter of real-estate broker Christos Catsouras and novelist Lesli Catsouras, snuck out of the house to go on a joy ride in one of her (wealthy) father’s luxury cars. Shortly after leaving the house, Catsouras crashed the car into a tollbooth and died instantly upon impact. The aftermath of the accident was so gruesome that “the coroner wouldn’t allow [the] parents to identify their daughter’s body.”1 The Catsourases were to be spared the horror of seeing their daughter’s maimed corpse—but only briefly.
Criminal investigators photographed the crash site for forensic inspection, and two California Highway Patrol employees gained access to the images. The two men proceeded to share these images with friends via email “as reminders of the dangers of reckless driving,” they later claimed in court.2 The images then quickly spread to a number of websites, including gore blogs,3 body horror forums, and even pornography sites, where they became known under the meme moniker “Porsche Girl.”4
Online trolls reached out to the Catsouras family to send them the horrible images as a cruel joke, often including galling captions such as “Hey daddy, I’m still alive.”5 This form of “RIP trolling” caused significant additional trauma for the Catsouras family.6 Out of fear of seeing the images again, the Catsourases have chosen to remain offline ever since, or so they told filmmaker Werner Herzog in his documentary Lo and Behold, Reveries of the Connected World (2016), which featured the Catsouras case in one of its segments.7 The family’s reluctance to use the Internet is not without reason: as of 2022, images of Nikki Catsouras’s dead body are still available online, and can be easily found with common search engines.
The Catsouras case is a significant (and distressing) example of the carnivalesque way in which online images can become detached from their offline context, often leading to harmful effects for real people.8 In what follows, I will outline the ethical risks associated with memeification of sensitive content in terms of agency, responsibility, and ownership. Using the concepts of the “unwilling avatar” as coined by Mary Anne Franks (2011) and “distributed responsibility” as described by Charles Ess (2009), I will further discuss how sociopolitical factors such as gender, age, and race influence the conditions under which an image is, or can be, memeified. Lastly, I will demonstrate how the memeification of dead human bodies reveals practical challenges to recent legal developments such as the “right to be forgotten” (RTBF), which has been implemented in European (but not American) jurisdictions as of 2014. As objects in an international, intangible space, online images—be they of people alive or dead—are subject to a variety of national jurisdictions, most of which have not implemented any version of the RTBF. As a result, sensitive images of individuals can be particularly difficult to remove from the Internet, since there exists no global consensus about content moderation or user protection.
The Catsouras case represents a striking example of bodily agency—and its loss—in an online space, though it is hardly unique. Consider, for example, the shocking image of Alan Kurdi, the three-year-old Syrian boy who drowned while his family attempted to flee to Europe in late 2015. On September 2 of that year, the child’s body washed ashore on a beach near Bodrum, Turkey, where it was photographed by Turkish press photographer Nilüfer Demir. Pictures of the dead child’s body were then widely circulated in Western media, thereby alerting international readers to the reality and severity of the European refugee crisis. Even more recently, there is the case of George Floyd, the African American former security guard who was killed by white police officers in Minneapolis on May 25, 2020. A day later, video footage of Floyd’s murder, in which police officer Derek Chauvin is seen pressing his knee on Floyd’s throat while ignoring Floyd’s protestations that he could not breathe, was posted on Facebook and subsequently went viral. In response to the video, large-scale protests against racism and police brutality erupted across the globe.
All three of these images, which depict individuals’ violent deaths, are shocking. Yet despite their gruesome content, each image was widely circulated by online viewers. The rapid, online circulation of these images, and others like them, is enabled by the social function that these images are assigned by disparate online communities. In the cases of Kurdi and Floyd, the corpse-images were used as instigators for political protest; in the case of Catsouras, the images were used as a morbid laughing stock. While the latter is arguably more violent—because it is more dehumanizing of the subject of the images—it is vital to note that none of the three depicted individuals were alive when their corpse-images began to circulate on the Internet. None of the three depicted subjects, therefore, had any agency in the presentation and circulation of their images. Whether a corpse-image is used as an emblem for an activist movement or memeified as a joke, each of these examples relies upon the lack of autonomy of its depicted subject. Both types of uses therefore raise significant ethical questions.
In the Catsouras case, online users assumed ownership over the image of someone’s death and felt at liberty to detach it from its original context—a violent accident—for the purposes of dark entertainment. Few people would consider it acceptable to joke about such a tragedy in a public (face-to-face) setting, let alone to taunt and harass surviving family members in person. What, then, enables this unethical approach toward the corpse-image in the online world?
American media scholar Charles Ess offers a possible answer. In his influential book Digital Media Ethics (2009), Ess argues that online behavior in the age of new media networks is subject to “distributed responsibility.” With this term, Ess refers to
the understanding that, as these networks make us more and more interwoven with and interdependent upon one another, the ethical responsibility for an act is distributed across a network of actors, not simply attached to a single individual.9
In other words, Ess argues that individual users within new media networks consider themselves to be part of an informal, virtual collective. In that capacity, users share a collective responsibility for the actions they perform within their network. Ess contrasts this new form of responsibility with more traditional interpretations of ethical responsibility. As he explains,
while we in the West recognize that multiple factors can come into play in influencing an individual’s decision … we generally hold individuals responsible for their actions, as the individual agent who both makes decisions and acts independently of others.10
Given ubiquitous digitization, this more traditional understanding of ethical responsibility as a matter of individual decision-making and agency is increasingly in tension with shared forms of responsibility.11 According to Ess, the task of the new media ethics scholar is then not so much to define a separate framework of ethical thinking exclusively for application within digital-media contexts, but rather to understand how the particular features of digital contexts influence the ways in which people exercise ethical thinking and action within them.
Work by scholars such as Ess suggests that the ethical framework that guides actions of online users does not necessarily match the ethical framework that these same people deploy in their offline encounters. In his book, Ess describes the practice of illegally downloading music. Offline, an individual might not even consider stealing an album from a record store, but online the barrier between “legal” and “illegal” appears to be much easier for many people to cross. Online, users often do not feel individually responsible, considering themselves instead to be invisible parts of a large, ungoverned, and anonymous collective that is undetectable and, therefore, “unpunishable.” Although it is a myth that it is always easier to commit crimes online than offline—internet protocol (IP) addresses and other data by which users are tracked prove as much—the sheer illusion of invisibility, made possible by the cloak of distributed responsibility, may be enough for a user to push ethical boundaries in terms of agency, responsibility, and ownership.
The Catsouras case exemplifies the vulnerability of a digital object in the hands of viewers who act online with a notion of distributed responsibility. Like the music files discussed by Ess, online images are subject to an apparently global sense of ownership—or, alternatively, a sense that digital objects have no owners at all. As the online afterlife of Nikki Catsouras—and its traumatic effects on her surviving relatives—demonstrates, the content of an image has no bearing on the degree to which it may be memeified; a highly sensitive, explicit image of a human corpse is as susceptible to appropriative practices as any other visual material. One of the risks of distributed responsibility, then, is that users tend to become less careful in their personal use of a digital object, and less inclined to recognize that an object’s easy shareability can contribute to misuse or downright violence.
At the same time, it is important to register that the perceived anonymity of online activity—as put forward by Ess’s concept of distributed responsibility—is not the only factor that plays into the ethical (or unethical) decisions of individual users. As I will discuss below, the sociopolitical identity of the memeified subject can also inform the (harmful) ways in which various images are circulated.
The wide circulation of the Catsouras case file has become a case-in-point for many debates about privacy, intimacy, and online media. In a column for the Columbia Journal of Gender and Law, American legal scholar Mary Anne Franks evokes Catsouras to address the potential risks of sharing images of someone’s personal life (or death) online. Franks sees the Catsouras case as an example of the “unwilling avatar”: a paradigm that involves the invocation of “individuals’ real bodies for the purposes of threatening, defaming, or sexualizing them without consent.”12 The unwilling avatar, Franks contends, is a by-product of what she refers to as “cyberspace idealism,” or “the view of cyberspace as a utopian realm of the mind where all can participate equally, free from social, historical, and physical restraints.”13
Considering the emotional impact of the case on Catsouras’s surviving relatives, to regard digital sites as places “free” from social, political, or historical implications is woefully naive at best, and can cause real-world harms. Regardless of the perceived invisibility made possible by distributed responsibility, the digital appropriation of human bodies does not exist separately from offline realities. The unwilling avatar, as Franks describes it, points to a form of abusive power and violence that can cause significant and harmful social effects, even though it is enacted on seemingly disembodied or ethereal digital platforms. That the avatar in question is “unwilling” signals a lack of consent, which may be due to a power imbalance. When an image of a corpse is appropriated online for an end that the depicted person or their next of kin never agreed to, what is at stake is a denial of their agency. Such denials, moreover, may be grounded in structural inequalities of gender, race, class, and age that persist with fervor online.
In this respect, it is not irrelevant that the Catsouras case concerns the dead body of a teenage girl. Although the online abuse of her digital corpse is an extreme example, it is part of a more structural asymmetry in (online) power. There is a persistent pattern of woman-targeted harassment and misogyny in online cultures, as observed by media scholars such as Emma A. Jane, Eugenia Siapera, and Jacqueline Ryan Vickery.14 The misogynist underpinnings of this type of harassment, Franks argues, show that “only certain individuals enjoy the mythic degree of liberty and freedom from physical restraint touted by cyberspace idealists, while others experience a loss of liberty and a re-entrenchment of physical restraints already unequally imposed upon them in the offline world.”15 Catsouras’s gender (female) and age group (teenager) in “real life” cannot be disentangled from the way in which images of her corpse were instrumentalized online. Regarding Catsouras’s age at the time of her death, it is also significant that, at eighteen, she was an adult teenager and not legally considered a minor. Had she been a minor, her parents might have had additional legal grounds on which to appeal for the removal of the images, such as the Children’s Online Privacy Protection Act (COPPA).16 But as a legal adult, Catsouras could not be protected by such legislation.
In her article, Franks discusses the concept of the unwilling avatar in the context of deliberate harassment and abuse. But not all mobilizations of digital bodies as unwilling avatars are malignant. Indeed, as the highway patrol agents’ stated reason for distributing the images of Catsouras’s corpse demonstrates, people who upload and share images of corpses may have a pedagogical or even activist goal in mind. The political impact of the corpse-images of Kurdi and Floyd—who, as a child and a man of color, are also marginalized subjects—reminds us that corpse-images may bring about a range of social effects. Although the effects of the images’ circulation may vary, however, in each case online images of unwilling avatars arise from moments of violence (even death), and depict people who are unable to exercise any rights regarding the uses to which such images will be put.
The variety of intentions behind the circulation of corpse-images such as those of Kurdi, Floyd, and Catsouras in turn offers a critical revision to Ess’s concept of distributed responsibility as discussed above. Individuals’ intentions and contexts of use also matter. Consider, for example, the popularity of gore blogs among veterans of the war in Iraq. Swedish media scholar Kari Andén-Papadopoulos found that for this particular group of Internet users, shocking images of corpses may serve a therapeutic purpose. In Andén-Papadopoulos’s analysis, the war veterans did not consume such images as sadistic entertainment but rather sought such material “primarily as symptoms of an affective reliving of traumatic war experiences, serving at once to authenticate and cancel out a hurtful reality.”17 The reasons why individuals seek out such images—for information, catharsis, entertainment, or revenge—impact the moral valence of their engagement with such images.
In the case of Nikki Catsouras, online users largely circulated and engaged with the corpse-image for the purpose of shocking entertainment. Unlike the images of Kurdi and Floyd, both of which were largely distributed for their political impact, Catsouras’s dead body was memeified, that is, reduced to an online joke, with a new derogative alias (“Porsche Girl”) to match. In the process, Catsouras was implicitly presented as a subject that was less “grievable” and therefore seemingly less human than others.18
Several factors combined to turn forensic photographs of Nikki Catsouras’s death into widely circulating memes. Some of those factors were medium-specific, such as many online users’ notion of distributed responsibility; others were more broadly sociopolitical, including endemic misogyny in offline as well as online settings. The Catsouras example raises additional questions regarding psychology and legality. Clearly, sharing leaked forensic photographs of corpses for the purpose of spreading an online joke is socially condemnable. What psychological factors enable adult users to do so anyway? And what are the legal consequences of this practice?
In her work on Internet spectatorship, British media scholar Sue Tait argues that online shock seekers, such as the people who viewed and circulated Catsouras’s corpse-image, tend to be well aware of the moral implications of their online behavior. These users, Tait found, generally “acknowledge that their desire to look at imagery of death and body horror breaches cultural taboos” and “express their pleasures and anxieties around looking, or attempt to legitimize their looking by drawing on the discourses of access to the ‘real’ and ‘truth-telling’ promoted by the site.”19 The users whom Tait observed for her study tried to justify their interest in shocking online images by arguing that it was a way for them to stay informed about “reality.” Tait argues, however, that this justification does not necessarily excuse the users’ actions. In particular, users’ purported desire to understand “the real” usually did not extend to the difficult work of understanding the larger context from which the shocking images had originally come.20
Likewise, most users who encounter and share Catsouras’s corpse-image appear to treat the image as a source of pleasure. Most users fail to acknowledge that their pleasure is derived from objectifying a victim rather than empathizing with people’s suffering.21 The ethical risk of memeifying a corpse-image, then, lies in the user’s substitution of shock for empathy, a reaction that objectifies the depicted body without acknowledging its “real” context (war, abuse, illness) while still being aware of this “real” context—as it is the trauma of this reality that gives the corpse-image its shock factor.
Regardless of online users’ intentions, moreover, the memeification of a human corpse has significant real-world consequences. Beyond the pain of encountering their daughter’s memeification via malicious emails, the Catsouras family also had to contend with a protracted effort to try to stop the circulation of the images. Considering the resources that are routinely deployed toward tracking down copyrighted material and removing it (at least temporarily) from the Internet these days, one might wonder: why is it so difficult to remove an image of a human corpse from the online sphere?
After learning that the images of their daughter had been uploaded by two California Highway Patrol (CHP) employees, the Catsourases brought a lawsuit against the CHP as well as the two responsible staff members individually. A five-year legal battle ensued, which the Catsourases eventually won; they were awarded a settlement of more than two million (US) dollars. At the time, journalist Rick Rojas covered the court case for the Los Angeles Times and reported that the terms of the settlement “might finally allow [the Catsourases] some closure.”22 Although the legal victory was a relief to the Catsouras family, however, it did not change the fact that images of their daughter’s dead body were—and still are—widely available in online image databases and search engines.
In a 2014 retrospective of the case for the New Yorker magazine, Jeffrey Toobin investigated the legal implications of Nikki Catsouras’s memeification. In particular, he noted the difference between American and European jurisdiction in this regard. “In Europe, the right to privacy trumps freedom of speech; the reverse is true in the United States,” he wrote. As Toobin reported, the European Court ruled in the spring of 2014 that
all individuals in the countries within its jurisdiction had the right to prohibit Google from linking to items that were “inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed and in the light of the time that has elapsed.”23
The European Court’s decision, Toobin observed, was met with critical responses “especially in the United States and Great Britain,” in that it “could undermine press freedoms and freedom of speech” (as cited by the New York Times) and would be “unworkable in practice” (as cited in a 2014 report from a committee of the British House of Lords). Working within the US legal system, Toobin noted, “the Catsouras family had no means of forcing Google to stop linking to the photographs.” Although many websites did take down the photos after the Catsourases’ lawyer Keith Bremer asked them to, such manual attempts to get the image deleted have proven no match for the Internet’s tenacious reach and memory.24
The European Union confirmed the right to be forgotten in May 2014, and a version of such a right was included within the General Data Protection Regulation (GDPR), enacted by the European Union in 2018.25 Although no comparable laws or rights are recognized within the United States at the federal level, some states have begun to pursue similar legal responses. California, for example, adopted the California Consumer Privacy Act (CCPA) in January 2020, which enshrines, among other privacy protections, “the right to delete personal information,” that is, the right to be forgotten.26 Unfortunately for the Catsouras family, however, this law was not yet in place at the time of their lawsuit; even today, the new law might not provide effective relief in cases like theirs, given the wide circulation of the “Porsche Girl” images since 2006. Legal efforts to remove the “Porsche Girl” meme thus illustrate that there is no one-size-fits-all solution to the circulation of harmful images. Without sufficient legal standards—which could span multiple jurisdictions—or more strict and enforceable user regulations for various online platforms, the stark challenge of controlling the malignant circulation of sensitive images remains.
Examples like the Nikki Catsouras case reveal several significant ethical challenges arising from memeified corpse images. Such images can circulate broadly online while the depicted subjects—having been transformed into “unwilling avatars”—remain powerless to exercise any control or agency over the uses to which such images are put. Even next-of-kin, who can exercise rights over the use and reproduction of a deceased family member’s effects in many contexts, have limited legal rights (in many jurisdictions) and few practical means with which to control the exploitation of sensitive materials online. Under the meme-guise of “Porsche Girl,” Nikki Catsouras became shock fodder on a myriad of gore blogs, and a means for trolls to torment her surviving family.
The ethical issues that surround online corpse-images thus do not stop at the question of under which circumstances, and for which reasons, it is “right” or “wrong” to circulate images of corpses. Rather, they extend to questions about responsibility, agency, and ownership. Who (or what entity) is considered to be responsible for a memeified corpse, and who owns it? And who (or what entity) should be allowed to circulate such images online, to control their circulation altogether, or to prevent certain meanings and affects from being attributed to them? In his article for the New Yorker, Jeffrey Toobin observed that in many places (including the United States), the only way to try to maintain control over an image is to register its copyright. But if such copyright claims cannot be attained—as was the case for the Catsouras family—the image will remain up for grabs.27
Recent developments suggest new ways of acquiring copyright over online images, including nonfungible tokens (better known as NFTs). These digital assets, made possible by blockchain technology, are now used to commodify images—including memes—into registered possessions. To the uninitiated, such possessions may not seem particularly valuable, yet NFTs now commonly sell for thousands (and sometimes even millions) of dollars. The World Economic Forum reports, for example, that former Twitter CEO Jack Dorsey “sold an NFT of his first tweet for the equivalent of USD 2.5 million.”28 Might new technological developments like blockchain-enabled NFTs be able to curb the harmful effects of online memes?
Having been stymied in previous efforts to halt the circulation of sensitive content online (much like the Catsouras family), at least one person has turned to NFTs. In 2015, broadcast journalist Alison Parker was fatally shot while conducting a live interview for Virginia-based television station WDBJ. The gunman, Vester Lee Flanagan II, recorded the murder on his phone and later posted the footage on Facebook. The video subsequently went viral, and clips of the most shocking moment—Parker’s death—were shared beyond Facebook as well, making the footage accessible even after the social media platform removed it. Like the Catsouras family, Parker’s father, Andy Parker, tried unsuccessfully to get the clips removed from the Internet. However, Parker had been following the rising popularity of NFTs and, in a self-admitted “act of desperation,” minted the footage of his daughter’s death as an NFT, thereby attributing the ownership of the video to himself.29
This novel strategy still faces legal hurdles, not least because buyers of NFTs do not automatically attain ownership of the images in question. Instead, as intellectual property scholar Andrew Guadamuz explains, “they are simply buying the metadata associated with the work.”30 Considering the significant amount of money invested in NFTs, it is not surprising that the surge in NFT sales has led to a rapid rise in copyright lawsuits as well. On March 9, 2022, US President Joseph Biden issued an “Executive Order on Ensuring Responsible Development of Digital Assets,” which includes legal guidelines for the transfer of ownership of digital assets.31 Whether the new legal-technological regime of NFTs will ultimately prove capable of reducing the circulation of harmful images like the “Porsche Girl” meme remains far from clear.
Beyond specifics of new commodities like NFTs, we might ask: is individual ownership the most effective or ethical way to address the harmful effects of online images? Other recent efforts include broader regulatory and legal responses. For example, the European Union has recently undertaken additional steps regarding the right to be forgotten. In addition to the existing GDPR, the European Union will implement the Digital Services Act (DSA) and the Digital Markets Act (DMA) in January 2024. These two acts, on which political agreement was reached in April 2022, enforce new rules for Big Tech companies such as Meta (Facebook, Instagram) and Google (which also owns YouTube) “to assess and manage systemic risks posed by their services, such as [the] advocacy of hatred and the spread of disinformation.”32 According to the European Commission’s policy package, the goal of the DSA and DMA is to improve online safety in the European (as well as the global) online market.33
The work of scholars such as Charles Ess and Mary Anne Franks suggests that such technological, legal, and regulatory changes will need to be accompanied by broader social changes—regarding norms of responsibility and agency for online conduct—to ultimately succeed in curbing the (mis)attribution of sensitive content online.
Do you think a digital object can be someone’s property? If so, can we distinguish between owning images of inanimate objects, and images of human bodies? Consider, for example, the recent rise of nonfungible tokens (NFTs). Does this trend affect your reasoning?
How do race, gender, or age affect the way that certain bodies are memeified online? Have you noticed any differences in how images of human bodies are circulated based on the demographic category of the subject? Do these differences depend on the particular communities in which the images are circulated as memes, or do other factors play a role as well?
Do you think there should be more content moderation on the Internet (perhaps making further use of artificial intelligence techniques)? Or should the occasional encounter with extreme images remain a part of the online experience?
The shock seekers whom Sue Tait encountered in her research defended their fascination with gory imagery on the grounds of curiosity and human interest. Other studies have shown that people—such as the war veteran participants in Andén-Papadopoulos’s study—sometimes seek out extreme content for therapeutic reasons.34 Do you think there is an ethical difference between looking at images of the dead for entertainment or for therapeutic reasons? Why or why not?
When a sensitive image is uploaded online, does the intention of the uploader matter? Can you think of scenarios in which the circulation of the Catsouras images would be ethically acceptable? What circumstances would complicate these scenarios?
Amnesty International. “European Union: Digital Services Act Agreement a ‘Watershed Moment’ for Internet Regulation.” Amnesty International, April 23, 2022. https://www.amnesty.org/en/latest/news/2022/04/european-union-digital-services-act-agreement-a-watershed-moment-for-internet-regulation/.
Andén-Papadopoulos, Kari. “Body Horror on the Internet: US Soldiers Recording the War in Iraq and Afghanistan.” Media, Culture & Society 31, no. 6 (2009): 921–38. https://journals.sagepub.com/doi/10.1177/0163443709344040.
Andrews, Lori. “Privacy of Information.” In I Know Who You Are and I Saw What You Did: Social Networks and the Death of Privacy, 121–36. New York: Simon & Schuster, 2011.
Avila, Jim, Teri Whitcraft, and Kristin Pisarcik. “Family’s Nightmare: Daughter’s Accident Photos Go Viral.” ABC News, August 8, 2008. https://abcnews.go.com/TheLaw/story?id=5276841.
Bennett, Jessica. “A Tragedy That Won’t Fade Away: When Grisly Images of Their Daughter’s Death Went Viral on the Internet, the Catsouras Family Decided to Fight Back.” Newsweek, April 24, 2009. https://www.newsweek.com/one-familys-fight-against-grisly-web-photos-77275.
Butler, Judith. Frames of War: When Is Life Grievable? London, UK: Verso, 2009.
Dunbar-Hester, Christina. “Hacking Technology, Hacking Communities: Codes of Conduct and Community Standards in Open Source,” MIT Case Studies in Social and Ethical Responsibilities of Computing, Spring 2021. https://doi.org/10.21428/2c646de5.07bc6308.
Ess, Charles. Digital Media Ethics. Cambridge, UK: Polity, 2009.
Franks, Mary Anne. “Unwilling Avatars: Idealism and Discrimination in Cyberspace.” Columbia Journal of Gender and Law 20, no. 2 (2011): 224–61. https://ssrn.com/abstract=1374533.
Guadamuz, Andrés. “What Do You Actually Own When You Buy an NFT?” World Economic Forum, February 7, 2022. https://www.weforum.org/agenda/2022/02/non-fungible-tokens-nfts-and-copyright/.
Heisler, Kevin. “Nikki Catsouras, Death on Highway, Search Engine Victim.” Search Engine Watch, July 1, 2008. https://www.searchenginewatch.com/2008/07/01/nikki-catsouras-death-on-highway-search-engine-victim/.
Jane, Emma A. Misogyny Online: A Short (and Brutish) History. Thousand Oaks, CA: SAGE, 2017.
Lima, Cristiano. “To Expunge the Death of His Daughter from the Internet, a Father Created an NFT of the Video.” Washington Post, February 22, 2022. https://www.washingtonpost.com/technology/2022/02/22/expunge-his-daughters-murder-internet-father-created-an-nft-grisly-video/.
Miller, Vincent. “A Crisis of Presence: On-line Culture and Being in the World.” Space and Polity 16, no. 3 (2012): 265–85. https://doi.org/10.1080/13562576.2012.733568.
Rizzo, Jessica. “The Future of NFTs Lies Within the Courts.” Wired, April 3, 2022. https://www.wired.com/story/nfts-cryptocurrency-law-copyright/.
Rojas, Rick. “CHP Settles over Leaked Photos of Woman Killed in Crash.” Los Angeles Times, January 31, 2012. https://www.latimes.com/archives/la-xpm-2012-jan-31-la-me-chp-photos-20120131-story.html.
Siapera, Eugenia. “Online Misogyny as Witch Hunt: Primitive Accumulation in the Age of Techno-capitalism.” In Gender Hate Online: Understanding the New Anti-Feminism, ed. Debbie Ging and Eugenia Siapera, 21–44. New York: Palgrave Macmillan, 2019.
Tait, Sue. “Pornographies of Violence? Internet Spectatorship on Body Horror.” Critical Studies in Media Communication 25, no. 1 (2008): 91–111. https://doi.org/10.1080/15295030701851148.
Toobin, Jeffrey. “The Solace of Oblivion: In Europe, the Right to Be Forgotten Trumps the Internet.” New Yorker, September 22, 2014. https://www.newyorker.com/magazine/2014/09/29/solace-oblivion.
Vickery, Jacqueline Ryan. “This Isn’t New: Gender, Publics, and the Internet.” In Mediating Misogyny: Gender, Technology, and Harassment, ed. Jacqueline Ryan Vickery and Tracy Everbach, 31–50. New York: Palgrave Macmillan, 2019.
de Zeeuw, Daniël. Between Mass and Mask: The Profane Media Logic of Anonymous Imageboard Culture. Amsterdam, NL: University of Amsterdam, 2019. https://hdl.handle.net/11245.1/c0c21e79-4842-40ef-9690-4d578cca414b.