Behind the Screen: Content Moderation in the Shadows of Social Media


By Sarah T. Roberts, New Haven, CT: Yale University Press, 2019, 280 pages, $30.00 hardcover (ISBN: 978-0-300-23588-3)


“Things aren’t great, Internet. Actually, scratch that: they’re awful.”1 This quote from an open letter to the Internet published by Wired succinctly summarizes the current state of online affairs: broken expectations and unfulfilled promises once pinned on the development of Internet-connected information and communication technologies. That the Internet has brought many wonderful services is indisputable. What causes incessant alarm is its dark side, namely, “the pornographic, the obscene, the violent, the illegal, the abusive, and the hateful” that circulates on social media platforms.2


Launched to correct “a collective myopia” (61) regarding technology, Sarah T. Roberts’s study inquires into the practices of what she calls “commercial content moderation” (CCM). Her exploration adopts a strict definition of CCM as “a job, a function, and an industrial practice that exists only in this context and could only ever exist in it” (15). By separating online content moderation from the practices of censorship, the author insulates her study from the political power lines that it inevitably crosses and proceeds to ask who screens content for tech companies, what their labor looks and feels like, and what it entails.


From the author’s own experience with digital technology and her observation that the digital hides--or, worse, erases--the human traces of its making and maintenance, the narrative moves on to present three groups of content moderators: in-house moderators employed by MegaTech (a pseudonym) in Northern California; “boutique” moderators (41) hired to manage clients’ online presence (represented by a Canadian expat in Mexico and a former contractor for a digital news company); and workers in business process outsourcing firms in Manila, the Philippines. Freelancers on “microlabor websites” (46) and viewers who voluntarily flag offensive and inappropriate content are mentioned only briefly, since the main focus of the study is on “professional people who are paid to screen content” on social media platforms (1). The interviews with content moderators unearth several themes common across the locations where moderation is performed: precarious work, pervasive stress and burnout, job insecurity, and disparity in status and treatment--all triangulating on the conclusion that CCM is invisible by design and that content moderators, invisible by association, are easily disposable as workers; at MegaTech, for instance, their one-year contracts allow only a single renewal (82).


Extended quotations from the interviews are the strongest element of the book, vividly detailing the moderation experiences that the platforms purposefully keep hidden from the public eye. Together, they corroborate the claim that, in its “rote, repetitive, and endless work tasks,” content moderation frequently resembles labor on a factory’s assembly line (92). Still, of the many dramatic images and takeaways that this book offers, two stand out. One is that the romantic notion of the Internet as a free-speech zone is largely blind to the reality of the Internet as “a site of control, surveillance, intervention and circulation of information as a commodity” (6). The other is that without content moderation, the Internet “would be largely unusable for most people,” as it would likely turn into a “cesspool” of aggression and negativity (153); consequently, companies will continue to need human screeners and “data janitors.”3


With insights from six interviewees, the book is an expertly executed act of raising awareness about the work done behind the screen. However, because the analysis stays light and accessible to a general audience, it misses the chance to build an argument for changing the conditions under which much digital information labor is performed, or for a broader public discussion of the policies, practices, and patterns of the gig economy. That such a discussion is urgently needed is evident in the growing number of studies revealing the legacy of inequality and discrimination in technologically restructured service industries such as care, cleaning, and delivery.4


While the author admits that the book would be woefully inadequate if taken for the whole story, the study’s apolitical positioning works against it in a context where tech companies are accumulating enormous economic and cultural power. Probing neither the dynamics of that power nor the consequences of its accumulation and deployment, the study falls short of asking hard questions and seeking exhaustive answers, hurdles notwithstanding. Some of the most pressing questions begin with “why”: Why do companies persist in keeping content moderation secret? Why aren’t medical experts more vocal about the impact of disturbing content on workers? Why interviews rather than ethnographic fieldwork inside CCM operations? And, after all, why this particular sample of content moderators?


Staying beneath the political radar, the book skirts the major debates in which practices of content screening are implicated, touching only briefly on free speech, control and surveillance, and the global span of the freelance economy and contingent labor. This choice is not an oversight but a strategy (explained in the opening pages of the book) that relegates censorship to the hands of political institutions and, inadvertently, strips the political implications from similar activities performed by and on behalf of corporate interests. Granted, Professor Roberts may not have wanted to write a political pamphlet or a manual for digital activists; her contribution to our understanding of how social media maintain such a firm footing in our lives is less politically charged. With Behind the Screen, Sarah T. Roberts adds to recent studies of invisible labor on social media--for instance, Tarleton Gillespie’s Custodians of the Internet (2018) and Mary Gray and Siddharth Suri’s Ghost Work (2019)5--voices from the trenches that have rarely been heard in debates on the digital age and that speak to a myriad of legal, psychological, relational, and human rights concerns. These voices are urgently needed in policy discussions of governance, accountability, and greater transparency on social media platforms. Having completed a tour behind the screens of social media, readers are invited to investigate further the role of the human in what is popularly perceived as automated, algorithmic, and impersonal, and to consider the information and information management skills that they will need to acquire to survive and succeed in the knowledge economy.


Natalia Kovalyova, University of Texas at Austin


Notes


1. “Dear Internet: It’s Time to Fix This Mess You Made,” Wired, August 24, 2016, accessed July 18, 2019, https://www.wired.com/2016/08/open-letter-to-the-internet.
2. Tarleton Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (New Haven, CT: Yale University Press, 2018).
3. Lilly Irani, “Justice for ‘Data Janitors,’” Public Books, January 15, 2015, accessed August 6, 2019, https://www.publicbooks.org/justice-for-data-janitors/.
4. Julia Ticona, Alexandra Mateescu, and Alex Rosenblat, “Beyond Disruption: How Tech Shapes Labor across Domestic Work and Ridehailing,” Data & Society, June 26, 2018, accessed August 7, 2019, https://datasociety.net/output/beyond-disruption.
5. Gillespie, Custodians of the Internet; Mary L. Gray and Siddharth Suri, Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass (Boston: Eamon Dolan Books, 2019).