Dante’s Digital Inferno: Content Moderators’ Class Action Against Facebook Feeds from Hell

By: Anthony E. Gambino

Have you ever wondered why your Facebook feed is free from certain offensive content? Notably absent are unwanted torrents of child pornography, cartel beheadings, and other depraved posts depicting the worst the internet has to offer. Well, you can thank people like Selena Scola and thousands of current and former Facebook content moderators.[1] They have devoted countless hours to screening content and shielding Facebook users from digital atrocities. For some, however, the work has taken a psychological and physical toll. Because “[Facebook] require[d] its content moderators to work under conditions known to cause and exacerbate psychological trauma . . . [and, therefore] violate[d] California law,” Plaintiffs filed suit in Scola v. Facebook, Inc.[2] The parties settled for $52 million.[3]

In 2018, plaintiff Selena Scola developed PTSD after screening graphic and violent imagery while moderating Facebook newsfeeds. Scola and other moderators accused Facebook of failing to provide a safe workplace for the thousands of contracted third-party content moderators. Primarily, they sought damages and an order requiring Facebook to establish a fund to maintain a testing and treatment program for content moderators. The program would give moderators access to ongoing medical testing and monitoring, as well as any necessary medical and psychiatric treatment.[4]

As employees of third-party vendors rather than of Facebook itself, Scola and the others did not have access to the perks that came with a Facebook title.[5] They faced other challenges as well: an annual salary as low as $28,800; pressure to be near-perfect in moderating content; Facebook’s constantly changing content policies; and horrific imagery that could rattle the nerves of even the most seasoned internet veteran.[6]

March 14, 2022, marked the final day to file a claim in the Facebook content moderators’ landmark class action lawsuit.[7] In addition to monetary relief, Facebook agreed to implement workplace reforms, such as: “(1) requiring all U.S. Facebook vendors provide on-site coaching with licensed clinicians and standardized resiliency measures and (2) implementing tooling enhancements designed to mitigate the effects of exposure to graphic and objectionable material.”[8] However, Facebook did not have to admit to any misconduct. Prior to the settlement, Facebook’s lawyers had argued that a worker’s risk of developing mental health issues could be the result of extrinsic factors, including genetics, psychological history, and personal circumstances.[9]

It is apparent that Facebook feels it has an obligation to protect its users from objectionable material. In fact, Mark Zuckerberg has stated that he believes Facebook bears more responsibility than a typical media company because of the number of people it reaches. In his view, Facebook is not simply a technology company like a phone carrier, which need not monitor the content of the communications it transmits. Reflecting that concern, Facebook has hired 30,000 new employees dedicated solely to safety and security, nearly half of them content moderators.[10]

Facebook, while not blameless, should be given credit for being receptive to the reforms. After all, it too is, to a certain degree, at the mercy of this mercurial digital landscape. To keep pace, Facebook uses artificial intelligence (AI) in addition to human moderators to screen content. But AI is not yet advanced enough to detect something as uniquely human as “nuances and wordplay in memes.”[11] As Steven Williams, one of the Plaintiffs’ lawyers, aptly said: “Content managers are human beings. They are not disposable.”[12]

Scola v. Facebook, Inc. can be seen as a watershed moment for content moderators across social media platforms. This landmark case is one of the first (and certainly not the last) brought by this burgeoning digital labor force. Technology companies have birthed a workforce at the mercy of exploitative tactics, with little legal precedent or regulation to protect those on the front line safeguarding what is, besides user data, these companies’ most important asset: their content.


[1] Scola v. Facebook Settlement, Content Moderator Settlement, https://contentmoderatorsettlement.com/.

[2] Second Amended Complaint, Scola v. Facebook, Inc., No. 18-civ-05135 (Super. Ct. Cal. filed June 30, 2020) (available at https://contentmoderatorsettlement.com/Content/Documents/Complaint.pdf).

[3] Scola v. Facebook Settlement, supra note 1.

[4] Complaint, Scola v. Facebook, Inc., No. 18-civ-05135 (Super. Ct. Cal. filed Sept. 21, 2018) (available at https://regmedia.co.uk/2018/09/24/scola_v_facebook.pdf).

[5] Casey Newton, The Trauma Floor: The Secret Lives of Facebook Moderators in America, The Verge (Feb. 25, 2019), https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona.

[6] Casey Newton, Facebook will pay $52 million in settlement with moderators who developed PTSD on the job, The Verge (May 12, 2020), https://www.theverge.com/2020/5/12/21255870/facebook-content-moderator-settlement-scola-ptsd-mental-health.

[7] Scola v. Facebook Settlement, supra note 1.

[8] Id.

[9] Daniel Wiessner, Judge OKs $85 mln settlement of Facebook moderators’ PTSD claims, Reuters (July 23, 2021), https://www.reuters.com/legal/transactional/judge-oks-85-mln-settlement-facebook-moderators-ptsd-claims-2021-07-23/.

[10] Terry Gross, For Facebook Content Moderators, Traumatizing Material Is A Job Hazard, NPR (July 1, 2019), https://www.npr.org/2019/07/01/737498507/for-facebook-content-moderators-traumatizing-material-is-a-job-hazard.

[11] Facebook to pay $52m to content moderators over PTSD, BBC (May 13, 2020), https://www.bbc.com/news/technology-52642633.

[12] Steven Williams, Facebook Content Moderators’ Safe Workplace Litigation, Joseph Saveri Law Firm (July 2021), https://www.saverilawfirm.com/our-cases/facebook-content-moderators-safe-workplace-litigation/.
