
Facebook is being sued by a content moderator for causing her mental trauma

Selena Scola, a content moderator who works at Facebook, filed a class-action lawsuit against the company on Friday. Scola alleges that the company does not take precautionary measures to protect its employees from the mental trauma caused by the graphic images uploaded to the social media platform every day.

Scola, who is responsible for reviewing and removing content that violates Facebook’s terms of use, said she suffers from post-traumatic stress disorder because of the “constant and unmitigated exposure to highly toxic and extremely disturbing images at the workplace,” according to her statement.

After working at Facebook for about nine months, she began to develop symptoms of stress, anxiety and insomnia before being clinically diagnosed with PTSD.

“Every day, Facebook users post millions of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder,” according to the complaint.

Facebook, which has about 7,500 content moderators, has tried to ensure moderators’ wellbeing by putting workplace safety standards in place. These include therapy sessions, changing the way images appear on Facebook and training its employees to recognize PTSD symptoms.

However, the lawsuit alleges that Facebook ignores those safety standards, forcing moderators to work in “dangerous conditions that cause debilitating physical and psychological harm.”

“Facebook does not provide its content moderators with sufficient training or implement the safety standards it helped develop,” the complaint said. “Facebook content moderators review thousands of trauma-inducing images each day, with little training on how to handle the resulting distress.”

As the lawsuit states, Scola’s PTSD symptoms can be triggered by the touch of a mouse, by discussing the graphic images she saw on Facebook, or even by entering a cold building.

According to Bertie Thomson, Facebook’s director of corporate communications, the company is “currently reviewing this claim” and recognizes that “this work can often be difficult.”

“That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources,” Thomson added.