Former Facebook moderator shares 'traumatizing' things they saw in the job


Published 14:28 20 Dec 2024 GMT


One YouTube video reveals the horrifying nature of social media content moderation

Harry Boulton

One former Facebook content moderator has shared the 'traumatizing' content they had to sift through while working at the social media giant, in a YouTube video many have deemed to be 'horrifying'.

The early days of social media were all about sharing your life, whether it be pictures of your dog, your lunch, or your sunny holidays, and there was perhaps no better platform for that than Facebook.

Originally set up as a means to connect college students, it eventually expanded out to everyone - allowing them to show their life to their friends and family, and perhaps rebuild friendships that were lost to time.

There was unfortunately a much darker side to the social media platform though, and it was down to content moderators to sift through 'traumatizing' content that included hate speech, pornography, and brutal violence.

One former employee of Facebook spoke to VICE anonymously in a YouTube video, documenting their experience in the role and the 'horrors' that they faced on a daily basis, and it's an experience that you really have to hear to believe.

Boiling down the experience to the basics, they describe the job as follows: "I would basically just come in, find out what my target is for the day, press the button, just go. And the first piece of content, boom, it's just there on the screen in front of you.

You take a look, you make a decision, you press a couple of buttons, it goes away, and the next one loads. And the next one. And the next one."

While that might sound like any old boring office job, it's far from a pleasant or even mundane experience when you're faced with illegal and often deplorable content, one after another.

"You are seeing dead bodies," they describe, "and murders, or people celebrating having killed somebody, dogs being barbequed alive."

Part of the trouble, outside of seeing content that is deeply traumatizing, is the nature of evaluation and moderation. The subject outlines: "I think it would be easier to deal with the images if you weren't having to think about them so deeply."

Facebook allegedly has strict, concretely defined moderation rules that distinguish between aspects like harm, injury, and violence, and that determine whether something qualifies as hate speech. It falls to the moderators to watch potentially traumatizing content closely enough to evaluate it against these rules.

Facing traumatic content is allegedly a daily occurrence in the life of a Facebook content moderator (Artur Widak/NurPhoto via Getty Images)

This particular former moderator has been diagnosed with PTSD stemming from his time at Facebook and the content he had to deal with on a daily basis, and at the time of the video had a lawsuit pending against the social media company over the treatment he faced.

Facebook has previously paid $52 million in a settlement to current and former moderators as compensation for issues faced on the job, and only recently have reports surfaced in the Guardian that 140 Kenyan Facebook moderators have been diagnosed with PTSD.

It certainly raises questions about the ethics of a job like this, and about the responsibility of social media companies to design moderation systems that spare individuals from harm.

Featured Image Credit: Vice/YouTube
