Former Facebook moderator shares 'traumatizing' things they saw in the job


Published 14:28 20 Dec 2024 GMT

One YouTube video reveals the horrifying nature of social media content moderation

Harry Boulton

One former Facebook content moderator has shared the 'traumatizing' content they had to sift through while working at the social media giant, in a YouTube video many have deemed to be 'horrifying'.

The early days of social media were all about sharing your life, whether it be pictures of your dog, your lunch, or your sunny holidays, and there was perhaps no better platform for that than Facebook.

Originally set up as a means to connect college students, it eventually expanded to everyone - allowing people to share their lives with friends and family, and perhaps rebuild friendships that had been lost to time.

There was unfortunately a much darker side to the social media platform though, and it was down to content moderators to sift through 'traumatizing' content that included hate speech, pornography, and brutal violence.

One former employee of Facebook spoke to VICE anonymously in a YouTube video, documenting their experience in the role and the 'horrors' that they faced on a daily basis, and it's an experience that you really have to hear to believe.

Boiling down the experience to the basics, they describe the job as follows: "I would basically just come in, find out what my target is for the day, press the button, just go. And the first piece of content, boom, it's just there on the screen in front of you.

You take a look, you make a decision, you press a couple of buttons, it goes away, and the next one loads. And the next one. And the next one."

While that might sound like any boring office job, it is far from a pleasant or even mundane experience when you're faced with illegal and often deplorable content, one piece after another.

"You are seeing dead bodies," they describe, "and murders, or people celebrating having killed somebody, dogs being barbequed alive."

Part of the trouble, outside of seeing content that is deeply traumatizing, is the nature of evaluation and moderation. The subject outlines: "I think it would be easier to deal with the images if you weren't having to think about them so deeply."

Facebook allegedly maintains strict, concretely defined moderation rules that distinguish between categories such as harm, injury, and violence, and that determine whether content qualifies as hate speech. It is up to the moderators to watch potentially traumatizing content closely and deliberately in order to evaluate it against these rules.

Facing traumatic content is allegedly a daily occurrence in the life of a Facebook content moderator (Artur Widak/NurPhoto via Getty Images)

This particular former moderator has been diagnosed with PTSD stemming from their time at Facebook and the content they had to deal with on a daily basis, and at the time of the video had a lawsuit pending against the social media company over the treatment they faced.

Facebook has previously paid $52 million in settlement fees to current and former moderators as compensation for issues faced on the job, and reports recently surfaced in the Guardian that 140 Kenyan Facebook moderators have been diagnosed with PTSD.

It certainly raises ethical questions about a job like this, and about the responsibility of social media companies to design moderation systems that minimize harm to the people performing the work.

Featured Image Credit: Vice/YouTube
