Former Facebook moderator shares 'traumatizing' things they saw in the job


Published 14:28 20 Dec 2024 GMT


One YouTube video reveals the horrifying nature of social media content moderation

Harry Boulton

Featured Image Credit: Vice/YouTube

One former Facebook content moderator has shared the 'traumatizing' content they had to sift through while working at the social media giant, in a YouTube video many have deemed to be 'horrifying'.

The early days of social media were all about sharing your life, whether it be pictures of your dog, your lunch, or your sunny holidays, and there was perhaps no better platform for that than Facebook.

Originally set up as a means to connect college students, it eventually expanded to everyone, allowing users to share their lives with friends and family, and perhaps rebuild friendships that were lost to time.

There was unfortunately a much darker side to the social media platform though, and it was down to content moderators to sift through 'traumatizing' content that included hate speech, pornography, and brutal violence.

One former employee of Facebook spoke to VICE anonymously in a YouTube video, documenting their experience in the role and the 'horrors' that they faced on a daily basis, and it's an experience that you really have to hear to believe.

Boiling down the experience to the basics, they describe the job as follows: "I would basically just come in, find out what my target is for the day, press the button, just go. And the first piece of content, boom, it's just there on the screen in front of you.

"You take a look, you make a decision, you press a couple of buttons, it goes away, and the next one loads. And the next one. And the next one."

While that might sound like any old boring office job, it's far from a pleasant or even mundane experience when you're faced with illegal and often deplorable content, one piece after another.

"You are seeing dead bodies," they describe, "and murders, or people celebrating having killed somebody, dogs being barbequed alive."

Part of the trouble, outside of seeing content that is deeply traumatizing, is the nature of evaluation and moderation. The subject outlines: "I think it would be easier to deal with the images if you weren't having to think about them so deeply."

Facebook allegedly has strict, concretely defined moderation rules that draw fine distinctions between categories like harm, injury, and violence, and that determine whether something qualifies as hate speech. It is up to moderators to watch potentially traumatizing content closely enough to evaluate it against these rules.

Facing traumatic content is allegedly a daily occurrence in the life of a Facebook content moderator (Artur Widak/NurPhoto via Getty Images)

This particular former moderator has been diagnosed with PTSD stemming from their time at Facebook and the content they had to deal with on a daily basis, and at the time of the video had a lawsuit pending against the social media company over the treatment they faced.

Facebook has previously paid $52 million in a settlement to current and former moderators as compensation for harm suffered on the job, and reports recently surfaced in The Guardian that 140 Kenyan Facebook moderators have been diagnosed with PTSD.

It certainly calls into question the ethics of a job like this, and the responsibility of social media companies to design moderation systems that minimize harm to the people doing the work.
