How trauma hits content moderators

Ever wonder who handles all the reported content on Facebook? It would be nice to think artificial intelligence takes care of it.

But unfortunately, even with how advanced technology has become, automation alone can't do the job.

It is still handled the old-school way. Big corporations like Facebook hire third-party companies from all over the world to help them filter the content posted on their platforms.

Yes, actual people are paid to watch, view, and read all sorts of reported sensitive content on Facebook. And for how much? About $15 an hour.

They are called content moderators, a.k.a. “The Cleaners.” They spend six hours a day in front of their desktops scanning reported posts and sweeping them off Facebook. They have to review whether each post meets Facebook’s current policies, and I say current because those policies change constantly.

And in those six hours, they review an average of 400 posts. That’s 400 disturbing posts! Imagine how these people feel watching animals being tortured or babies being abused. And it isn’t just that: it’s all sorts of violence, harassment, and hate, whether graphic or written.

This takes a huge toll on their mental health. They can’t sleep at night, they get snappy, their emotions become unstable, and some even have suicidal thoughts.

Many are so traumatized that they can still remember the very first video they had to watch. That is how damaging this toxic work is.

Now, do you think they’re being paid fairly to suffer like this? They need professional help, and it is disappointing that this issue hasn’t received the attention it deserves.
