Largely unknown to the public, moderators play a key role in the Facebook ecosystem. Here's who they are and what they actually do
Much of what you see on your Facebook wall goes through the strict scrutiny of artificial intelligence algorithms that, based on your preferences and past choices, show you only the posts likely to interest you in some way. This, however, is only part of the story.
The selection of content is also entrusted to teams of dozens of real people who check hundreds of posts of all kinds every day and decide their fate. These are the Facebook moderators: professionals who, while not directly employed by Mark Zuckerberg's social network, play a fundamental role in defining the community's "quality standards." But who exactly are Facebook's moderators, and what do they do? Let's find out together.
Who are the Facebook moderators and what do they do
When we talk about Facebook moderators, we refer to a large group of contractors - estimated at between 7,000 and 10,000, although there is no official data on their number - whose task is to ensure that everything runs smoothly on the platform. Have you ever reported a group, a post or a Facebook Page because, in your opinion, it didn't respect the social network's terms of use? Well, these reports (along with much content intercepted by Facebook's algorithms) are analyzed by moderators who, following an operating manual, evaluate them and act accordingly.
This, however, could be called the "easy part" of their job. On a daily basis, Facebook moderators also have to view far more disturbing images and posts: live-streamed rapes, murders and suicide attempts, incitement to violence, child pornography and much more. And it is this workload, abnormal in many respects, that leads Facebook moderators to develop serious psychological problems.