An investigation by the British newspaper The Guardian has uncovered Facebook's moderation guidelines: here's what you can share and what you can't.
How Facebook manages its (almost) 2 billion users has always been a mystery. As of today, however, the system is clearer. An investigation by The Guardian has brought to light the internal rules the social network uses, creating what we might call "the guide" to what you can post and what is prohibited.
For years the world has debated the ethical role of Facebook, one of the most famous and widely used social networks. Fake news, racist posts, cyberbullying and pornography are the main threats to Facebook's future. Until now, it was not clear to users how the social network handled unsuitable or dangerous content. Thanks to the British newspaper, we now know that there are more than 100 internal manuals, spreadsheets and flowcharts that guide moderators in resolving cases of violence, racism, self-harm, pornography, terrorism and bullying.
Revenge porn
The investigation has also brought to light guidelines for handling posts about cannibalism or match-fixing. According to internal sources interviewed by The Guardian, the main problem is time: moderators have a maximum of 10 seconds to make a decision that can be of international importance and have real consequences. The social network has grown so fast that it is almost impossible to keep it under control. One of the main problems, according to sources inside the company, is cases related to sex. The fastest-growing phenomenon in this area is revenge porn, in which an ex posts intimate photos of a former partner on the social network as revenge. This is a phenomenon that must be dealt with immediately and can have tragic consequences, as has already happened on several occasions.
What you can never post
The most important censorship role Facebook must play, however, is in the field of fake news. Fake news can foment hatred and violence and, above all, misinform users. One type of content that will always be deleted is a post calling for US President Trump to be shot. Threats against holders of political office are therefore not tolerated in any form. Content that absolutely cannot be posted also includes videos of violent deaths. All photos showing non-sexual physical abuse or the bullying of children will also be deleted. Photos of animal abuse may be shared only for awareness campaigns. Works of art depicting nudity or explicit sex scenes are allowed, while amateur photos and videos with adult content are almost always censored.
The most sensitive topics
Videos about abortion are allowed, as long as they do not show nudity. Facebook also does not block live streams in which people intentionally injure themselves or cause themselves pain. The company stated: "We don't want to censor people in trouble, we want to help them." Anyone with more than 100,000 followers is considered a "public figure", which affects the protections the platform grants them: a public figure receives less protection than a user with fewer followers.
Facebook and violence
Some violent posts are still allowed, however. Facebook has stated that people often use the platform as an outlet for the frustrations of daily life, and that it would not be right to censor every insult or every message containing violence. It is not clear, however, why you cannot write "Shoot Trump" but you can write "Kick people with red hair" or "Beat up fat kids". The same reasoning applies to violence in videos. Within Facebook, someone has come up with a preventive measure aimed especially at minors: flagging certain content, whether videos or posts, with a warning message along the lines of "Warning: content not suitable for minors". It could be a first step.