No nudity on Facebook Live, thanks to artificial intelligence

A turning point for the social network: smart systems will evaluate whether content streamed on the platform complies with the community's policies

A new algorithm to keep obscene images off Facebook. The social network is experimenting with an artificial intelligence system that automatically flags offensive content.

As Joaquin Candela, the company's director of applied machine learning, puts it, "it's an algorithm that detects nudity, violence or anything that doesn't comply with our policies." Facebook is thus relying on smart machines. Historically, the social network has depended on user reports to detect such inappropriate content; posts that violate community standards are then deleted by company employees. Artificial intelligence now marks a breakthrough. The new smart system will be tested on Facebook Live, the service that lets users broadcast live video on the platform.
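Facebook has not published the algorithm's internals, but the flow the article describes, in which a model scores live content and routes likely policy violations to human reviewers, can be sketched in Python. Everything here (function names, labels, the threshold) is a hypothetical illustration, not Facebook's actual system:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Flag:
    """A frame queued for human review."""
    frame_index: int
    label: str
    score: float

def review_stream(frame_scores: List[Dict[str, float]],
                  threshold: float = 0.8) -> List[Flag]:
    """Flag frames whose classifier score for any policy label
    (e.g. 'nudity', 'violence') exceeds the threshold.
    In the workflow the article describes, flagged frames would
    then be passed to human moderators for a final decision."""
    flags = []
    for i, scores in enumerate(frame_scores):
        for label, score in scores.items():
            if score >= threshold:
                flags.append(Flag(i, label, score))
    return flags

# Simulated per-frame scores from a hypothetical image classifier
stream = [
    {"nudity": 0.02, "violence": 0.01},
    {"nudity": 0.91, "violence": 0.05},  # this frame would be flagged
]
flags = review_stream(stream)
```

The key design point the article implies is that the AI only filters and prioritizes: the final removal decision still rests with human reviewers.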

A controversial policy

Facebook has often come under fire for its policies on offensive content. A few months ago, for example, its decision to censor the iconic "Napalm Girl" image of the Vietnam War caused such widespread outrage that the social network had to backtrack. The picture had been posted by the Norwegian writer Tom Egeland, whose account Facebook had also blocked after the image was published.

Returning to the intelligent algorithm: Facebook would also like to extend the system to the detection of fake news, especially after the American elections, in which the social network has been accused of favoring the victory of the Republican candidate. Facebook has, in fact, long used automatic mechanisms to analyze the reports it receives every week.