
Facebook is facing fresh scrutiny over the way it moderates content on its platform after a newspaper investigation revealed guidelines for staff on how to handle posts involving violence, hate speech, child abuse and pornography.

The training manuals, which were published by the Guardian on Monday, reveal how the social media group’s 4,500 global moderators judge when to remove or allow offensive content.

They show how posts that threaten to kill Donald Trump, US president, are banned because heads of state are considered “vulnerable”, but violent threats against others are permitted.

For example, the manuals advise moderators that users may be allowed to post messages such as “to snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, on the grounds that the company does not consider them “credible” threats.

The manuals also show how the platform, which now has 1.94bn users, will allow livestreaming of attempts at self-harm because Facebook “doesn’t want to censor or punish people in distress who are attempting suicide”.

Facebook said safety organisations had advised the social media group that leaving posts of this nature online allowed people to seek help.

The revelations come as the group is under mounting pressure from politicians and campaigners who argue that it should take more responsibility for the content that appears on its website.

Last month, Facebook removed a video of a man in Thailand who murdered his baby daughter before killing himself. Two days later, another man in the US livestreamed his suicide by gunshot.

The guidelines only come into play after users flag potentially offensive content to Facebook. The company says it already has automated systems in place to stop the publication of certain types of material such as child sex abuse and terrorism.

Facebook confirmed the authenticity of the manuals, some of which were reproduced by the Guardian, but added that some of the material was out of date.

The company said the manuals were drawn up by a group of “highly trained people” with regular advice and input from external groups such as safety campaigners and non-governmental organisations. It added that the material was under constant review, with the group holding weekly meetings.

Monika Bickert, Facebook’s head of global policy management, admitted that there were “grey areas” in policing content on its website.

“For instance the line between satire and humour and inappropriate content is sometimes very grey,” Ms Bickert told the Guardian. “It’s very difficult to decide whether some things belong on the site or not.”

In the UK, politicians have been stepping up the pressure on the social network. Last week, the Conservative party manifesto set out plans to reform the way technology and social media groups operate.

“We want social media companies to do more to help redress the balance and will take action to make sure they do,” said Theresa May, prime minister.

A report by the select committee for culture, media and sport said earlier this month that “the biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal or dangerous content”.

Charlie Beckett, director of Polis, a media think-tank based at the London School of Economics, said the leaks reassured him that Facebook was taking the issue of content on its platform “seriously”.

But he added: “Facebook is making taste decisions. What we may find offensive in the UK is likely to be very different to what people in Saudi Arabia find offensive. I’m concerned that freedom of expression campaigners will say even if it’s offensive, why not have it out there?”

