Leaks ‘expose peculiar Facebook moderation policy’ – BBC News, May 22, 2017


Image: Facebook logo (Getty Images)

The guidelines Facebook uses to decide what users see are ‘confusing’, say staff


How Facebook censors what its users see has been revealed by internal documents, the Guardian newspaper says.

It said the manuals revealed the criteria used to judge whether posts were too violent, sexual, racist or hateful, or supported terrorism.

The Guardian said Facebook’s moderators were “overwhelmed” and had only seconds to decide if posts should stay.

The leak comes soon after British MPs said social media giants were “failing” to tackle toxic content.

Careful policing

The newspaper said it had managed to get hold of more than 100 manuals used internally at Facebook to educate moderators about what could, and could not, be posted on the site.

The social network has acknowledged that the documents seen by the newspaper were similar to those it used internally.

The manuals cover a vast array of sensitive subjects, including hate speech, revenge porn, self-harm, suicide, cannibalism and threats of violence.

Facebook moderators interviewed by the newspaper said the policies Facebook used to judge content were “inconsistent” and “peculiar”.

The decision-making process for judging whether content about sexual topics should stay or go was among the most “confusing”, they said.

The Open Rights Group, which campaigns on digital rights issues, said the report started to show how much influence Facebook could wield over its two billion users.
