The social network has pledged to work harder to identify and remove disturbing content – but doing so can take a psychological toll
Ever wanted to work for Facebook? Mark Zuckerberg has just announced 3,000 new jobs. The catch? You’ll have to review objectionable content on the platform, which has recently hosted live-streamed footage of murder, suicide and rape.
In his announcement, Zuckerberg revealed that the company already has 4,500 people around the world working in its “community operations team” and that the new hires will help improve the review process, which has come under fire both for inappropriately censoring content and for failing to remove extreme content quickly enough. Just last week, footage of a Thai man killing his 11-month-old daughter, streamed on Facebook Live, remained on the platform for a whole day.
Instead of scrutinizing content before it is uploaded, Facebook relies on users of the social network to report inappropriate material. Moderators then review reported posts – hundreds every shift – and remove them if they fall foul of Facebook’s community standards, which prohibit nudity (including female, but not male, nipples), hate speech and glorified violence.