
Facebook’s secret guidance on graphic content leaked

Facebook’s moderation policies on sex, violence and hate speech have been exposed in a newspaper report this week, after criticism over the social network’s handling of extreme content.

Documents leaked to the Guardian newspaper reveal how Facebook censors people’s content.

The UK newspaper said the manuals revealed the criteria used to judge if posts were too violent, sexual, racist, hateful or supported terrorism.

The Guardian said Facebook’s moderators were “overwhelmed” and had only seconds to decide if posts should stay.

Pictures and footage of violent deaths, abortions, the non-sexual abuse of children, and self-harm do not have to be deleted, according to the leaked guidance for moderators.

More than 100 internal training manuals, spreadsheets and flowcharts revealing Facebook’s secret internal policies on upsetting material were obtained by The Guardian.

The documents show how internal policy at the social media giant compares with its testimony before a number of government committees, as it comes under increasing scrutiny over the content hosted on its platforms.

Facebook has been criticised for hosting terrorism-related material, as well as a number of violent broadcasts.

One source told The Guardian that “Facebook cannot keep control of its content. It has grown too big, too quickly.”

Mark Zuckerberg recently announced that over the next year Facebook would be adding 3,000 staff to its 4,500-strong content moderation team, following criticism over a series of graphic posts on the site.

The secret documents say recordings of violent deaths must be marked as disturbing, but do not need to be deleted if they “can help create awareness of issues such as mental illness”, The Guardian reports.

According to the paper, Facebook advises that “remarks such as ‘Someone shoot Trump’ should be deleted, because as a head of state he is in a protected category.”

Similarly, Facebook will allow the live-streaming of attempts at self-harm because it “doesn’t want to censor or punish people in distress”.

The documents also advise that images of animal abuse can be shared, “with only extremely upsetting imagery to be marked as ‘disturbing’”, The Guardian reports.

Monika Bickert, the head of global policy management at Facebook, said: “We work hard to make Facebook as safe as possible while enabling free speech.

“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”

