Facebook’s internal rules for abusive posts and hate speech

Moderating over a billion users is a huge undertaking: Facebook's staff have to review millions of reports every week in more than 40 languages. The Guardian obtained leaked Facebook policies used to moderate the social network, and here are our top considerations and rules from the 100-page Facebook rule book. The site's statement of rights and responsibilities is quite lengthy for users to digest, so here are some summaries.

Revenge Porn

Revenge porn is the sharing of intimate images online without consent, often after a relationship ends. According to the documents obtained by The Guardian, Facebook users reported almost 54,000 incidents of sexual extortion and revenge porn in January 2017, and the company disabled 14,130 accounts as a result. Moderators escalated 33 cases because they involved children. According to the training manual, an image counts as revenge porn if it was produced in a private setting, the person in it is nude, near-nude or sexually active, and there is a lack of consent (confirmed by a vengeful context such as a caption or comments, or by independent sources such as media coverage). People can report revenge porn, Facebook will remove the image, and it will use “photo-matching technologies” to automatically detect whether the offending image has been shared elsewhere.

Credible Violence

Despite heavy criticism after it launched its live video platform, Facebook stipulates that it will not delete videos and images depicting violence, self-harm and non-sexual child abuse, on the grounds that they draw attention to mental illness or are newsworthy; some of this footage shows physical bullying of children under seven. Graphic violence depicting animals that is marked as disturbing will be taken down, and Facebook's restrictions on sadism and the celebration of violence also apply to images and videos of animal abuse.

Abusive posts

In the last few months, Facebook has started to ban abusive posts about disabled people and those with serious health conditions.

Terrorism and hate groups

One of the most difficult things for a big social network like Facebook to navigate is acts and posts that glorify terrorism, fundamentalist extremism and organisations that promote violence. Facebook's policy is to take down those kinds of posts, but according to the rule book, posts that include commentary condemning these acts can stay up.

Violent phrases and threats

Many people post threats on a platform like Facebook. While some threats are taken seriously and removed, violent phrases such as how to “snap a b—-’s neck” are allowed because they are not regarded as credible. Threats against specific people, especially senior politicians, heads of state, witnesses and journalists, who are “protected” categories, will be taken seriously and escalated or removed.


Nudity

Facebook draws the line at sexual images and most of those featuring nudity. It will allow real-world artwork featuring nudity, but digitally created nudity is not allowed. The company previously had issues with pictures depicting moms breastfeeding their children, but has since allowed these posts after much criticism from users.

Monika Bickert, Facebook’s head of global policy management, told The Guardian that the company will work tirelessly to keep its users safe while ensuring free speech. She added that Facebook will add 3,000 people on top of the 4,500 who already work in its community operations teams.

“Over the last few weeks, we've seen people hurting themselves and others on Facebook — either live or in video posted…”

Posted by Mark Zuckerberg on Wednesday, 3 May 2017


Caxton Central
