The Oversight Board's decision on Palestine is a step in the right direction; now Facebook needs to step up


We, the undersigned organizations, welcome the recent decision by Facebook’s Oversight Board on Facebook’s unjustified removal of news content related to the recent bout of violence in Palestine and Israel under its Community Standard on Dangerous Individuals and Organizations (DIO).

On May 10, 2021, a Facebook user in Egypt shared a news item from Al Jazeera’s verified Arabic page which contained a threat of violence by the spokesperson of the Qassam Brigades, the military wing of the Palestinian political faction Hamas. After review by two content moderators, Facebook initially removed the content for violating its Community Standard. Both the Qassam Brigades and its spokesperson are designated as dangerous by Facebook.


Restricting freedom of speech


As the Oversight Board rightly concluded in its review, Facebook’s content removal was an unjustified restriction of freedom of expression on a subject of public interest. The Board further noted that the content removal was not necessary since it did not reduce real-world harm, which is the aim of the DIO policy. 


While Facebook restored the content, the case is emblematic of Facebook’s systematically arbitrary and non-transparent over-enforcement of this Community Standard, particularly in relation to Arab and Muslim communities, often to the detriment of users’ freedom of expression and their freedom to seek, receive, and impart information. Between May 6 and 19, 2021, Instagram removed or restricted at least 250 pieces of content related to Palestine and the #SaveSheikhJarrah campaign, while Facebook removed at least 179. The reported cases are only the tip of the iceberg; the true numbers are speculated to reach into the thousands.


Arbitrary and non-transparent enforcement


Furthermore, the case details shared by the Board raise a number of serious concerns.


Firstly, Facebook restored the content only after the Board declared its intention to review the complaint lodged by the user. Facebook stated it had mistakenly removed the content, but failed to answer the Board’s request to explain why the content reviewers rated the content as a violation of the DIO policy during manual review. The decision to remove, then restore, this content demonstrates Facebook’s arbitrary and non-transparent enforcement of its content moderation policies, which is a widely shared grievance among journalists, activists, and human rights defenders in the Middle East and North Africa (MENA) region.


Secondly, according to the Board, the content was first reviewed and rated by a moderator in North Africa. It was then re-reviewed by another moderator based in Southeast Asia following the user’s objection. The second reviewer did not speak Arabic and had access only to an automated translation of the content. We recall that this is a systemic problem, as civil society organizations have repeatedly urged Facebook to invest in the necessary local and regional expertise to develop and implement context-based content moderation decisions aligned with human rights in the MENA region. A bare minimum in this case would have been to hire content moderators who have adequate Arabic language skills and can understand regional context and nuances.


Thirdly, it is deeply worrying that Facebook not only removed the piece of content but also restricted the user’s account, allowing him read-only access for three days. It also restricted his ability to broadcast live-stream content and use advertising products on the platform for 30 days. Many users reported such disproportionate responses during that period, amounting to a suppression of speech by Facebook.




We therefore support the Board’s recommendations and call on Facebook once again to:








Access Now

INSMnetwork — Iraq

Electronic Frontier Foundation


