Systematic Efforts to Silence Palestinian Content On Social Media

2020-06-07

This position paper was prepared by 7amleh - the Arab Center for Social Media Development as part of 7amleh's advocacy work, which focuses on defending Palestinian digital rights. It is part of a series of position papers examining the effects of the policies and practices of governments and companies on Palestinian digital rights.

Prepared by: Mona Shtaya - 7amleh Advisory Council Member


Introduction

Beginning in 2015, following the Palestinian popular uprising, large swaths of Palestinian content started to disappear from social networks. These discriminatory takedowns, and the increase in such takedowns that has continued annually since this period, are a result of changes in the policies and practices of Israel and social media companies, primarily Facebook, which is the most popular social media platform amongst Palestinians.

It is well documented that for decades Israel has been working to shrink the space for freedom of expression, in particular for activists, human rights defenders and human rights organizations who work to hold Israel accountable for its human rights violations and who campaign for Palestinian rights. In recent years these efforts have moved online and have included systematic pressure on social media companies, in particular Facebook, to remove Palestinian content. The definition of hate speech has also been expanding: since the 1970s, Israel has worked to shrink the space for freedom of expression by expanding the definition of anti-Semitism to include any criticism of Israel.1 In recent years, with the growth of online advocacy efforts, Israel has also been working to equate criticism of Israel, anti-Zionism and campaigning for Palestinian rights with hate speech and incitement to violence and terrorism. In addition, Israel has exploited the general growth of the counterterrorism field following 9/11, and more recently the global rush to “eradicate” so-called “terrorist and violent extremist content” (TVEC) following the Christchurch massacre, to limit freedom of expression about Palestinian rights issues online. These efforts have led to the removal of hundreds of thousands2 -- and perhaps even millions -- of pieces of content documenting protests, uprisings and human rights violations against Palestinians under the guise of ‘hate speech’.3

For years the exact numbers of takedowns have been unclear. However, as a result of a recent request for information issued in accordance with Israel’s access to information law, the Israeli government stated that from 2017 to 2018 its direct requests to social media companies led to the deletion of 27,000 posts from Facebook, Twitter and Google.4 Adalah - The Legal Center for Arab Minority Rights in Israel reported that the Israeli Ministry of Justice made tens of thousands of requests to content intermediaries like Facebook and Google to censor the Palestinian narrative.5 While these efforts are not exclusively focused on Palestinian content, the Israeli Minister of Justice, Ayelet Shaked, stated that “Facebook, Google, and YouTube are complying with up to 95% of Israel’s requests to delete content that the Israeli government says incites Palestinian violence.” This shows a significant focus on Palestinian content and on efforts to label Palestinian political speech as incitement to violence.6

This position paper seeks to outline how systematic campaigns by the Israeli government, Israeli government-supported NGOs and online trolls are leading to violations of Palestinians’ right to freedom of expression and right to assembly on Facebook. The paper begins with a discussion of the obligations of Israel and Facebook to respect Palestinian human rights and digital rights, then shows how Israel has developed tactics and strategies that seek to further shrink Palestinian space, and describes one of the key methods used to silence Palestinians online and further Israel’s unlawful political aims. The paper also shows not only how Facebook is being used by Israel, but how the company’s own policies and practices are leading to further violations of Palestinian digital rights. Lastly, the paper concludes with recommendations about how companies, third party states and local and international civil society can contribute to upholding human rights online and protecting Palestinian digital rights.

Obligations to Protect Palestinian Digital Rights

States and companies play an increasingly important role in the global promotion and local implementation of human rights standards. States have obligations to respect, protect and fulfill the human rights of all without discrimination, which includes ensuring that companies operating in their territories comply with the UN Guiding Principles on Business and Human Rights.7 This is particularly challenging as local laws vary in their compliance with international norms, and especially in cases where the legal system is designed and instrumentalized to further political goals that violate human rights online and offline. In the case of Palestinians, there is a highly complex contradiction in the extent to which they can access and exercise their rights under the multiple local frameworks, which include Israel, as the occupying power, the Palestinian Authority, and the de facto government of Hamas. As a United States-based company working in Israel and the occupied Palestinian territories, Facebook must commit to ensuring that its content moderation policies and practices do not further the violation of Palestinian rights, and must put in place safeguards to ensure those rights are respected on its platforms. The right to freedom of expression and the right to freedom of assembly need to be considered not only through International Human Rights Law, the parameters of International Humanitarian Law and the Law of Occupation, but also through an understanding of the unlawful political aims of Israel, which are being pursued both online and offline. It must also be considered that Palestinians’ rights are especially vulnerable because of the state of emergency that Israel has maintained since 1948, and the more recent states of emergency enacted by Israel and the Palestinian Authority in response to the coronavirus. This has created an enabling environment for further rights violations both online and offline.

These legal frameworks are subject to varying interpretations, many of which are inconsistent with international human rights law. For instance, laws against “extremism” that leave key terms like “terrorist” undefined give authorities discretion to pressure companies to remove content on questionable grounds. Similarly, unclear definitions of “hate speech” often result in the criminalization of legitimate human rights speech. As companies are often also under pressure to comply with state laws that criminalize content said to be, for instance, blasphemous, critical of the government, defamatory of public officials or false, they must pay a higher level of attention to the context, to the rights of the people affected and to their own responsibilities as companies.

Development of Israeli Tactics and Strategies

In recent years the Israeli government has developed tactics and strategies aimed at shrinking the space for Palestinian freedom of expression and assembly online, and more generally the space for expression about Palestinian rights and Israel’s human rights violations. This includes the development of governmental and non-governmental apparatuses and the recruitment of online armies of trolls to carry out both overt and covert online operations aimed at taking down Palestinian content, delegitimizing Palestinian advocacy efforts and spreading misinformation. This systematic, international effort to silence Palestinians is a method of Israel’s unlawful occupation, designed to further political aims that are contrary to international law. How it shapes the development of discriminatory content moderation policies and governance must also be well understood by social media platforms and third-party states alike.

Since 2015, Israel has been developing new ministries and special units that report Palestinian content to social media companies. In 2015, the Israeli Ministry of Justice established a special ‘Cyber Unit’ to support Israel’s National Cyber Crime Unit (Lahav 433) and the Israeli Law, Information and Technology Authority at the Ministry of Justice.8 The Cyber Unit is also responsible for making requests -- based on alleged violations of domestic laws, as well as the companies’ own guidelines, terms and standards -- to tech companies like Facebook and Google. Even though Adalah and the Association for Civil Rights in Israel (ACRI) argued that the Cyber Unit cannot submit “voluntary” requests to bypass constitutional and administrative norms, including transparency and due process,9 these processes continue, and as a result of Israeli efforts, large amounts of Palestinian content have been taken down and severe limitations on freedom of expression and opinion have been imposed by Facebook and other social media companies.

Israel and a number of non-governmental and governmental organizations are also encouraging citizens and supporters to join coordinated efforts to report Palestinian content and have it removed from social networks. Several of these organizations -- dubbed “GONGOs” (government-operated NGOs) -- are working to conflate criticism of Israel and anti-Zionism with anti-Semitism and hate speech10 and have designed strategies to manipulate social media algorithms with the support of online trolls.11 Their work includes efforts to take down content that is critical of Israel or supportive of Palestinian human rights, as well as efforts to promote content intended to smear Palestinians, including disinformation, incitement and hate speech directed towards Palestinians. One of the first trolling platforms, ACT.IL, was started by the Interdisciplinary Center (IDC) and the Israeli American Council (an American NGO backed by the settler-supporting mega-donor Sheldon Adelson). Tested during Israel’s 2012 and 2014 attacks on Gaza, which resulted in thousands of civilian deaths, ACT.IL was designed to coordinate groups of online trolls to report and share content that includes disinformation and hate speech directed towards Palestinians. Today the online platform includes 15,000 active members and has offices in three countries.12 In addition, the Ministry of Strategic Affairs developed a similar application, 4IL.org.il, in June 2017. These trolls are instructed to rally against Palestinian content and report it for takedown, to spread misleading content and at times outright misinformation, and to smear human rights defenders, organizations and activists.

In one case, the Israeli Ministry of Strategic Affairs and 4IL.org tried to argue that several leaders of Palestinian human rights organizations are terrorists, or affiliated with terrorist organizations, through a campaign entitled #TerroristsInSuits. While the campaign is intended to silence human rights advocacy work, it can also potentially incite violence against activists, human rights defenders and organizations advocating for Palestinian rights, as well as create a ‘chilling effect’ (the spread of self-censorship) and shrink the space for freedom of expression about human rights. Research into these claims by journalists, human rights defenders, diplomats and academics has revealed that the report and campaign rely on a misleading title and images.13 In October 2019, the Report of the Special Rapporteur on the situation of human rights in the Palestinian territories occupied since 1967 noted with particular concern “[…] the harmful practices employed by the political leadership and state authorities in Israel to silence human rights defenders’ criticism of certain government policies. […]”14. In addition, trolls coordinated by Israel and its GONGOs were instructed to like, share and comment on the campaign’s content in order to increase its visibility on social media networks. Through this increased engagement, the comments and initial posts became more visible on Facebook. Many of these comments are racist against Palestinians, but have yet to be taken down. In this way the government, these organizations and their trolls are able to create a more oppressive environment online for Palestinians and contribute to shrinking the space for Palestinian freedom of expression, assembly and association on social networks.

Israeli politicians have also been working to expand their control over the space for Palestinian freedom of expression online by drafting social media-specific laws and legislation. In 2017, the Israeli government introduced “the Facebook Bill” to pressure social media companies to comply with Israel’s broad definition of “incitement”, which contradicts international human rights law (the principles of necessity, proportionality and legitimacy). The bill would have granted Israeli administrative courts the power to block content that amounts to online “incitement” at the request of the government. It would also have allowed courts to issue orders to delete content “if it harmed the human safety, public, economic, state or vital infrastructure safety”, as well as paved the way for legal action against the social media companies that disseminate such content, including Facebook, Twitter, YouTube and Google.15 The legal consequences included hefty fines and could even have led to banning these platforms from working in the country. According to Simon Milner, Facebook’s Vice President of Policy for APAC, Facebook was concerned that the bill, if passed, would allow courts to decide on matters presented before them by the government on an ex-parte basis, without a requirement to hear the other party. Milner stated that Facebook suggested the bill be reconsidered. He said, however, that Facebook was already working closely with Israel’s cyber-crime unit to take down the majority of what the unit refers to it.

Facebook Content Moderation Policies and Practices

As the most popular platform among Palestinians, Facebook makes content moderation decisions that can dramatically impact Palestinians’ capacity to exercise their rights online, in particular their freedom of expression and opinion, freedom of assembly and association, and access to information. Taking down Palestinian content has a negative impact not only on individuals, but on the Palestinian community as a whole, which relies heavily on Facebook and other social media channels to advocate for the protection of its human rights. In the context of unlawful occupation, Palestinians are particularly vulnerable to more powerful states and companies whose policies and practices can lead to silencing that leaves their community further marginalized or vulnerable.

Policy Development and Implementation

Content moderation policies are designed and implemented in varying ways across different social media companies. In the case of Facebook, the company has developed Community Standards, “the rules that determine what content stays up and what comes down on Facebook”,16 which are enforced using a combination of human review and artificial intelligence (AI). These standards have drawn criticism from human rights and digital rights experts, who have noted that they are often not in line with international human rights norms and standards. In response, in April 2018, Facebook published the internal guidelines17 used to enforce these standards. These guidelines were designed to reduce subjectivity and ensure that decisions made by reviewers were as consistent as possible.

Facebook’s policy process includes engagement with stakeholders to “understand the different perspectives that exist on free expression and safety, as well as the impacts of the company policies on different communities globally”,18 yet for many years Palestinian civil society was not part of these consultations. In 2018, in response to the increasing violations being committed using Facebook’s platforms, the company expanded its staff to include more employees who can address human rights and policy-related issues, including staff dedicated to engaging with the Palestinian government and civil society. While this increased engagement is a positive development, there are still many human rights and digital rights violations being committed, both on the company’s platforms and as a result of Facebook’s policies and practices, that have yet to be resolved or redressed. Lastly, even though Facebook publishes its policy meeting notes publicly,19 there is little information about the stakeholders that are consulted. It is known, however, that both Israel and several GONGOs are consulted as part of the policy development process.

Use of Artificial Intelligence to Implement Content Moderation Policies

At social media companies, content moderation is carried out through a combination of artificial intelligence (AI) and human moderation (which includes responses to user reports). While information about company policies is available online in varying detail, information about the use of AI -- and more specifically the words and images used to train AI systems -- is not public. Facebook implements its artificial intelligence “with a strategy called ‘remove, reduce and inform’”,20 whereby the company removes, demotes or adds warnings to content that violates its terms of service. The company uses natural language processing (NLP) to build a common digital representation across languages, which helps to expose content that violates its policies, while Panoptic FPN helps the AI systems understand the context and backgrounds of images. According to Facebook’s policy, content which includes “nudity, violence, child pornography and terrorism” is automatically removed.21 However, the definitions of nudity,22 violence23 and terrorism in particular are controversial. Other content, for example content that is potentially misinformation, is reduced, and lastly, some content is labeled in order to inform the viewer of its violent or sensitive nature. Similar technologies and strategies are being used by other companies like YouTube and Twitter to implement their product policies, which may cause digital discrimination. Although social media companies use AI to enforce their terms of service -- and in the best cases to ensure that their policies and practices comply with human rights -- artificial intelligence is still highly error-prone, and large swaths of Palestinian content have been taken down as these AI systems have been put to increasing use.
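To make the “remove, reduce and inform” approach more concrete, the following is a minimal, hypothetical sketch in Python of how classifier scores might be mapped to those three actions. It is not Facebook’s implementation; the categories, thresholds and score values are illustrative assumptions only.

```python
# A minimal, hypothetical sketch of a "remove, reduce, inform" decision
# policy. The categories, thresholds and classifier scores are illustrative
# assumptions; Facebook's actual models, rules and thresholds are not public.
from dataclasses import dataclass
from enum import Enum


class Action(Enum):
    REMOVE = "remove"   # taken down automatically
    REDUCE = "reduce"   # demoted in feeds and recommendations
    INFORM = "inform"   # left up behind a warning label
    ALLOW = "allow"     # left untouched


@dataclass
class ModerationScores:
    """Probabilities produced by hypothetical text/image classifiers."""
    terrorism: float
    nudity: float
    graphic_violence: float
    misinformation: float


def decide(scores: ModerationScores) -> Action:
    """Map classifier scores to one of the moderation actions."""
    # Categories the paper notes are removed automatically.
    if max(scores.terrorism, scores.nudity) > 0.9:
        return Action.REMOVE
    # Borderline "sensitive" content is labeled rather than removed.
    if scores.graphic_violence > 0.8:
        return Action.INFORM
    # Potential misinformation is demoted ("reduced") instead of deleted.
    if scores.misinformation > 0.7:
        return Action.REDUCE
    return Action.ALLOW


# Example: a post that a poorly trained classifier scores high on
# "terrorism" is removed with no human ever reviewing it.
post = ModerationScores(terrorism=0.93, nudity=0.0,
                        graphic_violence=0.1, misinformation=0.05)
print(decide(post))  # Action.REMOVE
```

The sketch makes visible where discriminatory outcomes can enter: the training data and thresholds behind each score are opaque, so a classifier that scores ordinary Arabic political vocabulary high on “terrorism” silently routes legitimate speech into automatic removal.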

For these reasons, among others, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression has called for greater transparency and accountability in content moderation decisions,24 as have a number of civil society organizations. Perhaps as a result, some companies, including dominant players such as Facebook, have begun to share more information about their internal procedures and have expanded their transparency reports, in which they report the amount of prohibited content that appears on the platform, including fake accounts, spam, terrorist propaganda, child sexual exploitation, hate speech, bullying, nudity, violence and the sale of regulated goods. For most categories, the company also reports how often such content was viewed, how many pieces of content it took action on, how much was found before being reported by users, how many decisions were appealed and how much content was restored after enforcement mistakes (either through appeals processes or other means). However, the report clarifies neither the policy under which a given piece of content was taken down, nor the source of the takedown -- whether it resulted from AI enforcing policies, from human content moderation by Facebook, or from external reports submitted to the company. This is crucial information for monitoring the influence of the Israeli government on the company’s content moderation policies and practices, and would enable digital rights defenders to advocate more effectively, both locally and internationally, for human rights and digital rights.
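To illustrate the kind of detail currently missing, the sketch below shows a hypothetical per-takedown record containing the fields this paper argues a transparency report should expose. The schema and field names are assumptions for illustration, not an existing Facebook reporting format.

```python
# A hypothetical per-takedown record illustrating the metadata a transparency
# report would need in order to answer the questions raised above. The schema
# is an illustrative assumption, not any published Facebook format.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class DetectionSource(Enum):
    AUTOMATED = "AI / automated enforcement"
    INTERNAL_REVIEW = "platform human moderator"
    USER_REPORT = "user report"
    GOVERNMENT_REFERRAL = "government or 'cyber unit' referral"


@dataclass
class TakedownRecord:
    content_id: str
    policy_invoked: str              # e.g. "hate speech", "dangerous organizations"
    source: DetectionSource          # who or what triggered the removal
    requesting_state: Optional[str]  # set only for government referrals
    appealed: bool
    restored: bool


# With records like this, researchers could aggregate takedowns by policy
# and by source, e.g. how many removals traced back to state referrals.
example = TakedownRecord(
    content_id="example-0001",
    policy_invoked="dangerous organizations",
    source=DetectionSource.GOVERNMENT_REFERRAL,
    requesting_state="IL",
    appealed=True,
    restored=False,
)
print(example)
```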

Content Governance

To respond to growing issues related to content moderation, in 2018 Facebook announced that it would develop its content governance structure to include the Oversight Board, or what has been dubbed the “Facebook Supreme Court”. When the Oversight Board was finally announced, Mark Zuckerberg wrote that Facebook “sought input from both critics and supporters of Facebook”,25 hosting a global consultation process of workshops and roundtables with more than 650 people in 88 different countries.

While there is a need for Facebook and other social media companies to develop mechanisms that ensure their policies and practices are in line with human rights, the establishment of such a board has the potential to help but may not provide sufficient clarity or other human rights safeguards for content moderation decisions. For example, the board on its own cannot satisfy requirements for transparency, proportionality, grievance or remedy. Complying with human rights principles related to freedom of expression and preventing avoidable harm will require evaluating the company’s business incentives, including its revenue models, recommendation algorithms, advertising transparency and other issues. As the UN Special Rapporteur on Freedom of Expression wrote in his article “The Republic of Facebook”, “companies are working for their own interests, not in the interest of the public”.26

Of particular concern for Palestinians has been the announcement that Emi Palmor, the former Director General of the Israeli Ministry of Justice, would be among the first 20 members of the Facebook Oversight Board, and one of two representatives from the Middle East and North Africa region.27 Under Emi Palmor’s direction, the Israeli Cyber Unit petitioned Facebook to censor legitimate speech of human rights defenders and journalists because it was deemed politically undesirable. This is contrary to international human rights law standards and to recommendations issued by the United Nations (UN) Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. Furthermore, digital rights experts and activists argue that censorship must be extremely rare and well justified to protect freedom of speech, and that companies should develop tools that “prevent or mitigate the human rights risks caused by national laws or demands inconsistent with international standards.”28 In addition, the appointment is contrary to Facebook’s own Oversight Board Charter, which states: “Members must not have actual or perceived conflicts of interest that could compromise their independent judgment and decision-making.” The selection of Emi Palmor raises concerns that the Oversight Board may, in particular, lead to further restrictions on freedom of expression online and undermine Palestinians’ ability to advocate for and exercise their human rights.

In a public statement29 after the announcement of Emi Palmor’s selection, Palestinian civil society urged Facebook and its Oversight Board to consider the grave consequences that her selection may have, particularly for Palestinian human rights defenders and for freedom of expression online in defense of Palestinian rights. In a meeting with Palestinian digital rights advocates, Facebook representatives clarified that one regional representative will be assigned to each case the members review, and that Emi Palmor would sit as the representative of the MENA region. As Emi Palmor has a track record of helping to develop the means by which the Israeli government systematically violates Palestinian digital rights, this creates a clear conflict of interest.

Particular Policies that Impact Palestinian Digital Rights

Extremist Content

While extremism is a global issue, there is currently no globally recognized and accepted definition of terrorism or of terrorist organizations. Attempts to develop definitions have often led to repressive practices, both by authoritarian regimes and by states recognized as democracies. Research has shown how the field of counter-terrorism has frequently failed to respect human rights.30 The United Nations Office of Counter-Terrorism notes that “the shrinking space for human rights defenders and civil society actors to exercise their freedoms is a consequence of counter-terrorism measures that are not human rights compliant.”31

Several social media companies have developed policies on extremist content. Facebook, under its “Dangerous Individuals and Organizations” policy, developed its own definition, which includes organizations or individuals involved in “terrorist activity, organized hate, mass murder (including attempts) or multiple murder, human trafficking and organized violence or criminal activity.”32 This content, as well as content that expresses support or praise for groups, leaders or individuals involved in these activities, is removed by the company. However, under its extremism-related policies Facebook is also removing the word martyr (used for people who have been murdered by Israel), some martyrs’ names, and key political speech including the Arabic word muqawama (resistance). As neither United States law nor Palestinian law criminalizes this kind of political speech, this shows how Facebook is going beyond its legal obligations and expanding its censorship of Palestinian content to incorporate Israeli definitions that exceed minimum legal standards, further violating Palestinians’ human rights and digital rights.

These policies and practices in relation to extremism are not limited to Facebook, but are shared with other social media companies working to take down so-called “extremist” content through the Global Internet Forum to Counter Terrorism (GIFCT). Initially founded by Facebook, Microsoft, YouTube and Twitter, the GIFCT has grown to include an increasing number of tech companies. Together they contribute to a shared database that includes over 200 million pieces of content. While the GIFCT is most often criticized for its lack of transparency, it has also supported a network of research institutions33 whose partners are known to support counter-terrorism narratives that equate criticism of Israel, or support for Palestinian rights, with incitement to violence. This includes the Brookings Institution (United States), whose Saban Center for Middle East Policy was established by Haim Saban, a known pro-Israel supporter34 who uses his vast media network to spread pro-Israel propaganda.35 The institute is currently directed by Martin Indyk, a well known pro-Israel lobbyist and American diplomat who founded the Washington Institute for Near East Policy, a pro-Israel GONGO started by the America Israel Public Affairs Committee (AIPAC) and known to be “part of the core” of the pro-Israel lobby in the United States.36 The network also includes the International Institute for Counter-Terrorism (Israel), a project of the IDC Herzliya, which most recently published a paper entitled “The Virus of Hate: Far-Right Terrorism in Cyberspace” that attempts to draw connections between neo-Nazis, white supremacists and pro-Palestinian Facebook groups. Other members of the research network supported by the GIFCT include the International Centre for Counter-Terrorism (Netherlands), Swansea University (UK), the Observer Research Foundation (India) and the Institute for Policy Analysis of Conflict (Indonesia).
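To illustrate how a shared industry database can propagate a single takedown decision across every member platform, the following is a minimal, hypothetical sketch of hash-based matching. Real systems such as the GIFCT’s use perceptual hashes that also match near-duplicate images and videos; the exact cryptographic hash and the function names here are simplifications for illustration.

```python
# A minimal, hypothetical sketch of cross-platform takedowns via a shared
# hash database. Real systems use perceptual hashes that match near-duplicates;
# SHA-256 here is a simplification for illustration only.
import hashlib

# Shared database of fingerprints of content that one member platform has
# labeled "terrorist and violent extremist content" (TVEC).
shared_tvec_hashes = set()


def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()


def flag_as_tvec(content: bytes) -> None:
    """A member platform adds a fingerprint after its own (possibly erroneous) decision."""
    shared_tvec_hashes.add(fingerprint(content))


def on_upload(content: bytes) -> str:
    """Every other member checks new uploads against the shared database."""
    if fingerprint(content) in shared_tvec_hashes:
        return "blocked"   # removed everywhere, without independent review
    return "published"


# One platform's mistaken classification of, say, footage documenting a
# protest is enough for the same material to be blocked by every member.
video = b"footage documenting a demonstration"
flag_as_tvec(video)
print(on_upload(video))  # "blocked"
```

The design choice the sketch highlights is that a single misclassification, once hashed into the shared database, is reproduced across all member platforms without independent review.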

Hate Speech

While the policies that social media companies have designed to respond to hate speech vary, Facebook has defined hate speech as “a direct attack on people based on what we call protected characteristics – race, ethnicity, national origin, religious affiliation, sexual orientation, caste, sex, gender, gender identity and serious disease or disability.”37 While there have been attempts by Facebook to develop a lexicon of hate speech in Arabic, no comparable consultation on the definition of hate speech or terrorism has been held with sufficient Palestinian stakeholders. This is particularly important, as an expanded definition of hate speech can be used to silence legitimate efforts by Palestinians and their supporters to advocate for human rights. For example, in 2019 Facebook took down a post by a human rights advocate that stated that “Israeli settlers steal land” and attached a Haaretz article, on the basis that the post was deemed “hate speech”. This shows how hate speech policies are being used to shrink the space for freedom of expression on issues regarding Palestinian human rights, while protecting Israeli narratives that are contrary to international law and norms.

Facebook’s policies on hate speech must also be implemented in a non-discriminatory manner. Researchers have found that large swaths of hate speech directed towards Palestinians and Arabs remain online. This includes data from 7amleh’s 2019 Index of Racism and Incitement, which found that there is “One violent [public] post against Arabs and Palestinians in Hebrew every 64 seconds”38 -- roughly 1,350 posts per day, or close to half a million per year. This content was not removed by Facebook and was allowed to remain online, while large amounts of Palestinian content were removed under the pretext of hate speech and incitement. This kind of digital discrimination is a violation of the principles of equality and non-discrimination, which are part of the foundations of the rule of law.

Conclusion and Recommendations

Respecting and committing to protecting Palestinian digital rights must be an integral part of the policies and practices of states and social media companies. As this paper shows, Palestinian digital rights and human rights are increasingly being violated online and offline through a systematic effort by Israel to silence Palestinians calling for their human rights, and their supporters. Through Israeli ministries, special units, GONGOs and online trolls, Israel is extending its unlawful occupation of Palestinian space online and offline and seeking to achieve political aims that are contrary to international law. These efforts are, in particular, shaping the definitions of “extremism,” “terrorism” and “hate speech,” leading social networks to adopt discriminatory definitions that shrink the space for freedom of expression worldwide and violate Palestinians’ digital and human rights.

Social media companies, third party states and civil society must work to prevent harm and to create an environment in which Palestinians can safely express themselves online, assemble and associate freely and access the information necessary for exercising their human rights. The recommendations in this paper are largely consistent with those of the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, those contained in the Santa Clara Principles On Transparency and Accountability in Content Moderation39 and Access Now’s Recommendations On Content Moderation.40 They serve as an outline of the work that must be done by multiple stakeholders in order to ensure that Palestinian human rights are upheld.

Social Media Companies

  1. Prevention of Harm: Companies should recognize that the authoritative global standard for ensuring freedom of expression is human rights law, not the varying laws of states or their own private interests.

    1. Facebook and other companies must develop policies and practices that protect Palestinian human rights and hold states and companies accountable for violations. Of particular concern are policies related to “extremism” and hate speech. To support this effort, it is essential that companies rely on international human rights law and international humanitarian law experts, as well as people with expertise in the case of Palestine and diverse representatives from the Palestinian community.

    2. Companies should work to ensure that they do not further the negative impacts of laws that violate human rights. In the case of Palestinian content in the occupied Palestinian territories, any regulation of Palestinian content should not be based on Israeli law.

    3. Companies must also fight advocacy efforts and legislation intended to censor freedom of expression and deny people their human rights and digital rights online. In particular, companies must fight Israeli legislation intended to silence the freedom of expression of Palestinians and those who support their rights.

  2. Impact Evaluation: Companies shall review their policies and standards to ensure compliance with international law, especially in areas under unlawful occupation and in cases where a state is attempting to carry out political aims that violate international law. They must also engage in rigorous evaluation of the impact of their existing products and policies on the human rights of users. In particular, Facebook must evaluate how Israel and its GONGOs and trolls are systematically using its platforms to violate Palestinian human rights and silence Palestinian human rights activists, defenders and organizations online.

  3. Community Consultation: Multiple stakeholders from the communities most impacted by these policies must be consulted in a serious, methodical and non-discriminatory manner, and have their perspectives proportionally integrated throughout the policy development and implementation process. While there have been attempts by Facebook to engage Palestinian stakeholders, a more serious approach should include ongoing engagement with the public, private and non-governmental sectors. Facebook must communicate directly with Palestinian authorities and Palestinian civil society, and ensure that their contribution is proportionally integrated into policy development and implementation processes, especially in regard to policies that are known to be instrumentalized by Israel.

  4. Non-Discrimination: Companies must develop ways to ensure that their content moderation processes do not discriminate against or contribute to further violation of the rights of people living under occupation.

    1. Palestinian content that includes words that are essential for political speech should not be removed or reduced from social media platforms as this is a violation of the right to freedom of expression and opinion, as well as the right to assembly and association.

    2. Policies must be implemented in a non-discriminatory manner and hate speech directed towards Palestinians, particularly the hate speech being spread by Israel, GONGOs and trolls, needs to be removed.

  5. Transparency: Companies must make transparent how policies are developed and implemented.

    1. Companies should not be consulting or cooperating with states in secret -- in particular with states who are known to systematically violate human rights.

    2. Information about content takedowns related to governmental requests must also reflect the policy under which this content was taken down so that communities can monitor how different policies impact content takedowns and ultimately digital rights.

    3. Companies need to make transparent the usage of AI to remove and reduce content, how these systems are used and the procedures behind their application. Systems should also be made available for independent auditing with a focus on human rights.

    4. Companies must give users the right to request a human review of their cases, with special attention paid to Palestinian content and content in Arabic, where AI has been known to erroneously take down large swaths of Palestinian content that is essential for protecting Palestinians’ human rights and digital rights.

  6. Accountability for Policies and Practices: Companies must hold themselves accountable for the impact of their policies and practices on human rights and should develop industry-wide accountability mechanisms and internal accountability mechanisms that ensure individuals have access to meaningful remedy, redress and human review upon request.

    1. Companies must be held accountable for nearly a decade of human rights violations that have resulted in well-documented consequences for the human rights of Palestinians and others worldwide.

Third Party States

  1. Laws should Respect Digital Rights: States should repeal any law that criminalizes or unduly restricts Palestinian digital rights and human rights online and offline.

    1. States should only seek to restrict content pursuant to an order by an independent and impartial judicial authority and in accordance with due process and standards of legality, necessity and legitimacy.

    2. States should refrain from imposing disproportionate sanctions, whether heavy fines or imprisonment, on social media companies, given their significant chilling effect on freedom of expression.

    3. States should refrain from establishing laws or arrangements that would require the “proactive” monitoring or filtering of content, which is both inconsistent with the right to privacy and likely to amount to pre-publication censorship.

  2. Independent Body Oversight: States should refrain from adopting models of regulation where government agencies, rather than judicial authorities, become the arbiters of lawful expression. They should avoid delegating responsibility to companies as adjudicators of content, which empowers corporate judgment over human rights values to the detriment of users.

  3. Requirement of Detailed Transparency Reports: States should publish detailed transparency reports on all content-related requests issued to intermediaries and involve genuine public input in all regulatory considerations.

    1. This includes information about requests to take down content outside of that state’s legal jurisdiction. This is particularly important as it would help reveal the efforts of states to shrink the space for freedom of expression globally. In the case of Palestine, it would also help to reveal Israel’s efforts to silence freedom of expression on Palestine globally, and enable states to respond with deeper insight into how Israel, and other states, are working to achieve unlawful political aims online.

  4. Pressure on Violating Governments: Countries that respect international law and human rights law must put pressure on the Israeli government, and on other governments that violate these laws and practice digital discrimination against Palestinians, to end this discrimination.

International and Local Civil Society

  1. Awareness-Raising: Inform the public of their rights and empower them with tools to protect themselves. Call for collective action to reject unlawful violations of digital rights and human rights, in particular efforts to censor Palestinian content online.

  2. Engagement: Palestinian civil society must actively work to engage with international organizations (UN bodies) and regional organizations (EU, NATO, African League, etc.) in order to bring about consensus around a legal framework to protect human rights threatened by systematic efforts to violate rights on social media platforms. In addition, civil society should seek to engage global bodies like the Global Internet Forum to Counter Terrorism (GIFCT), which seek to set standards for the policies and practices of companies.

  3. Ensure Accountability: Support legal action and apply pressure on companies and governments in order to ensure the protection of human rights. In particular, work with legal institutions that are working to protect Palestinian rights among others.

  4. Monitoring of Rights Violations: Palestinian civil society institutions and activists working in the field of digital and media rights shall intensify their efforts to monitor the violations of Palestinian digital rights on the Internet and report the violations by submitting reports to independent monitoring mechanisms and social media companies.

1 White, B. (2020). Delegitimizing Solidarity: Israel Smears Palestine. Retrieved from: https://online.ucpress.edu/jps/article/49/2/65/107373/Delegitimizing-Solidarity-Israel-Smears-Palestine

2 Syrian Archive. (n.y.). Tech Advocacy. Retrieved from: https://syrianarchive.org/en/tech-advocacy

3 Kayali, D. (2020, January). Human Rights Defenders are Not Terrorists, and Their Content is Not Propaganda. Retrieved from: https://blog.witness.org/2020/01/human-rights-defenders-not-terrorists-content-not-propaganda/

4 Ibid

5 Adalah. (2019, December). Social Media Companies Continue to Collaborate with Israel’s Illegal Cyber Unit. Retrieved from: https://www.adalah.org/en/content/view/9652 https://www.adalah.org/en/content/view/9859

6 Reuters. (2016, September). Why Facebook and Google Are Complying with Israel To Delete Certain Content. Retrieved from: https://fortune.com/2016/09/12/facebook-google-israel-social-media/

7 Office of the High Commissioner on Human Rights (2011). Guiding Principles on Business and Human Rights. Retrieved from https://www.ohchr.org/Documents/Publications/GuidingPrinciplesBusinessHR_EN.pdf

8 The Office of the State Attorney. (n.y.): About the Cyber Unit. Retrieved from: https://www.gov.il/en/Departments/General/cyber-about

9 Adalah. (2019, November). Israel State Attorney claims censorship of social media content, following Cyber Unit requests, isn't an 'exercise of gov’t authority. Retrieved from: https://www.adalah.org/en/content/view/9859

10 Gurvitz, Y. (2014, April). What Is NGO Monitor's Connection to the Israeli Government? Retrieved from: https://972mag.com/what-is-ngo-monitors-connection-to-the-israeli-government/90239/

11 Ullah, A. (2018, November). Pro-Israel Activists Seek to Manipulate Online Response to Gaza Violence. Retrieved from: https://www.middleeasteye.net/news/pro-israel-activists-seek-manipulate-online-response-gaza-violence

12 Winstanley, A. (2019, June 12). Inside Israel's million-dollar troll army. Retrieved from: https://electronicintifada.net/content/inside-israels-million-dollar-troll-army/27566

13 Israel labels BDS activists ‘Terrorists in Suits’. (2019, February). Retrieved from https://www.middleeastmonitor.com/20190204-israel-labels-bds-activists-terrorists-in-suits-in-new-smear-campaign/

14 Report of the Special Rapporteur on the situation of human rights in the Palestinian territories occupied since 1967: Situation of human rights in the Palestinian territories occupied since 1967. (2019, October 21). Retrieved from https://undocs.org/A/HRC/40/73

15 Guichman, R. (2018, July). The world is fighting terror and porn - Who is the Israeli Facebook law fighting? (in Hebrew). Retrieved from: https://www.themarker.com/technation/.premium-1.6270707

16 Zuckerberg, M. (2018, November, 18). A Blueprint for Content Governance and Enforcement. Retrieved from: https://www.facebook.com/notes/mark-zuckerberg/a-blueprint-for-content-governance-and-enforcement/10156443129621634/?hc_location=ufi

17 Facebook. (2018, April). Publishing Our Internal Enforcement Guidelines on Expanding Our Appeals Process. Retrieved from: https://newsroom.fb.com/news/2018/04/comprehensive-community-standards/

18 Ibid

19 Facebook. (2018, November) Product Policy Forum Minutes. Retrieved from: https://about.fb.com/news/2018/11/content-standards-forum-minutes/

20 Rosen, G. (2019, April). Remove, Reduce, Inform. Retrieved from: https://about.fb.com/news/2019/04/remove-reduce-inform-new-steps/#reduce

21 Facebook. (2019, September) Combating Hate and Extremism. Retrieved from: https://about.fb.com/news/2019/09/combating-hate-and-extremism/

22 Maheshwari, S., and Frenkel S. (2018, March). Facebook Lets Ads Bare Man’s Chest. A Woman’s Back Is Another. Retrieved from: https://www.nytimes.com/2018/03/01/business/media/facebook-ads-gender.html

23 Lapowsky, L. (2018, April). Wired. Here’s What Facebook Won’t Let You Post. Retrieved from: https://www.wired.com/story/heres-what-facebook-wont-let-you-post/

24 Human Rights Council (2018). Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. Op. cit.

25 Harris, B. (2019, April). Getting Input on an Oversight Board. Retrieved from: https://about.fb.com/news/2019/04/input-on-an-oversight-board/

26 Kaye, D. (2020, May). The Republic of Facebook. Retrieved from: https://www.justsecurity.org/70035/the-republic-of-facebook/

27 Facebook. (2020, May). Welcoming the Oversight Board. Retrieved from: https://about.fb.com/news/2020/05/welcoming-the-oversight-board/

28 Freedex. (2018). A Human Rights Approach To Content Moderation. Retrieved from: https://freedex.org/a-human-rights-approach-to-platform-content-regulation/

30 Human Rights Watch. (n.y.). Terrorism / Counterterrorism. Retrieved from: https://www.hrw.org/topic/terrorism-counterterrorism

31 United Nations Office of Counter Terrorism. (n.y.). Human Rights. Retrieved from: https://www.un.org/counterterrorism/human-rights

32 Facebook. Community Standards: Dangerous Individuals and Organizations. Retrieved from: https://www.facebook.com/communitystandards/dangerous_individuals_organizations

33 Global Internet Forum to Counter Terrorism. Retrieved from: https://gifct.org/partners/

35 Davis, C. (2012, Feb). Univision goes neoconservative. Retrieved from: https://www.aljazeera.com/indepth/opinion/2012/02/201221584750141923.html

36 Mearsheimer, J.; Walt, S. (2007). The Israel Lobby and US Foreign Policy. Macmillan. pp. 175–6

37 Facebook. (2019, September). Combating Hate and Extremism. Retrieved from: https://about.fb.com/news/2019/09/combating-hate-and-extremism/

 
