Political speech and the “Great Replacement” conspiracy theory must be moderated by the same rules for everyone.
GPAHE is campaigning to persuade social media and tech companies that global democracies are at stake and that they must apply their community content standards to all political and public figures’ speech. With our own eyes, we’ve seen the dangers of incendiary rhetoric from influential social media users, on January 6 and in other violence. Our work exposing the dangers of harmful political speech and of the “Great Replacement” has informed the work of the Facebook Oversight Board, which is considering the case of French presidential candidate Eric Zemmour and his hateful anti-immigrant conspiracy speech. This case will have global implications. A PDF of the comments is available.
Zemmour PDF Here
To: Facebook Oversight Board
From: Wendy Via and Heidi Beirich, Cofounders
Re: Politician’s comments on demographic changes
Date: December 12, 2023
We are writing from the Global Project Against Hate and Extremism (GPAHE), a nonprofit civil society organization based in the US, to request that the Facebook Oversight Board recommend that Meta take additional and meaningful action to stop the dissemination of the dangerous, racist “Great Replacement” conspiracy theory on all Meta platforms. We are requesting a recommendation from the Board similar to that taken in the wake of the rise of the QAnon conspiracy theory, and that Facebook recognize references to replacement, even oblique ones, as the hate speech and disinformation they are. We are also requesting that immediate action be taken to stop the proliferation of this conspiracy theory by political and public figures, who, by Meta’s own admission, “often have broader influence” and “therefore, they may pose a greater risk of harm when they violate our policies.” GPAHE has documented many elements of the spread of the “Great Replacement” conspiracy theory and the political and public figures who spread it. Given the volatile and polarized global narrative around migration and its place in political discourse, it is especially imperative that this Board also recommend to Meta that it enforce its public figures and civil unrest policy.
The racist “Great Replacement” conspiracy theory inspires hate, violence, and mass murders. And nowhere does it thrive and spread more than in online spaces like Facebook and Instagram. As a technology company that impacts every corner of the globe, it is incumbent upon Meta to accept the responsibility that comes with great power and protect its users and our communities from blatantly false conspiracy theories, hate speech, and violence-inspiring disinformation before more damage is done.
The “Great Replacement” conspiracy theory is a global white supremacist concept that falsely claims white people are being replaced in their “home” countries by immigrants, Muslims, Black people, and other people of color. The conspiracy theory often blames the “elite,” “globalists,”
and Jews for orchestrating these changing demographics, which are perceived to be the cause of the disintegration of “traditional” and “national values.” The conspiracy theory is most often associated with the idea that immigration is meant to overwhelm the vote of those of European descent (white vote), thereby destroying their political power, erasing “traditional” and cultural “values,” which ultimately will “destroy” the country, in this case France. Those spreading the conspiracy theory often use demonizing language such as “ethnic substitution,” “invasion,” “overrun,” “colonize,” “remigration,” and “plague,” language that has been widely adopted by far-right media and political figures. Remigration refers to the voluntary or involuntary return of the majority of immigrants of non-European (white) descent, in effect an ethnic cleansing.
A May 2022 AP poll in the US found that one-third of those polled believed “an effort is afoot to replace native-born Americans with new immigrants for electoral purposes,” a 2023 UK poll found that one-third of Britons also believe in the conspiracy theory, and a 2021 poll found that 67 percent of the French believed the “Great Replacement” would happen. This is a dangerous global phenomenon. Dozens have been murdered by killers who believed the conspiracy theory, most of whom had a significant connection to online spheres, with at least one using Meta platforms to livestream the horrors.
Mass attacks related to the “Great Replacement” conspiracy theory
- 2023 Jacksonville, Florida – three Black people murdered.
- 2023 Allen, Texas – eight people murdered.
- 2022 Buffalo, New York – ten Black people murdered.
- 2021 Toronto, Canada – four members of a Muslim family were run down and killed.
- 2019 El Paso, Texas – gunman targeted Latinos, killing 23 people.
- 2019 Poway, California – a synagogue was targeted because Jews were “planning a genocide” of Europeans. One person murdered. Killer had earlier set a mosque on fire.
- 2019 Christchurch, New Zealand – 51 people were murdered at mosques. Livestreamed on Facebook.
- 2019 Halle, Germany – gunman targeted a synagogue but was unable to enter. One passerby was murdered.
- 2018 Pittsburgh, Pennsylvania – gunman targeted a synagogue, murdering 11 people.
- 2011 Oslo, Norway – 77 killed, many of them teenagers. Gunman targeted individuals he thought would bring mass Muslim immigration into the country.
“Great Replacement” clearly meets Facebook’s community standard of being “tied to different forms of real world harm” and should be incorporated into its Dangerous Individuals and Organizations policy against militarized social movements and violence-inducing conspiracy networks, as QAnon was, and its purveyors deplatformed, regardless of who they are. Facebook broadly banned QAnon in 2020, although it still has enforcement problems.
Public and political figures’ content moderation
There is no question that Facebook community standards should be applied to political speech and rigorously enforced. All Meta platforms state that politicians and public figures are subject to the policies, but they also have “public interest” or “newsworthiness” exemptions, effectively rendering the rules, including those on hate speech and misinformation, useless, even assuming political speech is adequately reviewed for violations in the first place.
Hate speech has a measurable impact on people’s willingness and ability to participate in the democratic process. It can exert psychological constraints on members of a targeted group, causing them to withdraw from public discourse, the so-called “silencing effect.” Moreover, hate speech causes desensitization, a loss of the ability to understand others’ pain, destroying a common basis for political communication. As history continues to show, hate speech coupled with disinformation can lead to stigmatization, discrimination, and large-scale violence. Hate speech has been identified as a precursor to atrocity crimes, including genocide, such as the Rohingya genocide of 2017. And in 2022, violence against LGBTQ+ people in Europe and Central Asia, as well as in the US, reached its highest point in the past decade, against the backdrop of “rising and widespread hate speech from politicians, religious leaders, right-wing organizations and media pundits.” Hate speech from politicians and state officials was reported in 23 countries across Europe, as well as Azerbaijan.
As UN Secretary-General António Guterres has said, “Addressing hate speech does not mean limiting or prohibiting freedom of speech. It means keeping hate speech from escalating into something more dangerous, particularly incitement to discrimination, hostility, and violence.”
More than 70 countries comprising more than one billion people are expected to hold elections in 2024, an unprecedented number that includes some of the world’s biggest democracies, more fragile democracies, and some nations where there is a continued weakening of civil and human rights. More countries have moved away from democracy than toward it, a trend that has developed over the last several years, including in countries where democracy was thought to be firmly established. And since 2017, the number of countries moving toward authoritarianism is more than double the number moving toward democracy.
Social media can be a positive and a negative for democracy. It can have a weakening effect on strong democracies and an intensifying effect on strong authoritarian regimes. Overall, though, there is no doubt that the abuse of social media has had a negative impact on democracies worldwide. It is especially important that politicians not use social media to spread hate speech. A body of research suggests the incendiary rhetoric of political leaders can make political violence more likely, gives violence direction, complicates the law enforcement response, and increases fear in vulnerable communities. Political leaders’ remarks do not disappear on social media, especially as the platforms’ algorithms tend to amplify more incendiary remarks, quickly magnifying rhetoric against their political opponents, minority groups, and other targets. Leaders with large social media followings will see their remarks shared with millions. This then drives coverage in more traditional news outlets and serves as a cue to local politicians, whose similar content is in turn amplified by their communities and the company algorithms. Politicians use social media in all the usual marketing ways for a campaign, but they are also able to bypass the rules and norms of traditional media. And for those who wish to engage in hateful and demonizing speech, the results can be damaging.
Given the equally unprecedented potential influence of social media platforms and those adept at manipulating them, and the introduction of AI with its unknown effects, it is vital that the platforms prepare now to do all they can to protect democracies and elections around the world, especially by moderating political figures’ speech by the same standards as any other user’s, and doing so in all languages.
Zemmour and demographic changes
The post in question for this appeal should have been taken down under the existing hate speech and misinformation policies. On its face, the post about the numbers of European and African people was clearly false and designed to instill fear about Africans in its readers. The post could have been removed under Facebook’s misinformation policy which states “misinformation will be removed where it is likely to directly contribute to the risk of imminent physical harm,” as has too many times been the case where the “Great Replacement” conspiracy theory is concerned.
The word “colonization” was the wrong element on which to decide this case. This post was not simply a “comment on or criticism of immigration policy.” The post clearly should have been reviewed based on Facebook’s protections for “race, ethnicity, national origin and religious affiliation” and the fact that “refugees, migrants, immigrants, and asylum seekers” are protected against “the most severe attacks.” Zemmour’s post was obviously directed at his disdain for immigrants from Africa, a non-European continent largely populated by Black and brown people, some of whom are Muslim. And if colonization was to be considered, it should have been understood to carry a negative connotation: one population taking over or dominating another.
Facebook policies state that online and offline behavior is considered when reviewing a user’s posts, and that consideration is most critical when reviewing political speech, which should be viewed with an even more discerning eye. Zemmour believes that non-white immigration is causing the native French population to be replaced by a Muslim majority. In other words, he explicitly endorses the “Great Replacement” conspiracy theory. He is openly anti-Muslim and is on record advocating that Ukrainian refugees should be allowed to obtain French visas, but that those fleeing wars in Muslim-majority countries (specifically Arab states) should not. Zemmour has long held anti-immigrant and anti-Muslim views and has twice been convicted of incitement to religious hatred for statements he made in broadcasts during his time as a TV commentator. Some members of Zemmour’s Reconquête! party were also members of Génération Identitaire (Generation Identity), a hugely influential white nationalist and anti-Muslim movement founded in 2012, which has been banned by the French government. Additionally, Facebook banned the entire global Generation Identity network in 2019, an Identitarian network that almost exclusively pushes the “Great Replacement” conspiracy theory and has connections to violence.
About the July 2023 riots in France after a young Arab boy was killed by police, Zemmour said that with the arrival of massive numbers of migrants from the global South who are “so far removed from our cultural and civilizing canons,” the level of violence seen during the rioting was inevitable, and he likened the riots to an “ethnic war.” Throughout his Facebook page, Zemmour references the “Great Replacement,” uses the term “invasion,” refers to Muslims and immigrants as violent and as criminals, and employs other dehumanizing language. The page also links to other social media sites and to his official website, which refers to French identity and the “evils of immigration and Islamization.”
Recommendations
- “Great Replacement” clearly meets Facebook’s community standard of being “tied to different forms of real world harm” and should be incorporated into its Dangerous Individuals and Organizations policy against militarized social movements and violence-inducing conspiracy networks, as QAnon was, and its purveyors deplatformed, regardless of who they are.
- Remove “Great Replacement” conspiracy theory posts and accounts. In addition to the specific words “great replacement,” machine learning, algorithms, natural language processing, and other AI tools must also incorporate variations on the term, key words, hashtags, key phrases, patterns, and connected accounts. For example, Italian Prime Minister Giorgia Meloni uses the phrase “ethnic substitution.”
- Invest in non-English language resources to fully address the conspiracy theory content regardless of language.
- Redirect users to authoritative information when searching for replacement content.
- Limit the reach of content under review for “Great Replacement” material by downgrading it in recommendation and search results, restricting views, and flagging the posts and accounts with disclaimers.
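As a rough illustration of the variant-aware matching the recommendations above describe, the sketch below shows how a first-pass filter might flag known phrases and simple variants for human review. Everything here is hypothetical: the phrase list, the function names, and the normalization rules are examples for discussion, not Meta’s actual systems, which would rely on trained classifiers and far larger, language-specific lexicons.

```python
import re

# Hypothetical, illustrative phrase list; a real system would cover many
# languages and be maintained by policy experts.
FLAGGED_PHRASES = [
    "great replacement",
    "grand remplacement",    # French variant
    "ethnic substitution",
    "remigration",
]

def normalize(text: str) -> str:
    """Lowercase and collapse punctuation/whitespace so simple variants
    (hyphens, extra spaces, mixed casing) still match."""
    text = text.lower()
    text = re.sub(r"[^a-zà-ÿ0-9]+", " ", text)
    return text.strip()

def flag_post(text: str) -> list[str]:
    """Return the flagged phrases found in a post, for routing to review."""
    norm = normalize(text)
    return [phrase for phrase in FLAGGED_PHRASES if phrase in norm]
```

For example, `flag_post("They warn of the Great-Replacement of voters")` matches despite the hyphen and capitalization, while an ordinary comment on immigration policy matches nothing. Keyword matching alone over-flags and under-flags, which is why the recommendation pairs it with machine learning, account-network signals, and human review rather than automatic removal.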