It’s double, double and even more toil and trouble for social media companies this week. Ever since the Wall Street Journal’s damning publication of the Facebook Files, which drew on thousands of documents leaked by the whistle-blower and former employee Frances Haugen, regulators and lawmakers have been investigating whether companies like Facebook – or “Meta”, as it’s to be known – are intentionally choosing corporate gain over the public good, or as Haugen put it in her bombshell address to the US Senate, “astronomical profits before people.”
Haugen added further fuel to the fire in her testimony to the British Parliament on Monday, where she told politicians that the platform’s use of “engagement-based rankings” is dangerous as it “prioritises and amplifies divisive, polarising, extreme content.” Depressingly, her testimony chimes with a damning new report by the antiracist advocacy group HOPE not Hate, which reveals that these platforms are also failing to curtail the dissemination of antisemitic content online.
HOPE not Hate’s report, produced in collaboration with Expo Foundation and the Amadeu Antonio Foundation, offers a wide-ranging survey of antisemitism across various online spaces. It explores two core questions: how antisemitism is being affected by the internet, and how different online spaces shape the nature of the antisemitism found within them.
“Antisemitism, which has never been a static phenomenon, has adapted in a number of ways in the digital age,” says David Lawrence, one of the researchers who compiled the scathing report. “For example, antisemites have become more adept at smuggling their messages into ostensibly non-antisemitic conspiracy theories, and in doing so are reaching large new audiences.”
As predicted, the report discovered that despite ten years of attempts by social media companies to regulate and moderate hate speech, antisemitism runs rife across every platform. The findings reveal how antisemitism is most commonly and widely spread through conspiracy theories, which grew exponentially during the Covid-19 pandemic.
“The pandemic has been enormously disruptive,” says Lawrence. “Conspiracy theories can provide frameworks for understanding chaotic events, as well as scapegoats. Antisemitism is a long-established conspiratorial tradition, and antisemitic tropes are rarely far removed from a diverse array of conspiratorial notions, including those revolving around Covid-19.
“The boom in such theories has therefore provided new potential slip roads toward antisemitism. Of course, increased time spent on social media during lockdown will also have increased the likelihood of encountering such notions.”
As Lawrence highlights, extended periods of economic uncertainty and social isolation fuel the rise of anti-elite and anti-establishment narratives in times of crisis. Just take the dawn of the pandemic as an example, when people fell prey to theories that Bill Gates was hiding trackable microchips in vaccines or that Covid-19 was spread through 5G technology. These “slip roads” thus provided yet another route towards vile antisemitism.
To explore this further, Lawrence and his fellow researchers delved into nine social media platforms and websites. These included mainstream platforms like Facebook, Twitter and TikTok, as well as lesser-known and alternative sites popular amongst conspiracists, such as Parler, 4chan’s /pol/ board and Telegram.
One of the worst offenders was the messaging app Telegram, which in July 2021 hit 550 million monthly active users, making it the fifth-most-used messaging app in the world. The encrypted app is primarily used for its group-based chat rooms, which are occasionally used by whistle-blowers and journalists owing to the app’s strong emphasis on privacy. In fact, the app has won praise for its resistance to censorship in helping protesters communicate from Belarus to Myanmar. But, like all these sites, it has a dark corner where a cesspool of antisemitism continues to fester.
Telegram has now become a “safe haven” for antisemites and extremists whom other social networks have blocked. This includes believers and peddlers of QAnon, the theory that the world is ruled by an elite cabal of paedophilic politicians, actors and financiers who abduct children to harvest their blood for eternal youth. This ludicrous belief is firmly rooted in the old antisemitic “blood libel”, which falsely accuses Jews of murdering Christian children to use their blood for religious rituals, such as baking it into matzos for Passover.
The researchers from HOPE not Hate claim that several other channels devoted to antisemitic conspiracies have grown dramatically since the pandemic. In addition to traditional far-right content, the majority of the antisemitism on Telegram is expressed through coded terms and interwoven with conspiratorial notions around the pandemic, vaccines, an alleged “Deep State” and a supposed cabal of elite paedophiles. A number of these groups have gained considerable followings, such as “Dismantling the Cabal”, which has grown to 90,000 subscribers since its founding in February 2021 and publishes clips from anti-lockdown demonstrations and allusions to a New World Order alongside Holocaust denial and white nationalist propaganda.
Telegram has also now become a central hub for QAnon internationally, especially following the movement’s exodus from more popular social media in the aftermath of the storming of the Capitol Building in January. Another channel, run by an antisemitic QAnon follower called GhostEzra, has gained a following of 333,000.
His forum has been dubbed the “largest antisemitic internet forum” globally, where he promotes the notion that “Zionists” are poisoning populations through vaccines and that Jewish people are “destroying the world” and working toward “total domination and control.” As is typical of many antisemites since the pandemic, GhostEzra blends Covid-19 conspiracy theories into his outbursts, claiming that “more human beings globally will die at the hands of the Zionist created vaccines than Jews died during WWII.”
In the report, Lawrence writes that the sheer size of the channel means that GhostEzra has undoubtedly introduced antisemitism to new audiences, and continues to do so without any intervention from Telegram.
Other companies have taken action but still fail to identify this harmful content. In August 2020, Facebook and Instagram announced that they had updated their Community Standards guidelines to include a Tier 1 prohibition (representing the most serious violation of guidelines) of allegations of “Jewish people running the world or controlling major institutions such as media networks, the economy, or the government.” And yet, it is still easy to find antisemitic conspiracy content on the platform, primarily through codewords designed to evade content moderation and go undetected by most users, like the deliberate misspelling of “Jews” as “Jooz/Joooz” or the use of the “nose” emoji.
The HOPE not Hate report arrived at the clear-cut conclusion that the amount of overt and extreme antisemitism on a platform was closely linked to how heavily that platform was moderated. The more lax the moderation, the more likely it was to host hateful content.
To combat this rise in conspiratorial antisemitism, the report suggests a two-pronged approach involving a more dedicated effort by both civil society and tech companies. It recommends that NGOs deepen their understanding of how antisemitism spreads and its impact, that more comparable data be gathered in greater quantities, and that there be long-term legislative activity alongside more scalable and accurate methods of measuring different kinds of antisemitism online.
“There are also many steps tech companies can take to better combat the spread of antisemitism,” says Lawrence. “Explicitly banning antisemitism in community guidelines is the first simple step, one which has still not been taken by all platforms. Moderators also need more training to keep up to date with the ever-changing expressions of antisemitism. A more proactive effort to deplatform antisemites would also reduce the spread of the prejudice, and better reporting systems are needed.
“The hope is to put increasing pressure on tech companies to treat online antisemitism as a serious issue. Much more can – and should – be done in order to limit the spread of this dangerous form of hatred.”
From the spread of hate speech to misinformation about Covid-19 to the dissemination of antisemitic theories, social media companies have prioritised cash over a duty of care. In a week that sees MPs and Peers scrutinise the draft Online Safety Bill, pressure is now mounting for these companies to become as transparent and accountable as possible when it comes to content moderation. They (and we) will be all the better for it.