New Zealand Attack Underscores Social Media Sites’ Tolerance of Anti-Muslim Content

Tech giants like Facebook don’t always crack down on Islamophobia as much as other forms of online hate.

A police officer directs pedestrians near the site of one of the mass shootings at two mosques in Christchurch, New Zealand on Saturday. Mark Baker/AP


Nearly a year ago, Mark Zuckerberg testified before Congress that Facebook does not allow hate groups on its platform. “If there’s a group that—their primary purpose or—or a large part of what they do is spreading hate, we will ban them from the platform,” he told the House Energy and Commerce Committee on April 11, 2018.

Across the country in San Francisco, Madihha Ahussain was watching from her office at Muslim Advocates, a civil rights group that seeks to protect Muslim Americans from discrimination, bigotry, and violence. She found the assertion shocking. Since 2013, Muslim Advocates had been urging Facebook to remove Islamophobic hate groups and material. Ahussain tracks anti-Muslim bigotry, and Muslim Advocates had recently sent Facebook a list of 26 anti-Muslim groups on the platform. As Zuckerberg spoke, 23 remained active.

“We never received a response about the list that we sent,” Ahussain told Mother Jones in January. “We never received any indication about what they were doing to address those hate groups on the platform. So the reality is, they never really took any steps to address the presence of those hate groups on the platform.”

The massacre of 49 Muslims at two mosques in New Zealand on Friday was a devastating reminder to the entire world of the presence of anti-Muslim hate online and its effects. While the path to radicalization of the perpetrator and potential co-conspirators remains unclear, social media was weaponized to spread images of the attack, as well as anti-Muslim propaganda. The attacker linked to his Facebook livestream on 8chan, a platform where extremist content percolates, exploiting a connection between the world’s largest social network and the darkest corners of the internet that experts on extremism have warned about for years. The footage quickly spread across social media platforms. Facebook removed the original livestream after 20 minutes, but versions of the video continued to circulate on major tech platforms for hours. The attacks quickly became an extreme example of a problem that has been ongoing for years: the flourishing of anti-Muslim hate online. And it’s made possible, civil rights advocates and extremism watchdogs say, because large tech companies have largely ignored the problem.

There’s a reason that nudity and ISIS propaganda rarely show up on YouTube or Facebook: These are types of content that Silicon Valley has cracked down on. “When was the last time you were recommended an ISIS video on YouTube or Facebook?” tweeted NBC tech reporter Ben Collins after the New Zealand shooting. “The answer is probably never. That’s because law enforcement and tech companies made it a top priority.”

But when it comes to content that vilifies Muslims, Facebook and other companies haven't always been quick to remove it. In fact, they have repeatedly chosen to leave such content up, treating it as a valid political viewpoint rather than dangerous hate speech.

Perhaps the best example is the Facebook account of Tommy Robinson, a British anti-Muslim activist who developed a large following on social media by highlighting crimes committed by Muslims in the UK. By late 2018, Robinson had more Facebook followers than British Prime Minister Theresa May. His following catapulted him to a position in Britain’s far-right UK Independence Party, and Politico described him as one of Britain’s leading political voices. In July 2018, just a few months after Zuckerberg’s testimony, an undercover investigation by the UK-based Channel 4 Dispatches, an investigative documentary series, revealed that Facebook protected far-right content if it came from activists with a large number of followers. Facebook had granted Robinson’s page the same special status given to government accounts, which meant that hateful content on his page could not be deleted by a regular moderator and instead had to be flagged to someone higher up at Facebook.

Facebook had given the same protection to Britain First, a defunct anti-Muslim political party. In November 2017, Trump retweeted three Britain First videos purporting to show heinous crimes committed by Muslim immigrants. (One of the videos was a fake, another was footage from the Arab Spring uprising in Egypt years earlier, and the origin of the third was unverified.) Twitter banned Britain First the following month, in December 2017, and then banned Robinson in March 2018, both for hateful conduct. Facebook was slower to react. It took down Britain First’s page in May 2018, after the group’s leaders were jailed for committing hate crimes against Muslims. By that point, the page had more than 2 million likes. Robinson’s page was banned in February 2019 for violating Facebook’s hate-speech policies. In a blog post, Facebook said it banned the page because Robinson had promoted violence against Muslims and repeatedly violated the platform’s “policies around organized hate.”

Anti-Muslim bigotry in the United States has also found a home on sites like Facebook, YouTube, and Twitter. In June 2017, the anti-Muslim hate group ACT for America organized anti-Shariah protests in cities around the country, raising fears of violence at local mosques. The group welcomed security from right-wing vigilante militia groups, some of whose members showed up armed at the rallies.

ACT for America was one of the many hate groups that Muslim Advocates flagged to Facebook over the past five years. Recently, ACT for America’s Facebook page, which has more than 180,000 followers, has focused largely on anti-immigrant rhetoric. But it’s long been a place of violent threats against Muslims. Leading up to the 2017 protests, held during the holy month of Ramadan, comments on the group’s events included calls for violence against Muslims, according to the Southern Poverty Law Center, which classifies it as a hate group. In one image, former President Barack Obama has a noose around his neck alongside the caption “I look forward to this every day!!!” Another post stated, “Muslim men are brought up with cravings of rape, child sex, and sex slaves, etc.”  

The use of anti-Muslim rhetoric on platforms like Facebook isn’t limited to members of extremist groups. Last year, BuzzFeed found that since 2015, politicians in 49 states have openly attacked Muslims—often on social media. It is likely not a coincidence that Facebook has tended to treat anti-Muslim rhetoric as a valid political view. In 2017, Rep. Clay Higgins (R-La.) posted about Muslims on Facebook after a London terrorist attack: “Hunt them, identify them, and kill them. Kill them all.” Higgins’ spokesman told Mother Jones at the time that he was referring exclusively to terrorists.

Higgins’ comment remained on Facebook, a decision the company’s head of global policy, Monika Bickert, defended last month to Vanity Fair. “We really do want to give people room to share their political views, even when they are distasteful,” she said. Facebook has struggled to define the groups it protects against hateful language, and because Higgins was talking about radicalized Muslims rather than all Muslims, Facebook did not consider his target a protected category.

It remains to be seen whether the attacks in New Zealand will prompt Facebook and other platforms to take anti-Muslim content more seriously. After the deadly white supremacist rally in Charlottesville in August 2017, multiple tech companies began to remove the most extreme white supremacist content from their platforms. But anti-Muslim content largely stayed online. At Facebook, “they were particularly horrible about anti-Muslim material on the site,” says Heidi Beirich, who tracks extremist groups at the Southern Poverty Law Center and has worked for years to get Facebook and other tech companies to remove hate groups from their platforms. “In other words, [Facebook was] very quick to rip down maybe a neo-Nazi account after Charlottesville. But if someone was bashing Muslims, that didn’t seem to rise to the level of hate.”

By the time tech platforms moved to denounce the New Zealand attack and take down the video, the footage and related content had already gone viral. YouTube took down more than 1,000 videos related to the shooting on Friday. “Police alerted us to a video on Facebook shortly after the livestream commenced and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” a Facebook representative said on Twitter. “We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”

YouTube tweeted out a statement, saying, “Our hearts are broken over today’s terrible tragedy in New Zealand. Please know we are working vigilantly to remove any violent footage.”

Muslim Advocates also released a statement, expressing its devastation over the attack and urging Facebook and Google to take down the footage. But the group wasn’t surprised by the violence directed toward Muslims. “This heinous attack is not an anomaly or a surprise,” it said. “Over the past few years, there has been an epidemic of attacks and planned attacks on Muslim communities and mosques across the United States: mosques were bombed in Bloomington, Minnesota, and burned in Austin and Victoria, Texas, Bellevue, Washington, and Thonotosassa, Florida, and mass attacks were planned against Muslim communities in Islamberg, New York, Jacksonville, Florida, and Garden City, Kansas.”
