While the result is an improvement for now, Cloudflare should have terminated the site’s account back in March, after the shooting in Christchurch, New Zealand, despite feeling “incredibly uncomfortable about playing the role of content arbiter.” Internet companies have responsibilities beyond connectivity and must do better at assuming them. Hosting providers and social media platforms should refuse to be megaphones for speech that supports violent extremist ideologies. Platforms should remove all content that clearly incites violence, no matter who posts it, as well as white supremacist and nationalist content. When platforms refuse to remove this content, as 8chan did, hosting companies should refuse to provide them service.
And if rogue websites do find a home, the major social media platforms like Facebook, Google and Twitter should work together to block access to those sites from their platforms. This would deny white supremacists the opportunity to spread their violent extremist ideology to the broader user bases of the larger platforms. Google has already removed 8chan from its search results.
The prevailing ethos of internet hosting and connectivity companies has long been to leave content up and sites connected — preserving users’ rights to free expression. But with white supremacy becoming a growing terror threat around the world, it’s time to think through under what circumstances it is acceptable to allow such content to stay online.
Of course, internet companies don’t look the other way for all content. Since passage of the Protection of Children from Sexual Predators Act in 1998, they have been required to report child pornography to the National Center for Missing and Exploited Children. And for sex trafficking and other content that violates federal criminal law, or intellectual property violations, these companies do not have immunity under Section 230 of the Communications Decency Act, which famously protects them from liability for all other content they carry. As a result, internet companies do in fact regularly take down child pornography, sex trafficking and Islamic terrorist content when they become aware of it — and report it to the FBI.
But it wasn’t until after the shooting in Christchurch that Facebook banned white nationalist content. And though Twitter has said it has policies to fight hate and the incitement of violence, it provides verified accounts to 8chan and Gab, a site frequented by extremists, including the suspected gunman responsible for the Pittsburgh synagogue shooting.
For accountability, all these companies — hosting companies and social media platforms — will need to clarify their terms of service, delineate which types of content and accounts they take down, and provide a process for appealing their decisions.
The US government should also collaborate with other governments to identify threats posed by international networks of white supremacists, like the ones that inspired the Christchurch shooter, and maintain a spotlight on company action and inaction. Governments should provide internet companies with access to information on emerging threats, but in the end, it is the responsibility of the companies to act. New Zealand Prime Minister Jacinda Ardern and French President Emmanuel Macron convened the Christchurch Call to Action as an effort to press companies to state what they would do to combat online terrorist and violent extremist content. The United States — home to many of the internet giants — didn’t even join the countries that signed onto the Call. When Canadian Foreign Minister Chrystia Freeland urged the G7 and the UN to lead on these issues, the United States stayed silent.
The United States should be providing leadership on how to both combat white supremacy and protect democratic values. The governments and companies signing onto the Christchurch Call emphasized “principles of a free, open and secure internet, without compromising human rights and fundamental freedoms, including freedom of expression.” Those are exactly the right guardrails.
Following this month’s terrible events, it’s important to take the white supremacist terror threat as seriously as we take foreign terrorist threats — and we must do so before this content becomes a persistent feature of the internet that can no longer be taken down.