Whether it’s shaming somebody on social media or posting a provocative article online, technology is commonly used to spread opinions for both virtuous and nefarious purposes. Following the protests in Charlottesville, Virginia on the 11th–12th of August, technology companies and members of the public have taken action to limit the spread of material posted by white nationalists on the internet, with potentially serious consequences for free speech.
Social media is the quickest way to share an opinion with a large audience, with an analysis from Smart Insights showing that over 2.75 billion people engage with a mix of social media platforms, such as Facebook and Twitter. While most posts on these platforms are harmless, some are intended to cause distress to others and come from users who support controversial ideologies, such as the alt-right. Moderators of social media platforms remove such posts to reduce their impact.
Following the protests in Charlottesville, Facebook and other sites have deleted a large number of accounts held by white nationalists, as well as groups that promote the ideology, with the intention of limiting the spread of these harmful beliefs by removing the platforms they post on. Users of social media have also posted comments intended to shame and discredit known white nationalists, in the hope that they will refrain from posting more harmful content.
Prior to the protests, websites were also used by the alt-right to organise gatherings and share opinions, the most infamous example being The Daily Stormer. However, after it published a controversial article mocking Heather Heyer, a woman who was fatally run over by an alt-right protestor, the domain registrar and web hosting company GoDaddy ended its services with The Daily Stormer, citing a violation of its terms of service. At the time of writing, the owner of The Daily Stormer has claimed to have been rejected by 7 domain registrars, including Google, RU-CENTER in Russia and Namecheap. The site now resides on the dark web, accessible only through Tor, where it is still running and publishing stories, albeit to a smaller audience.
This level of regulation from tech companies has never been seen before, and these companies are being warned to tread lightly. The Electronic Frontier Foundation, a digital rights group, has warned that similar tactics could be used to silence other groups of people with whom the government may disagree, writing in a blog post that “protecting free speech is not something we do because we agree with all of the speech that gets protected. We do it because we believe that no one … should decide who gets to speak and who doesn’t.” While removing hate websites and harmful content from social media is a positive step, it raises further questions over who should have the right to limit content on the internet and how far that power could extend, with critics accusing GoDaddy and Google of silencing free speech and submitting to pressure from the media.
In addition, users and moderators of social media have been accused of hypocrisy for shaming white nationalists and deleting their accounts and groups, because the response against the alt-right has been to attack their beliefs and ideology, the very thing the alt-right were accused of in the first place. Critics argue that these actions could eventually lead to situations where, rather than holding debates over clashing opinions, the minority is silenced by the majority and denied the chance to defend their views.
Overall, the decision by tech companies and members of the public to limit the spread of these toxic beliefs is beneficial in the short term, because it reduces the amount of harmful content on the internet and therefore decreases the chances of distressing other users. However, there is a very thin line between removing harmful content, as was the case with Charlottesville, and stifling free speech. We should be wary of crossing this line, as it may result in innocent users being removed from these platforms without the chance to defend themselves in debates less ‘clean-cut’ than the one concerning white supremacists. It could also harm our long-term ability to hold healthy debates, because rather than reasoning with someone holding alternative beliefs, we are learning to silence them. Often, it is only by allowing others to defend their beliefs that they learn to question them. Without that opportunity, intolerance could increase, which is precisely what tech companies were aiming to eradicate in the first place.