THE INFLUENCE AND proliferation of extremist content, hate speech, and state-sponsored propaganda on the internet have risen around the globe, as demonstrated by Russia’s interference in the US election and the rise of online ISIS recruitment. As a result, pressure on technology companies from governments, media, and civil society to take meaningful action to stem the flow of this content is at an all-time high.
A recently passed German law will require social media companies like Facebook and Twitter to remove illegal, racist, or slanderous content within 24 hours of its being flagged by a user or face fines as high as $57 million. Although this legislation was passed overseas, its effects will be felt stateside, since the sites that will bear the brunt of the law are American. And while similar legislation is unlikely in the US given the country’s strong First Amendment culture, a recent Canadian court ruling ordered that content violating Canadian law be deleted globally rather than only for Canadian users, opening the door to extraterritorial regulation that could affect American consumers.
Although governments have a legitimate interest in ensuring the safety of their citizens online, laws like this are not the answer. Government legislation is a blunt tool that is likely to compound problems, not solve them.
Legislation or regulations requiring companies to remove content pose a range of risks, including potentially legitimizing repressive measures by authoritarian regimes. Hate speech, political propaganda, and extremist content are subjective categories, and interpretations vary widely among governments. Relying on governments to create and enforce regulations online affords them the opportunity to define these terms as they see fit. Placing this power in the hands of governments also increases the likelihood that authoritarian regimes lacking Germany’s liberal democratic tradition will criminalize online content critical of them, creating yet another mechanism for oppressing their own citizens.
An individual’s right to freedom of expression is wholly dependent on geography, yet the internet has given users an unprecedented means to share ideas and connect across borders. That freedom is not unrestricted, and there are valid reasons why certain content, such as child pornography, should have no place on the web.
However, imposing hefty financial penalties on internet platforms, as the new German law does, all but ensures that companies will err on the side of excessive censorship, unfairly limiting the right to free speech. Government-prompted censorship of this type imposes barriers and erodes the freedoms the internet was designed to provide.
The new German law places the primary burden of determining and enforcing the legality of online content on the private companies that host internet platforms. Under this model, those companies are forced into a quasi-judicial role, policing content under rules that lack the clarity, protections, and appellate procedures the rule of law requires.
Instead of government intervention, civil society should recognize and build on the efforts platforms are already making to address these issues, while pressing companies to do even more. Recent examples of company-led initiatives include Facebook’s hiring of 3,000 additional content reviewers to address violent posts on its site and Google’s development of machine-learning systems to identify and remove hate speech and extremist content. YouTube has also implemented a policy under which violent content that falls short of its community guidelines’ threshold for removal is stripped of engagement features.
There is no doubt these companies can, and should, do more. But the future of an open internet and of free speech depends on governments exercising restraint and resisting the idea that they should dictate content. As an alternative, governments and companies should embrace the multi-stakeholder model that has helped the internet grow and prosper.
Online content from violent extremist groups and from foreign governments that use the internet to spread false information and propaganda causes real harm. In that context, companies such as Facebook, Google, Twitter, and Microsoft have an opportunity to work more closely together, as well as with civil society organizations, governments, and academics. Together, these stakeholders need to develop scalable, transparent internal governance structures that allow the companies to continue making healthy profits while mitigating the damage done by such content.
An earlier version of this essay appeared on the Stern Center’s blog.