Extremist Speech, Compelled Conformity, and Censorship Creep

Citation data:

93 Notre Dame Law Review 1035 (2018)

Author: Danielle Keats Citron

Keywords: counterterrorism; social media; EU; hate speech; radicalization; fake news; European Internet Forum; censorship creep; terms of service; transparency; hash; free expression
Abstract:
Silicon Valley has long been viewed as a full-throated champion of First Amendment values. The dominant online platforms, however, have recently adopted speech policies and processes that depart from the U.S. model. In an agreement with the European Commission, tech companies have pledged to respond to reports of hate speech within twenty-four hours, a hasty process that may trade valuable expression for speedy results. Plans have been announced for an industry database that will allow the same companies to share hashed images of banned extremist content for review and removal elsewhere. These changes are less the result of voluntary market choices than of bowing to governmental pressure. Private speech rules and policies about extremist content have been altered to stave off threatened European regulation. Far more than illegal hate speech or violent terrorist imagery is in EU lawmakers' sights; so too are online radicalization and "fake news." Newsworthy content may end up being removed along with terrorist beheading videos, "kill lists" of U.S. servicemen, and instructions on how to blow up houses of worship. The impact of extralegal coercion will be far reaching. Unlike national laws, which are limited by geographic borders, terms-of-service agreements apply to platforms' services on a global scale. Whereas local courts can only order platforms to block material viewed in their jurisdictions, a blacklist database raises the risk of total censorship. Companies should counter the serious potential for censorship creep with definitional clarity, robust accountability, detailed transparency, and ombudsman oversight.