The post-Christchurch law creates new offenses and liability, including imprisonment and huge fines for failing to take down violent content. But it is riddled with ambiguities.
A look at Facebook’s content moderation appeals body.
Germany’s Network Enforcement Act (NetzDG) places strict requirements on “social network providers” to remove illegal content and respond to complaints.
By providing concrete recommendations and clear forewarning of upcoming pressure points, the report leaves no room for doubt that Facebook has a lot of work to do.
The demise of the far-right social media platform in the wake of the Pittsburgh shooting reflects a shift toward greater involvement by technology companies in policing the content that appears on their services.
Recent reporting on how the Myanmar military harnessed Facebook to disseminate anti-Rohingya propaganda complicates the ongoing debate about Facebook’s role in facilitating atrocities.
In a new paper in the Hoover Aegis series, we take stock of the changing regulatory environment around large technology platforms and examine both the positive potential and the dangers of legislative and technological solutions to the problems of content moderation.
Assessing the costs and benefits of three approaches to fighting the dissemination of terrorist propaganda.
Opportunities to glimpse misinformation in action are fairly rare. But after the recent attack in Toronto, a journalist on Twitter unwittingly carried out a natural experiment that shows how quickly “fake news” can spread.