A look at Facebook’s content moderation appeals body.
Germany’s Network Enforcement Act (NetzDG) places strict requirements on “social network providers” to remove illegal content and respond to complaints.
By providing concrete recommendations and clear forewarning of upcoming pressure points, the report leaves no room for doubt that Facebook has a lot of work to do.
The demise of the far-right social media platform Gab in the wake of the Pittsburgh synagogue shooting reflects a shift toward greater involvement by technology companies in policing the content that appears on their services.
Recent reporting on how the Myanmar military harnessed Facebook to disseminate anti-Rohingya propaganda complicates the ongoing debate about Facebook’s role in facilitating atrocities.
In a new paper in the Hoover Aegis series, we take stock of the changing regulatory environment around large technology platforms and examine both the positive potential and the dangers of legislative and technological solutions to the problems of content moderation.
Assessing the costs and benefits of three approaches to fighting the dissemination of terrorist propaganda.
Opportunities to glimpse misinformation in action are fairly rare. But after the recent attack in Toronto, a journalist on Twitter unwittingly carried out a natural experiment that shows how quickly “fake news” can spread.
The Islamic State and its supporters can skirt content regulations with media that doesn't explicitly advocate terrorism. Is there more that can be done?
Despite what Ted Cruz suggested to Mark Zuckerberg during last week’s Facebook hearings, there is no requirement that a platform remain neutral in order to maintain Section 230 immunity.