A slow-motion fiasco over whether a right-wing commentator violated YouTube’s harassment and hate speech policies illustrates how different platforms struggle to resolve disputes about what they allow on their services.
Online extremists are forced to balance public outreach against operational security when choosing which digital tools to use.
In the past two years, a number of companies have invoked international law justifications to decline to make their products available to states that, in their view, will use those products to violate international law.
The French report is a cautious survey of how to manage government regulation of speech in the new platform era, while the Christchurch Call is a high-level pledge to prevent the abuse of an open internet.
There are lots of ways to cover tech platforms, but the past few decades have shown that the techniques most effective at moving law and policy forward take time and scientific rigor.
The post-Christchurch law creates new offenses and liability, including imprisonment and huge fines for failing to take down violent content. But it is riddled with ambiguities.
A look at Facebook’s content moderation appeals body.
Germany’s Network Enforcement Act (NetzDG) places strict requirements on “social network providers” to remove illegal content and respond to complaints.
By providing concrete recommendations and clear forewarning of upcoming pressure points, the report leaves no room for doubt that Facebook has a lot of work to do.
The demise of the far-right social media platform in the wake of the Pittsburgh shooting reflects a shift toward greater involvement by technology companies in policing the content that appears on their services.