Platforms’ ability to assess the context of content plays a major role in determining whether “new school regulation” sets proportionate limits on freedom of speech.
Verified Accountability: Self-Regulation of Content Moderation as an Answer to the Special Problems of Speech Regulation
The “techlash” of the past few years represents a moment of quasi-constitutional upheaval for the internet. The way a few private companies have been “governing” large parts of the digital world has suffered a crisis of legitimacy. Calls to find mechanisms to limit the arbitrary exercise of power online have gained new urgency. This task of “digital constitutionalism” is one of the great projects of the coming decades. It is especially pressing in the context of content moderation – platforms’ practice of designing and enforcing rules for what they allow to be posted on their services.
This Lawfare post summarizes a longer essay we are publishing today with the Hoover Working Group on National Security, Technology and Law. Our essay addresses whether governments ever have a justified basis for treating targets of surveillance differently, in any way, based on nationality. This issue is of general importance and has become particularly salient in the current legal debates about whether the U.S.
As I noted in my post yesterday, the Chinese government has declined to clarify whether and how it believes the international law governing the use of force applies to cyber warfare. Its refusal to do so has drawn sharp criticism from the U.S. and other cyber powers.
Here's the video of my interview last week with General Michael Hayden about his new book, Playing to the Edge: American Intelligence in the Age of Terror. If you missed the live event, we'll also be podcasting the audio this week.