David Kaye, the United Nations special rapporteur on the promotion and protection of the right to freedom of opinion and expression, recommended in June 2018 that social media companies adopt international human rights law as the authoritative standard for their content moderation. Before Kaye’s report, the idea sat well outside the mainstream. But the ground has shifted.
Evelyn Douek is an S.J.D. candidate at Harvard Law School, studying international and transnational regulation of online speech. Before coming to Harvard to complete a Master of Laws, Evelyn clerked for the Chief Justice of the High Court of Australia, the Hon. Susan Kiefel AC, and worked as a corporate litigator. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.
Last week, Facebook announced the final charter for its independent Oversight Board, which will have the power to hear cases and overrule Facebook’s decisions about what can and cannot remain on Facebook’s platforms. This is a potentially pivotal moment in the history of online speech and an unprecedented innovation in private platform governance.
Verified Accountability: Self-Regulation of Content Moderation as an Answer to the Special Problems of Speech Regulation
The “techlash” of the past few years represents a moment of quasi-constitutional upheaval for the internet. The way a few private companies have been “governing” large parts of the digital world has suffered a crisis of legitimacy. Calls to find mechanisms to limit the arbitrary exercise of power online have gained new urgency. This task of “digital constitutionalism” is one of the great projects of the coming decades. It is especially pressing in the context of content moderation – platforms’ practice of designing and enforcing rules for what they allow to be posted on their services.
Amid privacy scandals, sweeping disinformation operations and links to ethnic cleansing, a reasonable person could be forgiven for wondering lately: “What is the point of Facebook?” Now the world has Facebook’s answer to that question.
Facebook has released an update on its ongoing civil rights audit, illustrating the wide range of effects the company has on civil rights—from facilitating racially discriminatory ads for housing, employment and credit, to concerns about use of the platform to suppress participation in the 2020 U.S. election and census.
It’s been roughly six months since Facebook started collecting global feedback on its proposal to create an oversight board for content moderation decisions. This morning, the platform released the findings of that process in an epic report—almost 250 pages of summary, surveys, public comment, workshop feedback and expert consultations.
The techlash has well and truly arrived on YouTube’s doorstep. On June 3, the New York Times reported on research showing that YouTube’s recommendation algorithm serves up videos of young people to viewers who appear to show sexual interest in children.