Facebook Has Referred Trump’s Suspension to Its Oversight Board. Now What?
Don’t expect high drama or fireworks. But this could signal a substantial change in how the platform approaches content moderation.
Evelyn Douek is a Lecturer on Law and S.J.D. candidate at Harvard Law School, and an Affiliate at the Berkman Klein Center for Internet & Society. She studies online speech regulation and platform governance. Before coming to Harvard to complete a Master of Laws, Evelyn clerked for the Chief Justice of the High Court of Australia, the Hon. Justice Susan Kiefel, and worked as a corporate litigator. She received her LL.B. from UNSW Sydney, where she was Executive Editor of the UNSW Law Journal.