Citation

Content Moderation as Systems Thinking

Author: Douek, Evelyn
Publication: Harvard Law Review
Year: 2022

The stylized picture of content moderation that forms the basis for most regulatory and academic discussion of online speech governance is misleading and incomplete. That picture depicts content moderation as a rough online analog of offline judicial adjudication of speech rights, with legislative-style substantive rules being applied over and over again to individual pieces of content by a hierarchical bureaucracy of moderators. This understanding leads regulators and scholars to assume that the best way to make platforms accountable for their decisions about online speech is to ensure platforms give users the kind of ex post individual review provided by courts in First Amendment cases and to guarantee users ever more due process rights. But because the scale and speed of online speech mean that content moderation cannot be understood as simply the aggregation of many (many!) individual adjudications, this approach produces accountability theater rather than actual accountability. This Article argues that content moderation should instead be understood as a project of mass speech administration and that looking past a post-by-post evaluation of platform decisionmaking reveals a complex and dynamic system that needs a more proactive and continuous form of governance than the vehicle of individual error correction allows. Lawmakers need to embrace a second wave of regulatory thinking about content moderation institutional design that eschews comforting but illusory First Amendment–style analogies and instead adopts a systems thinking approach. This approach looks to structural and procedural mechanisms that target the key ex ante and systemic decisionmaking that occurs upstream of any individual case.