Title: ALLOCATING RESPONSIBILITY IN CONTENT MODERATION: A FUNCTIONAL FRAMEWORK
This Article develops a framework for both assessing and designing content moderation systems consistent with public values. It argues that moderation should be understood not as a single function but as a set of subfunctions common to all content governance regimes. By identifying the particular values implicated by each of these subfunctions, it explores how the constituent tasks might best be allocated: to which actors (public or private, human or technological) they might be assigned, and what constraints or processes might be required in their performance. This analysis can facilitate the evaluation and design of content moderation systems to ensure the capacity and competencies necessary for legitimate, distributed systems of content governance.

Through a combination of methods, legal schemes delegate at least a portion of the responsibility for governing online expression to private actors. Some statutory schemes assign regulatory tasks explicitly; in others, the delegation occurs implicitly, with little guidance as to how the treatment of content should be structured. In the law's shadow, online platforms are largely given free rein to configure the governance of expression. Legal scholarship has surfaced important concerns about the private sector's role in content governance. In response, private platforms engaged in content moderation have adopted structures that mimic public governance forms. Yet we largely lack the means to measure whether these forms are substantive, effectively infusing public values into the content moderation process, or merely symbolic artifice designed to deflect much-needed public scrutiny.

This Article's proposed framework addresses that gap in two ways. First, the framework considers together all manner of legal regimes that induce platforms to engage in the function of content moderation. Second, it focuses on the shared set of specific tasks, or subfunctions, involved in the content moderation function across these regimes. Examining a broad range of content moderation regimes together highlights the distinct common tasks and decision points that together constitute the content moderation function. Focusing on this shared set of subfunctions highlights the different values implicated by each and the ways they can be "handed off" to human and technical actors, with varying normative and political implications.

This Article identifies four key content moderation subfunctions: (1) definition of policies, (2) identification of potentially covered content, (3) application of policies to specific cases, and (4) resolution of those cases. Using these four subfunctions supports a rigorous analysis of how to leverage the capacities and competencies of government and private parties throughout the content moderation process. Such attention also highlights how the exercise of that power can be constrained, whether by requiring the use of particular decision-making processes or by limiting the use of automation, in ways that further address normative concerns.

Dissecting the allocation of subfunctions in various content moderation regimes reveals the distinct ethical and political questions that arise in alternate configurations. Specifically, it offers a way to think about four key questions: (1) what values are most at issue for each subfunction; (2) which activities might be more appropriate to delegate to particular public or private actors; (3) which constraints must attach to the delegation of each subfunction; and (4) where investments in shared content moderation infrastructures can support relevant values.

The functional framework thus provides a means both for evaluating the symbolic legal forms that firms have constructed in service of content moderation and for designing processes that better reflect public values.
Award ID(s):
1650589
PAR ID:
10393008
Author(s) / Creator(s):
;
Date Published:
Journal Name:
Berkeley Technology Law Journal
Volume:
36
Issue:
3
ISSN:
1086-3818
Page Range / eLocation ID:
1091-1172
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The speed and uncertainty of environmental change in the Anthropocene challenge the capacity of coevolving social–ecological–technological systems (SETs) to adapt or transform to these changes. Formal government and legal structures further constrain the adaptive capacity of our SETs. However, new, self-organized forms of adaptive governance are emerging at multiple scales in natural resource-based SETs. Adaptive governance involves the private and public sectors as well as formal and informal institutions, self-organized to fill governance gaps in the traditional roles of states. While new governance forms are emerging, they are not yet doing so rapidly enough to match the pace of environmental change. Furthermore, they do not yet possess the legitimacy or capacity needed to address disparities between the winners and losers from change. These emergent forms of adaptive governance appear to be particularly effective in managing complexity. We explore governance and SETs as coevolving complex systems, focusing on legal systems to understand the potential pathways and obstacles to equitable adaptation. We explore how governments may facilitate the emergence of adaptive governance and promote legitimacy both in the process of governance, despite the involvement of nonstate actors, and in its adherence to democratic values of equity and justice. To manage the contextual nature of the results of change in complex systems, we propose the establishment of long-term study initiatives for the coproduction of knowledge, to accelerate learning, synergize interactions between science and governance, and foster public science and epistemic communities dedicated to navigating transitions to more just, sustainable, and resilient futures.
  2. The increasing harms caused by hate, harassment, and other forms of abuse online have motivated major platforms to explore hierarchical governance. The idea is to allow communities to have designated members take on moderation and leadership duties; meanwhile, members can still escalate issues to the platform. But these promising approaches have only been explored in plaintext settings where community content is public to the platform. It is unclear how one can realize hierarchical governance in the huge and increasing number of online communities that utilize end-to-end encrypted (E2EE) messaging for privacy. We propose the design of private, hierarchical governance systems. These should enable similar levels of community governance as in plaintext settings, while maintaining cryptographic privacy of content and governance actions not reported to the platform. We design the first such system, taking a layered approach that adds governance logic on top of an encrypted messaging protocol; we show how an extension to the Messaging Layer Security (MLS) protocol suffices for achieving a rich set of governance policies. Our approach allows developers to rapidly prototype new governance features, taking inspiration from a plaintext system called PolicyKit. We report on an initial prototype encrypted messaging system called MlsGov that supports content-based community and platform moderation, elections of community moderators, votes to remove abusive users, and more.
  3. Modern social media platforms like Twitch, YouTube, etc., embody an open space for content creation and consumption. However, an unintended consequence of such content democratization is the proliferation of toxicity and abuse that content creators get subjected to. Commercial and volunteer content moderators play an indispensable role in identifying bad actors and minimizing the scale and degree of harmful content. Moderation tasks are often laborious, complex, and even if semi-automated, they involve high-consequence human decisions that affect the safety and popular perception of the platforms. In this paper, through an interdisciplinary collaboration among researchers from social science, human-computer interaction, and visualization, we present a systematic understanding of how visual analytics can help in human-in-the-loop content moderation. We contribute a characterization of the data-driven problems and needs for proactive moderation and present a mapping between the needs and visual analytic tasks through a task abstraction framework. We discuss how the task abstraction framework can be used for transparent moderation, design interventions for moderators’ well-being, and ultimately, for creating futuristic human-machine interfaces for data-driven content moderation. 
  4. Transgender and nonbinary social media users experience disproportionate content removals on social media platforms, even when content does not violate platforms’ guidelines. In 2022, the Oversight Board, which oversees Meta platforms’ content moderation decisions, invited public feedback on Instagram’s removal of two trans users’ posts featuring their bare chests, introducing a unique opportunity to hear trans users’ feedback on how nudity and sexual activity policies impacted them. We conducted a qualitative analysis of 83 comments made public during the Oversight Board’s public comment process. Commenters criticized Meta’s nudity policies as enforcing a cisnormative view of gender while making it unclear how images of trans users’ bodies are moderated, enabling the disproportionate removal of trans content and limiting trans users’ ability to use Meta’s platforms. Yet there was significant divergence among commenters about how to address cisnormative moderation. Some commenters suggested that Meta clarify nudity guidelines, while others suggested that Meta overhaul them entirely, removing gendered distinctions or fundamentally reconfiguring the platform’s relationship to sexual content. We then discuss how the Oversight Board’s public comment process demonstrates the value of incorporating trans people’s feedback while developing policies related to gender and nudity, while arguing that Meta must go beyond only revising policy language by reevaluating how cisnormative values are encoded in all aspects of its content moderation systems. 
  5. Abstract As municipalities across the global North highlight urban agriculture as a marker of their ‘greenness’, how can we best understand how the spaces and practices of urban food production are governed? This article develops an analysis of urban agriculture as a complex site of governance in which numerous interests engage. We underscore the politics of governance, through which some actors resist the imposition of a narrowly normative and exclusive notion of urban agriculture and against which they envision and enact alternatives. The article contributes to efforts to transcend the often dichotomous framing of urban agriculture as radical or neoliberal, formal or informal, political or post‐political by employing ‘everyday governance’ and ‘everyday resistance’ as lenses through which to focus on the prosaic practices of engaging with, pushing back against, and stepping beyond the imposition of hegemonic models of urban agriculture. We argue that the co‐constitutive, ‘braided’ nature of urban agricultural governance is revealed through attention to the manifold forms of negotiation and resistance to formal urban agricultural governance. Moreover, our perspective highlights the ways that some practitioners are excluded by, challenge, or re‐vision formal definitions of urban agriculture. We draw on the cases of Portland, OR and Vancouver, BC to illustrate our argument. 