Shortcomings of current models of moderation have driven policy makers, scholars, and technologists to speculate about alternative models of content moderation. While alternative models provide hope for the future of online spaces, they can fail without proper scaffolding. Community moderators are routinely confronted with similar issues and have therefore found creative ways to navigate these challenges. Learning more about the decisions these moderators make, the challenges they face, and where they are successful can provide valuable insight into how to ensure alternative moderation models are successful. In this study, I perform a collaborative ethnography with moderators of r/AskHistorians, a community that uses an alternative moderation model, highlighting the importance of accounting for power in moderation. Drawing from Black feminist theory, I call this intersectional moderation. I focus on three controversies emblematic of r/AskHistorians' alternative model of moderation: a disagreement over a moderation decision; a collaboration to fight racism on Reddit; and a period of intense turmoil and its impact on policy. Through this evidence I show how volunteer moderators navigated multiple layers of power through care work. To ensure the successful implementation of intersectional moderation, I argue that designers should support decision-making processes and policy makers should account for the impact of the sociotechnical systems in which moderators work.
Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas
- Award ID(s): 1942125
- PAR ID: 10301295
- Date Published:
- Journal Name: Proceedings of the ACM on Human-Computer Interaction
- Volume: 5
- Issue: CSCW2
- ISSN: 2573-0142
- Page Range / eLocation ID: 1 to 35
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts are suspended, the processes governing content moderation are largely invisible, making content moderation bias difficult to assess. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. Yet on Twitch, a live chat and streaming platform, content moderation practices are visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming those who must see offensive content and at other times allowing for increased platform accountability.
- Transgender and nonbinary social media users experience disproportionate content removals on social media platforms, even when their content does not violate platforms’ guidelines. In 2022, the Oversight Board, which oversees Meta platforms’ content moderation decisions, invited public feedback on Instagram’s removal of two trans users’ posts featuring their bare chests, creating a unique opportunity to hear trans users’ feedback on how nudity and sexual activity policies affect them. We conducted a qualitative analysis of 83 comments made public during the Oversight Board’s public comment process. Commenters criticized Meta’s nudity policies as enforcing a cisnormative view of gender while making it unclear how images of trans users’ bodies are moderated, enabling the disproportionate removal of trans content and limiting trans users’ ability to use Meta’s platforms. Yet commenters diverged significantly on how to address cisnormative moderation: some suggested that Meta clarify its nudity guidelines, while others suggested that Meta overhaul them entirely, removing gendered distinctions or fundamentally reconfiguring the platform’s relationship to sexual content. We then discuss how the Oversight Board’s public comment process demonstrates the value of incorporating trans people’s feedback when developing policies related to gender and nudity, and argue that Meta must go beyond revising policy language to reevaluate how cisnormative values are encoded in all aspects of its content moderation systems.