Shortcomings of current models of moderation have driven policy makers, scholars, and technologists to speculate about alternative models of content moderation. While alternative models provide hope for the future of online spaces, they can fail without proper scaffolding. Community moderators are routinely confronted with similar issues and have therefore found creative ways to navigate these challenges. Learning more about the decisions these moderators make, the challenges they face, and where they are successful can provide valuable insight into how to ensure alternative moderation models are successful. In this study, I perform a collaborative ethnography with moderators of r/AskHistorians, a community that uses an alternative moderation model, highlighting the importance of accounting for power in moderation. Drawing from Black feminist theory, I call this intersectional moderation. I focus on three controversies emblematic of r/AskHistorians' alternative model of moderation: a disagreement over a moderation decision; a collaboration to fight racism on Reddit; and a period of intense turmoil and its impact on policy. Through this evidence I show how volunteer moderators navigated multiple layers of power through care work. To ensure the successful implementation of intersectional moderation, I argue that designers should support decision-making processes and policy makers should account for the impact of the sociotechnical systems in which moderators work.
Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas
- Award ID(s): 1942125
- PAR ID: 10301295
- Date Published:
- Journal Name: Proceedings of the ACM on Human-Computer Interaction
- Volume: 5
- Issue: CSCW2
- ISSN: 2573-0142
- Page Range / eLocation ID: 1 to 35
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts suspended, the processes governing content moderation are largely invisible, making it difficult to assess content moderation bias. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. Yet on Twitch, a live chat and streaming platform, content moderation practices are visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming users who must see offensive content and at other times allowing for increased platform accountability.
- The social media industry has begun more prominently positioning itself as a vehicle for tapping into local community. Facebook offers hundreds of region-specific community groups, proudly touting these in nationwide commercials. Reddit has hundreds of subreddits focused on specific states, cities, and towns. And Nextdoor encourages users to sign up and “Get the most out of your neighborhood.” In these locally oriented digital spaces, users interact, discuss community issues, and share information about what is happening around them. Volunteer moderators with localized knowledge are important agents in the creation, maintenance, and upkeep of these digital spaces. And, as we show, Facebook, Reddit, and Nextdoor create strategic communication to guide this localized volunteer moderator labor toward specific goals within these spaces. In this work, we ask: “What promises does the social media industry make about local community groups, and how does it position volunteer moderators to help realize those promises?” Through a qualitative content analysis of 849 documents produced by Facebook, Reddit, and Nextdoor, we trace how the platforms position their versions of local community as slightly different utopian spaces and channel volunteer moderator labor both through direct instruction and through appeals to civic virtue.