Residents increasingly rely on geographically focused groups on social media platforms to access local information. These local groups have the potential to enhance the quality of life in communities by helping residents learn about their communities, connect with neighbors and local organizations, and identify important local issues. Moderators of online community groups, typically untrained volunteers, are key actors in these spaces. However, they are also put in a tenuous position, having to manage the groups while simultaneously navigating the desires of platforms, rapidly evolving user practices, and the increasing politicization of local issues. In this paper, we explicate the visions of local community groups put forward by Facebook, Reddit, and Nextdoor in their corporate discourse and ask: How do these platforms describe local community groups, particularly the ideal communication and community engagement that occur within them, and how do they position volunteer moderators to help realize these ideals? Through a qualitative thematic analysis of 849 company documents published between 2012 and 2023, we trace how each company rhetorically positions these spaces as what we refer to as “local platformized utopias.” We examine how this discourse positions local volunteer moderators, the volunteer labor force of civic actors that constructs, governs, and grows community groups. We discuss how these three social media companies motivate moderators to do this free, value-building labor through the promise of civic virtue, while simultaneously obscuring the unequal burdens of moderation labor and failing to address inequalities of access to voice and power in online life.
LOCALIZED VOLUNTEER MODERATION AND ITS DISCURSIVE CONSTRUCTION
The social media industry has begun more prominently positioning itself as a vehicle for tapping into local community. Facebook offers hundreds of region-specific community groups, proudly touting these in nationwide commercials. Reddit has hundreds of subreddits focused on specific states, cities, and towns. And Nextdoor encourages users to sign up and “Get the most out of your neighborhood.” In these locally oriented digital spaces, users interact, discuss community issues, and share information about what is happening around them. Volunteer moderators with localized knowledge are important agents in the creation and upkeep of these digital spaces. And, as we show, Facebook, Reddit, and Nextdoor produce strategic communication to guide this localized volunteer moderator labor toward specific goals within these spaces. In this work, we ask: “What promises does the social media industry make about local community groups, and how does it position volunteer moderators to help realize those promises?” Through a qualitative content analysis of 849 documents produced by Facebook, Reddit, and Nextdoor, we trace how these platforms position their versions of local community as subtly different utopian spaces, and channel volunteer moderator labor both through direct instruction and appeals to civic virtue.
- Award ID(s): 2207836
- PAR ID: 10652741
- Publisher / Repository: Association of Internet Researchers
- Date Published:
- Journal Name: AoIR Selected Papers of Internet Research
- ISSN: 2162-3317
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like This
All That’s Happening behind the Scenes: Putting the Spotlight on Volunteer Moderator Labor in Reddit
Online volunteers are an uncompensated yet valuable labor force for many social platforms. For example, volunteer content moderators perform a vast amount of labor to maintain online communities. However, as social platforms like Reddit favor revenue generation and user engagement, moderators are under-supported to manage the expansion of online communities. To preserve these online communities, developers and researchers of social platforms must account for and support as much of this labor as possible. In this paper, we quantitatively characterize the publicly visible and invisible actions taken by moderators on Reddit, using a unique dataset of private moderator logs for 126 subreddits and over 900 moderators. Our analysis of this dataset reveals the heterogeneity of moderation work across both communities and moderators. Moreover, we find that analyzing only visible work – the dominant way that moderation work has been studied thus far – drastically underestimates the amount of human moderation labor on a subreddit. We discuss the implications of our results for content moderation research and social platforms.
As content moderation becomes a central aspect of all social media platforms and online communities, interest has grown in how to make moderation decisions contestable. On social media platforms where individual communities moderate their own activities, the responsibility to address user appeals falls on volunteers from within the community. While there is a growing body of work devoted to understanding and supporting the volunteer moderators' workload, little is known about their practice of handling user appeals. Through a collaborative and iterative design process with Reddit moderators, we found that moderators spend considerable effort investigating user ban appeals and desire to directly engage with users and retain their agency over each decision. To fulfill their needs, we designed and built AppealMod, a system that induces friction in the appeals process by asking users to provide additional information before their appeals are reviewed by human moderators. In addition to giving moderators more information, we expected the friction in the appeal process would lead to a selection effect among users, with many insincere and toxic appeals being abandoned before getting any attention from human moderators. To evaluate our system, we conducted a randomized field experiment in a Reddit community of over 29 million users that lasted for four months. As a result of the selection effect, moderators viewed only 30% of initial appeals and less than 10% of the toxically worded appeals; yet they granted roughly the same number of appeals when compared with the control group. Overall, our system is effective at reducing moderator workload and minimizing their exposure to toxic content while honoring their preference for direct engagement and agency in appeals.
Content moderation is a critical service performed by a variety of people on social media, protecting users from offensive or harmful content by reviewing and removing either the content or the perpetrator. These moderators fall into one of two categories: employees or volunteers. Prior research has suggested that there are differences in the effectiveness of these two types of moderators, with the more transparent user-based moderation being useful for educating users. However, direct comparisons between commercially-moderated and user-moderated platforms are rare, and apart from the difference in transparency, we still know little about what other disparities in user experience these two moderator types may create. To explore this, we conducted cross-platform surveys of over 900 users of commercially-moderated (Facebook, Instagram, Twitter, and YouTube) and user-moderated (Reddit and Twitch) social media platforms. Our results indicated that although user-moderated platforms did seem to be more transparent than commercially-moderated ones, this did not lead to user-moderated platforms being perceived as less toxic. In addition, commercially-moderated platform users want companies to take more responsibility for content moderation than they currently do, while user-moderated platform users want designated moderators and those who post on the site to take more responsibility. Across platforms, users seem to feel powerless and want to be taken care of when it comes to content moderation as opposed to engaging themselves.
This design project arose with the purpose to intervene within the current landscape of content moderation. Our team’s primary focus is community moderators, specifically volunteer moderators for online community spaces. Community moderators play a key role in upholding the guidelines and culture of online community spaces, as well as managing and protecting community members against harmful content online. Yet, community moderators notably lack the official resources and training that their commercial moderator counterparts have. To address this, we present ModeratorHub, a knowledge sharing platform that focuses on community moderation. In our current design stage, we focused on two features: (1) moderation case documentation and (2) moderation case sharing. These are our team’s initial building blocks of a larger intervention aimed at supporting moderators and promoting social support and collaboration among end users of online community ecosystems.

