

Title: Measuring the Monetary Value of Online Volunteer Work
Online volunteers are a crucial labor force that keeps many for-profit systems afloat (e.g., social media platforms and online review sites). Despite their substantial role in upholding highly valuable technological systems, online volunteers have no way of knowing the value of their work. This paper uses content moderation as a case study and measures its monetary value to make the value of volunteer labor apparent. Using a novel dataset of private logs generated by moderators, we apply a linear mixed-effects regression and estimate that Reddit moderators worked a minimum of 466 hours per day in 2020. These hours are worth 3.4 million USD based on the median hourly wage for comparable content moderation services in the U.S. We discuss how this information may inform pathways to alleviate the one-sided relationship between technology companies and online volunteers.
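The abstract names two quantitative steps: a linear mixed-effects regression over moderator log data to estimate hours worked, and a wage-based conversion of those hours into dollars. The sketch below is only an illustration of that general approach, not the authors' actual pipeline; the file name, column names (moderator_id, actions, active_minutes), and the ~20 USD/hour rate (implied by 3.4M USD / (466 hours × 365 days), not quoted in the paper) are assumptions.

```python
# Illustrative sketch: a mixed-effects regression relating daily moderation
# activity to active time, with a random intercept per moderator, followed by
# the valuation arithmetic reported in the abstract.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-moderator, per-day log table.
logs = pd.read_csv("moderator_daily_logs.csv")

# Fixed effect of daily action count on observed active minutes;
# random intercept grouped by moderator to absorb per-moderator baselines.
model = smf.mixedlm("active_minutes ~ actions", data=logs,
                    groups=logs["moderator_id"])
result = model.fit()
print(result.summary())

# Valuation arithmetic from the abstract: 466 hours/day across 2020,
# priced at an assumed hourly wage of ~20 USD for comparable services.
hours_per_day = 466
hourly_wage_usd = 20.0
annual_value = hours_per_day * 365 * hourly_wage_usd
print(f"Estimated annual value: ${annual_value:,.0f}")  # ~3.4 million USD
```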
Award ID(s):
1815507
NSF-PAR ID:
10404665
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
Proceedings of the International AAAI Conference on Web and Social Media
Volume:
16
ISSN:
2162-3449
Page Range / eLocation ID:
596 to 606
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Online volunteers are an uncompensated yet valuable labor force for many social platforms. For example, volunteer content moderators perform a vast amount of labor to maintain online communities. However, as social platforms like Reddit favor revenue generation and user engagement, moderators are under-supported to manage the expansion of online communities. To preserve these online communities, developers and researchers of social platforms must account for and support as much of this labor as possible. In this paper, we quantitatively characterize the publicly visible and invisible actions taken by moderators on Reddit, using a unique dataset of private moderator logs for 126 subreddits and over 900 moderators. Our analysis of this dataset reveals the heterogeneity of moderation work across both communities and moderators. Moreover, we find that analyzing only visible work – the dominant way that moderation work has been studied thus far – drastically underestimates the amount of human moderation labor on a subreddit. We discuss the implications of our results on content moderation research and social platforms. 
  2. Due to challenges around low-quality comments and misinformation, many news outlets have opted to turn off commenting features on their websites. The New York Times (NYT), on the other hand, has continued to scale up its online discussion resources to reach large audiences. Through interviews with the NYT moderation team, we present examples of how moderators manage the first ~24 hours of online discussion after a story breaks, while balancing concerns about journalistic credibility. We discuss how managing comments at the NYT is not merely a matter of content regulation, but can involve reporting from the "community beat" to recognize emerging topics and synthesize the multiple perspectives in a discussion to promote community. We discuss how other news organizations---including those lacking moderation resources---might appropriate the strategies and decisions offered by the NYT. Future research should investigate strategies to share and update the information generated about topics in the news through the course of content moderation. 
  3. Content moderation is a critical service performed by a variety of people on social media, protecting users from offensive or harmful content by reviewing and removing either the content or the perpetrator. These moderators fall into one of two categories: employees or volunteers. Prior research has suggested that there are differences in the effectiveness of these two types of moderators, with the more transparent user-based moderation being useful for educating users. However, direct comparisons between commercially-moderated and user-moderated platforms are rare, and apart from the difference in transparency, we still know little about what other disparities in user experience these two moderator types may create. To explore this, we conducted cross-platform surveys of over 900 users of commercially-moderated (Facebook, Instagram, Twitter, and YouTube) and user-moderated (Reddit and Twitch) social media platforms. Our results indicated that although user-moderated platforms did seem to be more transparent than commercially-moderated ones, this did not lead to user-moderated platforms being perceived as less toxic. In addition, commercially-moderated platform users want companies to take more responsibility for content moderation than they currently do, while user-moderated platform users want designated moderators and those who post on the site to take more responsibility. Across platforms, users seem to feel powerless and want to be taken care of when it comes to content moderation as opposed to engaging themselves.
  4. Most social media platforms implement content moderation to address interpersonal harms such as harassment. Content moderation relies on offender-centered, punitive approaches, e.g., bans and content removal. We consider an alternative justice framework, restorative justice, which aids victims in healing, supports offenders in repairing the harm, and engages community members in addressing the harm collectively. To assess the utility of restorative justice in addressing online harm, we interviewed 23 users from Overwatch gaming communities, including moderators, victims, and offenders; such communities are particularly susceptible to harm, with nearly three quarters of all online game players suffering from some form of online abuse. We study how the communities currently handle harm cases through the lens of restorative justice and examine their attitudes toward implementing restorative justice processes. Our analysis reveals that cultural, technical, and resource-related obstacles hinder implementation of restorative justice within the existing punitive framework despite online community needs and existing structures to support it. We discuss how current content moderation systems can embed restorative justice goals and practices and overcome these challenges. 
  5. Content moderation is a crucial aspect of online platforms, and it requires human moderators (mods) to repeatedly review and remove harmful content. However, this moderation process can lead to cognitive overload and emotional labor for the mods. As new platforms and designs emerge, such as live streaming spaces, new challenges arise due to the real-time nature of the interactions. In this study, we examined the use of ignoring as a moderation strategy by interviewing 19 Twitch mods. Our findings indicated that ignoring involves complex cognitive processes and significant invisible labor in the decision-making process. Additionally, we found that ignoring is an essential component of real-time moderation. These preliminary findings suggest that ignoring has the potential to be a valuable moderation strategy in future interactive systems, which highlights the need to design better support for ignoring in interactive live-streaming systems.