
Title: Volunteer Moderators in Twitch Micro Communities: How They Get Involved, the Roles They Play, and the Emotional Labor They Experience
The ability to engage in real-time text conversations is an important feature of live streaming platforms. The moderation of this text content relies heavily on the work of unpaid volunteers. This study reports on interviews with 20 people who moderate for Twitch micro communities, defined as channels built around a single streamer or group of streamers rather than the broadcast of an event. The study identifies how people become moderators, their different styles of moderating, and the difficulties that come with the job. In addition to the hardships of dealing with negative content, moderators also have complex interpersonal relationships with the streamers and viewers, where the boundaries between emotional labor, physical labor, and fun are intertwined.
Journal Name: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems
Page Range / eLocation ID: No. 160
Sponsoring Org: National Science Foundation
More Like this
  1. When people have the freedom to create and post content on the internet, particularly anonymously, they do not always respect the rules and regulations of the websites on which they post, leaving other unsuspecting users vulnerable to sexism, racism, threats, and other unacceptable content in their daily cyberspace diet. However, content moderators witness the worst of humanity on a daily basis in place of the average netizen. This takes its toll on moderators, causing stress, fatigue, and emotional distress akin to the symptomatology of post-traumatic stress disorder (PTSD). The goal of the present study was to explore whether adding positive stimuli to break times (images of baby animals or beautiful, awe-inspiring landscapes) could help reduce the negative side effects of being a content moderator. To test this, we had over 300 experienced content moderators read 200 fake text-based social media posts and decide whether each was acceptable for public consumption. Although we set out to test positive emotional stimulation, we found that the cumulative nature of the negative emotions likely negates most of the effects of the intervention: the longer the person had practiced content moderation, the stronger their negative experience. Connections to compassion fatigue and how best to spend work breaks as a content moderator are discussed.
  2. Online volunteers are an uncompensated yet valuable labor force for many social platforms. For example, volunteer content moderators perform a vast amount of labor to maintain online communities. However, as social platforms like Reddit favor revenue generation and user engagement, moderators are under-supported in managing the expansion of online communities. To preserve these online communities, developers and researchers of social platforms must account for and support as much of this labor as possible. In this paper, we quantitatively characterize the publicly visible and invisible actions taken by moderators on Reddit, using a unique dataset of private moderator logs for 126 subreddits and over 900 moderators. Our analysis of this dataset reveals the heterogeneity of moderation work across both communities and moderators. Moreover, we find that analyzing only visible work – the dominant way that moderation work has been studied thus far – drastically underestimates the amount of human moderation labor on a subreddit. We discuss the implications of our results for content moderation research and social platforms.
  3. Online volunteers are a crucial labor force that keeps many for-profit systems afloat (e.g., social media platforms and online review sites). Despite their substantial role in upholding highly valuable technological systems, online volunteers have no way of knowing the value of their work. This paper uses content moderation as a case study and measures its monetary value to make the value of volunteer labor apparent. Using a novel dataset of private logs generated by moderators, we use linear mixed-effects regression to estimate that Reddit moderators worked a minimum of 466 hours per day in 2020. These hours are worth 3.4 million USD based on the median hourly wage for comparable content moderation services in the U.S. We discuss how this information may inform pathways to alleviate the one-sided relationship between technology companies and online volunteers.
  4. Content moderation is a crucial aspect of online platforms, and it requires human moderators (mods) to repeatedly review and remove harmful content. However, this moderation process can lead to cognitive overload and emotional labor for the mods. As new platforms and designs emerge, such as live streaming spaces, new challenges arise due to the real-time nature of the interactions. In this study, we examined the use of ignoring as a moderation strategy by interviewing 19 Twitch mods. Our findings indicated that ignoring involves complex cognitive processes and significant invisible labor in the decision-making process. Additionally, we found that ignoring is an essential component of real-time moderation. These preliminary findings suggest that ignoring has the potential to be a valuable moderation strategy in future interactive systems, which highlights the need to design better support for ignoring in interactive live-streaming systems.
  5. Much of our modern digital infrastructure relies critically upon open source software. The communities responsible for building this cyberinfrastructure require maintenance and moderation, which is often supported by volunteer efforts. Moderation, as a non-technical form of labor, is a necessary but often overlooked task that maintainers undertake to sustain the community around an OSS project. This study examines the various structures and norms that support community moderation, describes the strategies moderators use to mitigate conflicts, and assesses how bots can play a role in assisting these processes. We interviewed 14 practitioners to uncover existing moderation practices and ways that automation can provide assistance. Our main contributions include a characterization of moderated content in OSS projects, moderation techniques, as well as perceptions of and recommendations for improving the automation of moderation tasks. We hope that these findings will inform the implementation of more effective moderation practices in open source communities.
