

Title: Coordination and Collaboration: How do Volunteer Moderators Work as a Team in Live Streaming Communities?
Volunteer moderators (mods) play significant roles in developing moderation standards and dealing with harmful content in their micro-communities. However, little work explores how volunteer mods work as a team. In line with prior work on volunteer moderation, we interview 40 volunteer mods on Twitch, a leading live streaming platform. We identify how mods collaborate on tasks (off-stream coordination and preparation, in-stream real-time collaboration, and relationship building both off-stream and in-stream to reinforce collaboration) and how mods contribute to moderation standards (collaboratively working on the community rulebook and individually shaping community norms). We uncover how volunteer mods work as an effective team. We also discuss how the affordances of multi-modal communication and the informality of volunteer moderation contribute to task collaboration, standards development, and mods' roles and responsibilities.
Award ID(s):
1928627
NSF-PAR ID:
10383981
Date Published:
Journal Name:
CHI '22: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems
Page Range / eLocation ID:
1 to 14
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Content moderation is an essential part of online community health and governance. While much extant research centers on what happens to the content, moderation also involves managing violators. This study focuses on how moderators (mods) decide on their actions after a violation takes place but before the sanction, by examining how they "profile" violators. Through observations and interviews with volunteer mods on Twitch, we found that mods engage in a complex process of collaborative evidence collection and profile violators into different categories to decide the type and extent of punishment. Mods consider violators' characteristics as well as behavioral history and violation context before taking moderation action. The main purpose of profiling was to avoid excessive punishment and to better integrate violators into the community. We discuss the contributions of profiling to moderation practice and suggest design mechanisms to facilitate mods' profiling processes.
  2. As each micro-community centered around a streamer sets its own guidelines in live streaming communities, it is common for volunteer moderators (mods) and the streamer to disagree on how to handle various situations. In this study, we conducted an online survey (N=240) with live streaming mods to explore their commitment to the streamer to grow the micro-community and the different styles in which they handle conflicts with the streamer. We found that 1) mods apply more active and cooperative styles than passive and assertive styles to manage conflicts, though they may be forced to do so, and 2) mods with strong commitments to the streamer prefer styles that show either high concern for the streamer or low concern for themselves. We reflect on how these results can affect micro-community development and recommend designs to mitigate conflict and strengthen commitment.
  3. Volunteer moderators actively engage in online content management, such as removing toxic content and sanctioning anti-normative behaviors in user-governed communities. The synchronicity and ephemerality of live-streaming communities pose unique moderation challenges. Based on interviews with 21 volunteer moderators on Twitch, we mapped out 13 moderation strategies and presented them in relation to the bad act, enabling us to categorize them from proactive and reactive perspectives and to identify communicative and technical interventions. We found that the act of moderation involves both highly visible, performative activities in the chat and invisible activities involving coordination and sanction. The juxtaposition of real-time individual decision-making with collaborative discussions, and the dual nature of moderators' visible and invisible activities, provide a unique lens into a role that relies heavily on both the social and the technical. We also discuss how the affordances of live streaming contribute to these unique activities.
  4. Modern social media platforms like Twitch and YouTube embody an open space for content creation and consumption. However, an unintended consequence of such content democratization is the proliferation of toxicity and abuse directed at content creators. Commercial and volunteer content moderators play an indispensable role in identifying bad actors and minimizing the scale and degree of harmful content. Moderation tasks are often laborious and complex, and even when semi-automated, they involve high-consequence human decisions that affect the safety and public perception of the platforms. In this paper, through an interdisciplinary collaboration among researchers from social science, human-computer interaction, and visualization, we present a systematic understanding of how visual analytics can help in human-in-the-loop content moderation. We contribute a characterization of the data-driven problems and needs of proactive moderation and present a mapping between those needs and visual analytics tasks through a task abstraction framework. We discuss how the task abstraction framework can be used for transparent moderation, for designing interventions for moderators' well-being, and, ultimately, for creating future human-machine interfaces for data-driven content moderation.
  5. Content moderation is a crucial aspect of online platforms, and it requires human moderators (mods) to repeatedly review and remove harmful content. This moderation process can lead to cognitive overload and emotional labor for mods. As new platforms and designs emerge, such as live streaming spaces, new challenges arise from the real-time nature of the interactions. In this study, we examined the use of ignoring as a moderation strategy by interviewing 19 Twitch mods. Our findings indicate that ignoring involves complex cognitive processes and significant invisible labor in the decision-making process. Additionally, we found that ignoring is an essential component of real-time moderation. These preliminary findings suggest that ignoring can be a valuable moderation strategy in future interactive systems, which highlights the need to design better support for ignoring in interactive live-streaming systems.