Twitch is one of the largest live streaming platforms and differs from other social media in that it supports synchronous interaction and lets users moderate content through a variety of technical tools, including auto-moderation tools provided by Twitch, third-party applications, and homebrew apps. The authors interviewed 21 moderators on Twitch, categorized the features of the real-time moderation tools they currently use into four functions (chat control, content control, viewer control, and settings control), and explored new features they wish they had (e.g., grouping chat by language, a pop-out window to hold messages, chat slow-down, a set of buttons with pre-written message content, viewer activity tracking, and an all-in-one tool). The design implications offer suggestions for chatbot and algorithm design and development.
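To make the "chat control" category concrete, here is a minimal sketch, not taken from the paper, of the kind of third-party moderation bot the moderators describe: it connects to Twitch's IRC-compatible chat gateway, filters messages against a blocklist, and posts a pre-written reply on a command trigger. The OAuth token, bot name, channel, blocklist terms, and the `!rules` trigger are all placeholders for illustration.

```python
# Minimal sketch of a third-party Twitch "chat control" bot (illustrative only).
# Placeholders: TOKEN, BOT_NICK, CHANNEL, BLOCKLIST, and the !rules trigger.
import socket

HOST, PORT = "irc.chat.twitch.tv", 6667   # Twitch's IRC-compatible chat gateway
TOKEN = "oauth:YOUR_TOKEN_HERE"           # placeholder: a chat OAuth token
BOT_NICK = "mod_bot"                      # placeholder: bot account name
CHANNEL = "#examplestreamer"              # placeholder: channel to moderate
BLOCKLIST = {"badword1", "badword2"}      # placeholder: terms to filter
CANNED_REPLY = "Please keep chat respectful; the channel rules are in the panel below."


def send(sock: socket.socket, msg: str) -> None:
    """Send one raw IRC line (IRC lines are CRLF-terminated)."""
    sock.sendall((msg + "\r\n").encode("utf-8"))


def main() -> None:
    sock = socket.create_connection((HOST, PORT))
    send(sock, f"PASS {TOKEN}")
    send(sock, f"NICK {BOT_NICK}")
    send(sock, f"JOIN {CHANNEL}")
    buffer = ""
    while True:
        buffer += sock.recv(4096).decode("utf-8", errors="replace")
        parts = buffer.split("\r\n")
        lines, buffer = parts[:-1], parts[-1]   # keep any partial line for next read
        for line in lines:
            if line.startswith("PING"):         # server keepalive: must answer PONG
                send(sock, line.replace("PING", "PONG", 1))
                continue
            if "PRIVMSG" not in line:
                continue
            # Chat messages look like ":nick!user@host PRIVMSG #channel :text"
            user = line[1:line.index("!")]
            text = line.split(f"PRIVMSG {CHANNEL} :", 1)[-1].lower()
            if any(term in text for term in BLOCKLIST):
                # Sending moderation commands as chat messages worked when these
                # studies were conducted; see the note below about the current API.
                send(sock, f"PRIVMSG {CHANNEL} :/timeout {user} 600")
            elif text.startswith("!rules"):     # a "button"-like pre-written reply
                send(sock, f"PRIVMSG {CHANNEL} :{CANNED_REPLY}")


if __name__ == "__main__":
    main()
```

The `/timeout` chat command reflects how moderation actions were issued over IRC around the time of these studies; Twitch has since moved such actions to its Helix API, so a present-day bot would call that API instead of sending slash commands in chat.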
Moderation Visibility: Mapping the Strategies of Volunteer Moderators in Live Streaming Micro Communities
Volunteer moderators actively engage in online content management, such as removing toxic content and sanctioning anti-normative behaviors in user-governed communities. The synchronicity and ephemerality of live-streaming communities pose unique moderation challenges. Based on interviews with 21 volunteer moderators on Twitch, we mapped out 13 moderation strategies and present them in relation to the bad act, enabling us to categorize them from proactive and reactive perspectives and to identify communicative and technical interventions. We found that the act of moderation involves both highly visible, performative activities in the chat and invisible activities involving coordination and sanctioning. The juxtaposition of real-time individual decision-making with collaborative discussion, and the dual nature of moderators' visible and invisible activities, provide a unique lens into a role that relies heavily on both the social and the technical. We also discuss how the affordances of live streaming contribute to these unique activities.
- Journal Name: IMX '21: ACM International Conference on Interactive Media Experiences
- Page Range or eLocation-ID: 61-72
- Sponsoring Org: National Science Foundation
More Like this
Volunteer Moderators in Twitch Micro Communities: How They Get Involved, the Roles They Play, and the Emotional Labor They Experience

The ability to engage in real-time text conversations is an important feature on live streaming platforms. The moderation of this text content relies heavily on the work of unpaid volunteers. This study reports on interviews with 20 people who moderate for Twitch micro communities, defined as channels built around a single streamer or group of streamers rather than the broadcast of an event. The study identifies how people become moderators, their different styles of moderating, and the difficulties that come with the job. In addition to the hardship of dealing with negative content, moderators also have complex interpersonal relationships with the streamers and viewers, in which the boundaries between emotional labor, physical labor, and fun are intertwined.
Live streaming is a form of interactive media that potentially makes streamers more vulnerable to harassment, because its technology facilitates enhanced information sharing via video and audio. In this study, we document the harassment experiences of 25 live streamers on Twitch from underrepresented groups, including women and LGBTQ streamers, and investigate how they handle and prevent adversity. In particular, live streaming enables streamers to self-moderate their communities, so we examine how they manage their communities from both social and technical perspectives. We found that technology can cover the basics of handling negativity, but much emotional and relational work goes into moderation, community maintenance, and self-care.
Pride and Professionalization in Volunteer Moderation: Lessons for Effective Platform-User Collaboration

While most moderation actions on major social platforms are performed by either the platforms themselves or volunteer moderators, it is rare for platforms to collaborate directly with moderators to address problems. This paper examines how the group-chatting platform Discord coordinated with experienced volunteer moderators to respond to hate and harassment toward LGBTQ+ communities during Pride Month, June 2021, in what came to be known as the "Pride Mod" initiative. Representatives from Discord and volunteer moderators collaboratively identified and communicated with targeted communities, and volunteers temporarily joined servers that requested support to supplement those servers' existing volunteer moderation teams. Though LGBTQ+ communities were subject to a wave of targeted hate during Pride Month, the communities that received the requested volunteer support reported having a better capacity to handle the issues that arose. This paper reports the results of interviews with 11 moderators who participated in the initiative as well as the Discord employee who coordinated it. We show how this initiative was made possible by the way Discord has cultivated trust and built formal connections with its most active volunteers, and discuss the ethical implications of formal collaborations between for-profit platforms and volunteer users.
Adopting new technology is challenging for the volunteer moderation teams of online communities, and these challenges are aggravated as communities grow. In a prior qualitative study, Kiene et al. found evidence that moderation teams adapted by drawing on their experience with other platforms to guide the creation and adoption of innovative custom moderation "bots." In this study, we test three hypotheses, drawn from that qualitative work, about the social correlates of user-innovated bot usage. We find strong evidence for the proposed relationship between community size and the use of user-innovated bots. Although the prior work suggests that smaller teams of moderators would be more likely to use these bots, and that users with moderation experience on the previous platform would be more likely to do so, we find little evidence in support of either proposition.