

Title: What are Effective Strategies of Handling Harassment on Twitch?: Users' Perspectives
Harassment is an issue in online communities, and the live streaming platform Twitch is no exception. In this study, we surveyed 375 Twitch users in person at TwitchCon, asking who they thought should be responsible for deciding what should be allowed and which strategies they perceived to be effective in handling harassment. We found that users thought streamers should be most responsible for enforcing rules, and that blocking bad actors, ignoring them, or trying to educate them were the most effective strategies.
Award ID(s): 1841354
NSF-PAR ID: 10178899
Journal Name: CSCW '19: Conference Companion Publication of the 2019 on Computer Supported Cooperative Work and Social Computing
Page Range / eLocation ID: 166 to 170
Sponsoring Org: National Science Foundation
More Like this
  1. Rules and norms are critical to community governance. Live streaming communities like Twitch consist of thousands of micro-communities called channels. We conducted two studies to understand these micro-community rules. Study one suggests that Twitch users perceive that both rule transparency and communication frequency matter to a channel's vibe and the frequency of harassment. Study two finds that the most popular channels have no channel or chat rules; among those that do have rules, rules encouraged by streamers are prominent. We explain why this may happen and how it contributes to community moderation and future research.
  2. Introduction: Social media has created opportunities for children to gather social support online (Blackwell et al., 2016; Gonzales, 2017; Jackson, Bailey, & Foucault Welles, 2018; Khasawneh, Rogers, Bertrand, Madathil, & Gramopadhye, 2019; Ponathil, Agnisarman, Khasawneh, Narasimha, & Madathil, 2017). However, social media also has the potential to expose children and adolescents to undesirable behaviors. Research has shown that social media can be used to harass, discriminate (Fritz & Gonzales, 2018), dox (Wood, Rose, & Thompson, 2018), and socially disenfranchise children (Page, Wisniewski, Knijnenburg, & Namara, 2018). Other research proposes that social media use might be correlated with the significant increase in suicide rates and depressive symptoms among children and adolescents over the past ten years (Mitchell, Wells, Priebe, & Ybarra, 2014). Evidence-based research suggests that suicidal and unwanted behaviors can be promulgated through social contagion effects, which model, normalize, and reinforce self-harming behavior (Hilton, 2017). These harmful behaviors and social contagion effects may occur more frequently through repetitive exposure and modelling via social media, especially when such content goes “viral” (Hilton, 2017). One example of viral self-harming behavior that has generated significant media attention is the Blue Whale Challenge (BWC). The hearsay about this challenge is that individuals of all ages are persuaded to participate in self-harm and eventually kill themselves (Mukhra, Baryah, Krishan, & Kanchan, 2017). Research is needed specifically concerning the ethical concerns around BWC, the effects the game may have on teenagers, and potential governmental interventions. To address this gap in the literature, the current study uses qualitative and content analysis research techniques to illustrate the risk of self-harm and suicide contagion through the portrayal of BWC in YouTube and Twitter posts. The purpose of this study is to analyze the portrayal of BWC on YouTube and Twitter in order to identify the themes presented in posts that share and discuss BWC. In addition, we want to explore to what extent YouTube videos comply with the safe and effective suicide messaging guidelines proposed by the Suicide Prevention Resource Center (SPRC).
    Method: Two social media websites were used to gather the data: 60 videos and 1,112 comments from YouTube and 150 posts from Twitter. The common themes of the YouTube videos, comments on those videos, and the Twitter posts were identified using grounded, thematic content analysis of the collected data (Padgett, 2001). Three codebooks were built, one for each type of data. The data for each site were analyzed, and the common themes were identified. A deductive coding analysis was conducted on the YouTube videos based on the nine SPRC safe and effective messaging guidelines (Suicide Prevention Resource Center, 2006). The analysis explored how many videos violated these guidelines and which guidelines were violated the most. The inter-rater reliabilities between the coders ranged from 0.61 to 0.81 based on Cohen's kappa; the coders then conducted consensus coding.
    Results & Findings: Three common themes were identified across all the posts included in this study. The first theme included posts where social media users were trying to raise awareness and warn parents about this dangerous phenomenon in order to reduce the risk of any potential participation in BWC. This was the most common theme in the videos and posts. Additionally, the posts claimed that more than 100 people have played BWC worldwide and provided detailed descriptions of what each individual did while playing the game. These videos also described the tasks and different names of the game. Only a few videos provided recommendations to teenagers who might be playing or thinking of playing the game, and fewer still mentioned that the provided statistics were not confirmed by reliable sources. The second theme included posts from people who either criticized the teenagers who participated in BWC or made fun of them, for a couple of reasons: they agreed with the stated purpose of BWC of “cleaning society of people with mental issues,” or they misunderstood why teenagers participate in these kinds of challenges, for example thinking they mainly participate due to peer pressure or to “show off.” The last theme we identified was that most of these users tended to speak in detail about someone who had already participated in BWC. These videos and posts provided information about the participants' demographics and interviews with their parents or acquaintances, who provided more details about the participant's personal life. The evaluation of the videos against the SPRC safe messaging guidelines showed that 37% of the YouTube videos met fewer than 3 of the 9 guidelines, around 50% met only 4 to 6 of the guidelines, and the remaining 13% met 7 or more of the guidelines.
    Discussion: This study is the first to systematically investigate the quality, portrayal, and reach of BWC on social media. Based on our findings from the emerging themes and the evaluation against the SPRC safe messaging guidelines, we suggest that these videos could contribute to the spread of these deadly challenges (or suicide in general, since the game might be a hoax) instead of raising awareness. Our suggestion parallels similar studies of the portrayal of suicide in traditional media (Fekete & Macsai, 1990; Fekete & Schmidtke, 1995). Most posts on social media romanticized people who have died by following this challenge, and younger vulnerable teens may see the victims as role models, leading them to end their lives in the same way (Fekete & Schmidtke, 1995). The videos presented statistics about the number of suicides believed to be related to this challenge in a way that made suicide seem common (Cialdini, 2003). In addition, the videos presented extensive personal information about the people who have died by suicide while playing the BWC. These videos also provided detailed descriptions of the final task, including pictures of self-harm, material that may encourage vulnerable teens to consider ending their lives and provide them with methods for doing so (Fekete & Macsai, 1990). On the other hand, these videos failed both to emphasize prevention by highlighting effective treatments for mental health problems and to encourage teenagers with mental health problems to seek help and provide information on where to find it. YouTube and Twitter are capable of influencing a large number of teenagers (Khasawneh, Ponathil, Firat Ozkan, & Chalil Madathil, 2018; Pater & Mynatt, 2017). We suggest that it is urgent to monitor social media posts related to BWC and similar self-harm challenges (e.g., the Momo Challenge). Additionally, the SPRC should properly educate social media users, particularly those with more influence (e.g., celebrities), on elements that boost negative contagion effects. While the veracity of these challenges is doubted by some, posting about them in unsafe ways can contribute to contagion regardless of the challenges' true nature.
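    The abstract above reports inter-rater reliability as Cohen's kappa in the 0.61 to 0.81 range without spelling out the computation. As a rough illustration only, the sketch below computes Cohen's kappa for two hypothetical coders judging whether each of ten videos violates a single guideline; the label lists, function name, and resulting value are invented for illustration and are not taken from the study.

    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        """Cohen's kappa for two coders who labeled the same items."""
        n = len(coder_a)
        assert n == len(coder_b) and n > 0
        # Observed agreement: fraction of items both coders labeled identically.
        p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        # Chance agreement: product of each coder's label proportions, summed over labels.
        counts_a, counts_b = Counter(coder_a), Counter(coder_b)
        labels = set(coder_a) | set(coder_b)
        p_e = sum((counts_a[lab] / n) * (counts_b[lab] / n) for lab in labels)
        if p_e == 1.0:  # degenerate case: both coders always used the same single label
            return 1.0
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical labels: 1 = the video violates the guideline, 0 = it does not.
    coder_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
    coder_2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
    print(round(cohens_kappa(coder_1, coder_2), 2))  # 0.58

    By the commonly used Landis and Koch benchmarks, kappa values between 0.61 and 0.80 are usually read as substantial agreement, which is consistent with the coders then resolving remaining disagreements through consensus coding.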
  3. Online harassment refers to a wide range of harmful behaviors, including hate speech, insults, doxxing, and non-consensual image sharing. Social media platforms have developed complex processes to try to detect and manage content that may violate community guidelines; however, less work has examined the types of harms associated with online harassment or preferred remedies to that harassment. We conducted three online surveys with US adult Internet users measuring perceived harms and preferred remedies associated with online harassment. Study 1 found greater perceived harm associated with non-consensual photo sharing, doxxing, and reputational damage compared to other types of harassment. Study 2 found greater perceived harm with repeated harassment compared to one-time harassment, but no difference between individual and group harassment. Study 3 found variance in remedy preferences by harassment type; for example, banning users is rated highly in general, but is rated lower for non-consensual photo sharing and doxxing compared to harassing family and friends and damaging reputation. Our findings highlight that remedies should be responsive to harassment type and potential for harm. Remedies are also not necessarily correlated with harassment severity—expanding remedies may allow for more contextually appropriate and effective responses to harassment. 
  4. Live streaming is a form of media that allows streamers to directly interact with their audience. Previous research has explored mental health, Twitch.tv and live streaming platforms, and users' social motivations behind watching live streams separately. However, few have explored how these all intertwine in conversations involving intimate, self-disclosing topics, such as mental health. Live streams are unique in that they are largely masspersonal in nature; streamers broadcast themselves to mostly unknown viewers, but may choose to interact with them in a personal way. This study aims to understand users' motivations, preferences, and habits behind participating in mental health discussions on live streams. We interviewed 25 Twitch viewers about the streamers they watch, how they interact in mental health discussions, and how they believe streamers should discuss mental health on live streams. Our findings are contextualized in the dynamics in which these discussions occur. Overall, we found that the innate design of the Twitch platform promotes a user-hierarchy in the ecosystem of streamers and their communities, which may affect how mental health is discussed. 
  5. Present day ideals of good parenting are socio-technical constructs formed at the intersection of medical best practices, cultural norms, and technical innovation. These ideals take shape in relation to the fundamental uncertainty that parents/mothers face, an uncertainty that comes from not knowing how to do what is best for one's children, family, and self. The growing body of parent-focused smart devices and data-tracking platforms emerging from this intersection frames the responsible parent as one who evaluates, analyzes, and mitigates data-defined risks for their children and family. As these devices and platforms proliferate, whether from respected medical institutions or commercial interests, they place new demands on families and add an implicit emphasis on how humans (often mothers) can be augmented and improved by data-rich technology. This is expressed both in the actions they support (e.g., breastfeeding, monitoring food intake) and in the emotions they render marginal (e.g., rage, struggle, loss, and regret). In this article, we turn away from optimization and self-improvement narratives to attend to our own felt experiences as mothers and designers. Through an embodied practice of creating Design Memoirs, we speak directly to the HCI community from our position as both users and subjects of optimized parenting tools. Our goal in this work is to bring nuance to a domain that is often rendered in simplistic terms or that frames mothers as figures who could endlessly do more for the sake of their families. Our Design Memoirs emphasize the conflicting and often negative emotions we experienced while navigating these tools and medical systems. They depict our feelings of being at once powerful and powerless, expressing rage and love simultaneously, and struggling between expressing pride and humility. The Design Memoirs serve us in advocating that designers should use caution when applying a problem/solution framing to the experiences of parents. We conclude by reflecting on how our shared practice of making memoirs, as well as other approaches within feminist and queer theory, suggests strategies that trouble these optimization and improvement narratives. Overall, we present a case for designing for mothers who feel like they are just making do or falling short, in order to provide relief from the anxiety of constantly seeking improvement.