

Search for: All records

Creators/Authors contains: "Wohn, Donghee Yvette"


  1. This design project set out to intervene in the current landscape of content moderation. Our team’s primary focus is community moderators, specifically volunteer moderators for online community spaces. Community moderators play a key role in upholding the guidelines and culture of online community spaces, as well as managing and protecting community members against harmful content online. Yet community moderators notably lack the official resources and training that their commercial counterparts have. To address this, we present ModeratorHub, a knowledge sharing platform that focuses on community moderation. In our current design stage, we focused on two features: (1) moderation case documentation and (2) moderation case sharing. These are our team’s initial building blocks of a larger intervention aimed at supporting moderators and promoting social support and collaboration among end users of online community ecosystems.
  2. Social media users may perceive platform moderation decisions differently, which can lead to frustration and dropout. This study investigates users’ perceived justice and fairness of online moderation decisions when they are exposed to illegal versus legal violation scenarios, retributive versus restorative moderation strategies, and user-moderated versus commercially moderated platforms. We conducted an online experiment with 200 American social media users of Reddit and Twitter. Results show that for illegal violations, retributive moderation delivers higher perceived justice and fairness on commercially moderated platforms than on user-moderated ones, while restorative moderation delivers higher perceived fairness for legal violations than for illegal ones. We discuss opportunities for platform policymaking to improve moderation system design.
  3. Online harassment and content moderation have been well documented in online communities. However, new contexts and systems bring new forms of harassment and require new moderation mechanisms. This study focuses on hate raids, a form of real-time group attack in live streaming communities. Through a qualitative analysis of hate raid discussions in the Twitch subreddit (r/Twitch), we found that (1) hate raids, as human-bot coordinated group attacks, leverage the live streaming system to target marginalized streamers and other potential groups with(out) breaking the rules, (2) marginalized streamers suffer compound harms with insufficient support from the platform, and (3) moderation strategies are overwhelmingly technical, yet streamers still struggle to balance moderation and participation given their marginalization status and needs. We use affordances as a lens to explain how hate raids happen in live streaming systems and propose moderation-by-design as a lens for developing new features or systems to mitigate the potential abuse of such designs.
  4. As each micro community centered around a streamer attempts to set its own guidelines in live streaming communities, it is common for volunteer moderators (mods) and the streamer to disagree on how to handle various situations. In this study, we conducted an online survey (N=240) with live streaming mods to explore their commitment to the streamer to grow the micro community and the different styles in which they handle conflicts with the streamer. We found that (1) mods apply more active and cooperative styles than passive and assertive styles to manage conflicts, but may feel forced to do so, and (2) mods with strong commitments to the streamer prefer styles showing either high concern for the streamer or low concern for themselves. We reflect on how these results can affect micro community development and recommend designs to mitigate conflict and strengthen commitment.
  5. Content moderation is a crucial aspect of online platforms, and it requires human moderators (mods) to repeatedly review and remove harmful content. However, this moderation process can lead to cognitive overload and emotional labor for mods. As new platforms and designs emerge, such as live streaming spaces, new challenges arise due to the real-time nature of the interactions. In this study, we examined the use of ignoring as a moderation strategy by interviewing 19 Twitch mods. Our findings indicate that ignoring involves complex cognitive processes and significant invisible labor in the decision-making process, and that ignoring is an essential component of real-time moderation. These preliminary findings suggest that ignoring has the potential to be a valuable moderation strategy in future interactive systems, highlighting the need to design better support for ignoring in interactive live streaming systems.
  6. When people have the freedom to create and post content on the internet, particularly anonymously, they do not always respect the rules and regulations of the websites on which they post, leaving other unsuspecting users vulnerable to sexism, racism, threats, and other unacceptable content in their daily cyberspace diet. Content moderators, however, witness the worst of humanity on a daily basis in place of the average netizen. This takes its toll on moderators, causing stress, fatigue, and emotional distress akin to the symptomology of post-traumatic stress disorder (PTSD). The goal of the present study was to explore whether adding positive stimuli to break times (images of baby animals or beautiful, awe-inspiring landscapes) could help reduce the negative side effects of being a content moderator. To test this, we had over 300 experienced content moderators read and decide whether 200 fake text-based social media posts were acceptable for public consumption. Although we set out to test positive emotional stimulation, we actually found that the cumulative nature of the negative emotions likely negates most of the effects of the intervention: the longer a person had practiced content moderation, the stronger their negative experience. Connections to compassion fatigue and how best to spend work breaks as a content moderator are discussed.
  7. Volunteer moderators (mods) play significant roles in developing moderation standards and dealing with harmful content in their micro-communities. However, little work explores how volunteer mods work as a team. In line with prior work on understanding volunteer moderation, we interviewed 40 volunteer mods on Twitch — a leading live streaming platform. We identify how mods collaborate on tasks (off-stream coordination and preparation, in-stream real-time collaboration, and relationship building both off-stream and in-stream to reinforce collaboration) and how mods contribute to moderation standards (collaboratively working on the community rulebook and individually shaping community norms). We uncover how volunteer mods work as an effective team. We also discuss how the affordances of multi-modal communication and the informality of volunteer moderation contribute to task collaboration, standards development, and mods’ roles and responsibilities.
  8. Content moderation is an essential part of online community health and governance. While much extant research centers on what happens to the content, moderation also involves the management of violators. This study focuses on how moderators (mods) decide on their actions after a violation takes place but before the sanction, by examining how they "profile" violators. Through observations and interviews with volunteer mods on Twitch, we found that mods engage in a complex process of collaborative evidence collection and profile violators into different categories to decide the type and extent of punishment. Mods consider violators' characteristics as well as behavioral history and violation context before taking moderation action. The main purpose of profiling was to avoid excessive punishment and to better integrate violators into the community. We discuss the contributions of profiling to moderation practice and suggest design mechanisms to facilitate mods' profiling processes.
  9. The digital patronage model provides content creators the opportunity to receive sustained financial support directly from their fans. Patreon is a popular digital patronage platform that represents a prime site for the study of creators’ relational labor with their fans. Through in-depth interviews with 21 Patreon creators, this study investigated different types of creator–patron relationships and the perceived benefits and challenges of carrying out relational labor. We found that creators construct a variety of relationships with patrons, ranging from purely transactional to intimately familial. Creators benefit from relational labor in that it encourages patrons to treat the creator as a person rather than a product, resulting in both financial and emotional support. However, creators face difficulties in maintaining appropriate relational boundaries with patrons, some of whom control a substantial part of a creator’s income.
  10. Live streaming is a form of media that allows streamers to interact directly with their audience. Previous research has separately explored mental health, Twitch.tv and live streaming platforms, and users' social motivations for watching live streams. However, few studies have explored how these intertwine in conversations involving intimate, self-disclosing topics such as mental health. Live streams are unique in that they are largely masspersonal in nature; streamers broadcast themselves to mostly unknown viewers but may choose to interact with them in a personal way. This study aims to understand users' motivations, preferences, and habits in participating in mental health discussions on live streams. We interviewed 25 Twitch viewers about the streamers they watch, how they interact in mental health discussions, and how they believe streamers should discuss mental health on live streams. Our findings are contextualized in the dynamics in which these discussions occur. Overall, we found that the innate design of the Twitch platform promotes a user hierarchy in the ecosystem of streamers and their communities, which may affect how mental health is discussed.