Search for: All records

Creators/Authors contains: "Wohn, Donghee Yvette"


  1. Digital patronage is an emergent revenue model in which fans provide recurring financial support to a content creator, as exemplified through platforms like Twitch and Patreon. Whereas previous research has investigated creator-supporter relationships, the current study investigates creators’ multi-platform practices through in-depth interviews. We build on trends in creative labor studies and communication to examine how creators perceive Patreon and integrate it into their existing workflow. This study’s findings contribute to a better understanding of the role of digital patronage within the broader ecosystem of creative labor platforms.

    Free, publicly-accessible full text available May 22, 2023
  2. The digital patronage model provides content creators the opportunity to receive sustained financial support directly from their fans. Patreon is a popular digital patronage platform that represents a prime site for the study of creators’ relational labor with their fans. Through in-depth interviews with 21 Patreon creators, this study investigated different types of creator–patron relationships and the perceived benefits and challenges of carrying out relational labor. We found that creators construct a variety of relationships with patrons, ranging from purely transactional to intimately familial. Creators benefit from relational labor in that it encourages patrons to treat the creator as a person rather than a product, resulting in both financial and emotional support. However, creators face difficulties in maintaining appropriate relational boundaries with patrons, some of whom control a substantial part of a creator’s income.
  3. Volunteer moderators actively engage in online content management, such as removing toxic content and sanctioning anti-normative behaviors in user-governed communities. The synchronicity and ephemerality of live-streaming communities pose unique moderation challenges. Based on interviews with 21 volunteer moderators on Twitch, we mapped out 13 moderation strategies and presented them in relation to the bad act, enabling us to categorize them from proactive and reactive perspectives and identify communicative and technical interventions. We found that the act of moderation involves highly visible and performative activities in the chat as well as invisible activities involving coordination and sanction. The juxtaposition of real-time individual decision-making with collaborative discussions, and the dual nature of moderators' visible and invisible activities, provide a unique lens into a role that relies heavily on both the social and the technical. We also discuss how the affordances of live streaming contribute to these unique activities.
  4. Live streaming is a form of interactive media that potentially makes streamers more vulnerable to harassment due to the unique attributes of the technology, which facilitates enhanced information sharing via video and audio. In this study, we document the harassment experiences of 25 live streamers on Twitch from underrepresented groups, including women and/or LGBTQ streamers, and investigate how they handle and prevent adversity. In particular, live streaming enables streamers to self-moderate their communities, so we delve into how they manage their communities from both a social and a technical perspective. We found that technology can cover the basics of handling negativity, but much emotional and relational work is invested in moderation, community maintenance, and self-care.
  5. Rules and norms are critical to community governance. Live streaming communities like Twitch consist of thousands of micro-communities called channels. We conducted two studies to understand the rules of these micro-communities. Study one suggests that Twitch users perceive that both rule transparency and communication frequency matter to a channel's vibe and the frequency of harassment. Study two finds that the most popular channels have no channel or chat rules; among those that do have rules, rules encouraged by streamers are prominent. We explain why this may happen and how it contributes to community moderation and future research.
  6. Content moderation is a critical service performed by a variety of people on social media, protecting users from offensive or harmful content by reviewing and removing either the content or the perpetrator. These moderators fall into one of two categories: employees or volunteers. Prior research has suggested that there are differences in the effectiveness of these two types of moderators, with the more transparent user-based moderation being useful for educating users. However, direct comparisons between commercially-moderated and user-moderated platforms are rare, and apart from the difference in transparency, we still know little about what other disparities in user experience these two moderator types may create. To explore this, we conducted cross-platform surveys of over 900 users of commercially-moderated (Facebook, Instagram, Twitter, and YouTube) and user-moderated (Reddit and Twitch) social media platforms. Our results indicated that although user-moderated platforms did seem to be more transparent than commercially-moderated ones, this did not lead to user-moderated platforms being perceived as less toxic. In addition, commercially-moderated platform users want companies to take more responsibility for content moderation than they currently do, while user-moderated platform users want designated moderators and those who post on the site to take more responsibility. Across platforms, users seem to feel powerless and want to be taken care of when it comes to content moderation, as opposed to engaging in it themselves.
  7. Live streaming is a unique form of media that creates a direct line of interaction between streamers and viewers. While previous research has explored the social motivations of those who stream and watch streams in the gaming community, there is a lack of research investigating intimate self-disclosure in this context, such as discussing sensitive topics like mental health on platforms such as Twitch. This study aims to explore discussions about mental health in gaming live streams to better understand how people perceive discussions of mental health in this new media context. The context of live streaming is particularly interesting because it facilitates social interactions that are masspersonal in nature: the streamer broadcasts to a large, mostly unknown audience, but can also interact in a personal way with individual viewers. In this study, we interviewed Twitch viewers about the streamers they watch, how and to what extent those streamers discuss mental health on their channels in relation to gaming, how other viewers reacted to these discussions, and what they think about live streams, gaming-focused or otherwise, as a medium for mental health discussions. Through these interviews, our team was able to establish a baseline of user perceptions of mental health in gaming communities on Twitch that extends our understanding of how social media and live streaming can be used for mental health conversations. Our first research question revealed that mental health discussions happen in a variety of ways on Twitch, including during gaming streams, Just Chatting sessions, and through the stream chat. Our second research question showed that streamers handle mental health conversations on their channels in a variety of ways, depending on how they have built their channels, which in turn shapes how viewers perceive mental health. Lastly, we learned that viewers' reactions to mental health discussions depend on their motivations for watching the stream, such as learning about the game or being entertained. We also found that more discussion of mental health on Twitch led some viewers to be more cautious when talking about mental health in order to show understanding.
  8. Online harassment has become an unavoidable issue, and many people are seeking ways to mitigate it. In this study, we conducted a systematic review of online harassment interventions, focusing on studies that proposed online mechanisms and designed experiments to test their effects. We collected 17 studies from scholarly databases that met our criteria and categorized their interventions into 7 groups based on the theoretical or design-related mechanism used to justify each intervention. Finally, we critically reviewed these studies and proposed directions for future research.