Search for: All records

Award ID contains: 1942125

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

  1. Queer users of Douyin, the Chinese version of TikTok, suspect that the platform removes and suppresses queer content, thus reducing queer visibility. In this study, we examined how Chinese queer users recognize and react to Douyin’s moderation of queer content by conducting interviews with 21 queer China-based Douyin content creators and viewers. Findings indicate that queer users actively explore and adapt to the platform’s underlying moderation logic. They employ creative content and posting strategies to reduce the likelihood of their expressions of queer topics and identities being removed or suppressed. Like Western platforms, Douyin’s moderation approaches are often ambiguous; but unlike Western platforms, queer users sometimes receive clarity on moderation reasons via direct communication with moderators. Participants suggested that Douyin’s repressive moderation practices are influenced by more than just platform policies and procedures – they also reflect state-led homophobia and societal discipline. This study underscores the challenges Chinese queer communities face in maintaining online visibility and suggests that meaningful change in their experiences is unlikely without broader societal shifts towards queer acceptance.
    Free, publicly-accessible full text available April 25, 2026
  2. Content creators with marginalized identities are disproportionately affected by shadowbanning on social media platforms, which impacts their economic prospects online. Through a diary study and interviews with eight marginalized content creators who are women, pole dancers, plus-size, and/or LGBTQIA+, this paper examines how content creators with marginalized identities experience shadowbanning. We highlight the labor and economic inequalities of shadowbanning, and the resulting invisible online labor that marginalized creators often must perform. We identify three types of invisible labor that marginalized content creators engage in to mitigate shadowbanning and sustain their online presence: mental and emotional labor, misdirected labor, and community labor. We conclude that even though marginalized content creators engaged in cross-platform collaborative labor and personal mental/emotional labor to mitigate the impacts of shadowbanning, it was insufficient to prevent the uncertainty and economic precarity created by algorithmic opacity and ambiguity.
    Free, publicly-accessible full text available January 10, 2026
  3. Mainstream platforms’ content moderation systems typically employ generalized “one-size-fits-all” approaches, intended to serve both general and marginalized users. Thus, transgender people must often create their own technologies and moderation systems to meet their specific needs. In our interview study of transgender technology creators (n=115), we found that creators face issues of transphobic abuse and disproportionate content moderation. Trans tech creators address these issues by carefully moderating and vetting their userbases, centering trans contexts in content moderation systems, and employing collective governance and community models. Based on these findings, we argue that trans tech creators’ approaches to moderation offer important insights into how to better design for trans users, and ultimately, marginalized users in the larger platform ecology. We introduce the concept of trans-centered moderation – content moderation that reviews and successfully vets transphobic users, appoints trans moderators to effectively moderate trans contexts, considers the limitations and constraints of technology for addressing social challenges, and employs collective governance and community models. Trans-centered moderation can help to improve platform design for trans users while reducing the harm faced by trans people and marginalized users more broadly. 
    Free, publicly-accessible full text available June 3, 2025
  4. Transgender and nonbinary social media users experience disproportionate content removals on social media platforms, even when content does not violate platforms’ guidelines. In 2022, the Oversight Board, which oversees Meta platforms’ content moderation decisions, invited public feedback on Instagram’s removal of two trans users’ posts featuring their bare chests, presenting a unique opportunity to hear trans users’ feedback on how nudity and sexual activity policies impacted them. We conducted a qualitative analysis of 83 comments made public during the Oversight Board’s public comment process. Commenters criticized Meta’s nudity policies as enforcing a cisnormative view of gender while making it unclear how images of trans users’ bodies are moderated, enabling the disproportionate removal of trans content and limiting trans users’ ability to use Meta’s platforms. Yet there was significant divergence among commenters about how to address cisnormative moderation. Some commenters suggested that Meta clarify nudity guidelines, while others suggested that Meta overhaul them entirely, removing gendered distinctions or fundamentally reconfiguring the platform’s relationship to sexual content. We then discuss how the Oversight Board’s public comment process demonstrates the value of incorporating trans people’s feedback when developing policies related to gender and nudity, and argue that Meta must go beyond revising policy language by reevaluating how cisnormative values are encoded in all aspects of its content moderation systems.
    Free, publicly-accessible full text available June 3, 2025
  5. Social media users create folk theories to help explain how elements of social media operate. Marginalized social media users face disproportionate content moderation and removal on social media platforms. We conducted a qualitative interview study (n = 24) to understand how marginalized social media users may create folk theories in response to content moderation and their perceptions of platforms’ spirit, and how these theories may relate to their marginalized identities. We found that marginalized social media users develop folk theories informed by their perceptions of platforms’ spirit to explain instances where their content was moderated in ways that violate their perceptions of how content moderation should work in practice. These folk theories typically address content being removed despite not violating community guidelines, along with bias against marginalized users embedded in guidelines. We provide implications for platforms, such as using marginalized users’ folk theories as tools to identify elements of platform moderation systems that function incorrectly and disproportionately impact marginalized users. 
  6. Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts suspended, the processes governing content moderation are largely invisible, making assessing content moderation bias difficult. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. Yet on Twitch, a live chat and streaming platform, content moderation practices are visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming those who must see offensive content and at other times allowing for increased platform accountability.