This content will become publicly available on June 3, 2025

Title: Trans-centered moderation: Trans technology creators and centering transness in platform and community governance
Mainstream platforms’ content moderation systems typically employ generalized “one-size-fits-all” approaches intended to serve both general and marginalized users, but these approaches often fail to meet marginalized users’ specific needs. Transgender people therefore often create their own technologies and moderation systems. In our interview study of transgender technology creators (n=115), we found that creators face issues of transphobic abuse and disproportionate content moderation. Trans tech creators address these issues by carefully moderating and vetting their userbases, centering trans contexts in content moderation systems, and employing collective governance and community models. Based on these findings, we argue that trans tech creators’ approaches to moderation offer important insights into how to better design for trans users and, ultimately, marginalized users in the larger platform ecology. We introduce the concept of trans-centered moderation: content moderation that reviews prospective users and vets out transphobic ones, appoints trans moderators to effectively moderate trans contexts, considers the limitations and constraints of technology for addressing social challenges, and employs collective governance and community models. Trans-centered moderation can help improve platform design for trans users while reducing the harm faced by trans people and marginalized users more broadly.
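As a loose illustration of the vetting and collective-governance components described above, the following is a minimal sketch, assuming a hypothetical community where trans moderators review membership applications and escalate ambiguous cases to collective review. None of the names, fields, or rules below come from the paper; they are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Decision(Enum):
    APPROVED = auto()
    DECLINED = auto()
    NEEDS_DISCUSSION = auto()   # escalate to collective/community review

@dataclass
class Application:
    applicant: str
    screening_answers: dict[str, str]                      # hypothetical vetting questions and answers
    abuse_flags: list[str] = field(default_factory=list)   # e.g., prior reports of transphobic abuse

def route_application(app: Application, trans_moderators: list[str]) -> Decision:
    """Route a membership application; the judgment itself stays with human
    trans moderators, this function only models the escalation paths."""
    if app.abuse_flags:
        return Decision.DECLINED          # vet out applicants with documented transphobic abuse
    if not app.screening_answers or not trans_moderators:
        return Decision.NEEDS_DISCUSSION  # incomplete or unstaffed cases go to collective review
    return Decision.APPROVED
```

The sketch only makes the escalation structure concrete; the paper's argument concerns who moderates and how decisions are governed, not automating those decisions.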
Award ID(s):
2210841 1942125
PAR ID:
10532701
Author(s) / Creator(s):
Publisher / Repository:
ACM
Date Published:
ISBN:
9798400704505
Page Range / eLocation ID:
326 to 336
Format(s):
Medium: X
Location:
Rio de Janeiro Brazil
Sponsoring Org:
National Science Foundation
More Like this
  1. Trans technology – technology created to help address challenges that trans people face – is an important area for innovation that can help improve marginalized people’s lives. We conducted 104 interviews with 115 creators of trans technology to understand how they involved trans people and communities in design processes. We describe projects that used human-centered design processes, as well as design processes that involved trans people in smaller ways, such as gathering feedback from users, conducting user testing, or the creators themselves being trans. We show how involving trans people and communities in design is vital for trans technologies to realize their potential for addressing trans needs. Yet we highlight a frequent gap between trans technology design and deployment, and discuss ways to bridge this gap. We argue for the importance of involving communities in trans technology design to ensure that trans technology achieves its promise of helping address trans needs and challenges.
  2. Extended reality (XR) technologies are becoming increasingly pervasive and may have the capacity to help marginalized groups such as transgender people. Drawing from interviews with n = 18 creators of trans technology, we examined how XR technologies do and can support trans people. We uncovered a number of creative ways that XR technologies support trans experiences. Trans technology creators are designing augmented reality (AR) and virtual reality (VR) systems that help people explore trans identity, experience new types of bodies, educate about and display trans stories and curated trans content, manipulate the physical world, and innovate gender-affirming surgical techniques. Additionally, we show how considering XR as an analogy for trans identity helps us think about the fluidity and fluctuation inherent in trans identity in new ways, which in turn enables envisioning technologies that can better support complex and changing identities. Despite XR’s potential for supporting trans people, current AR and VR systems face limitations that restrict their large-scale use; as access to XR systems increases, so will their capacity to improve trans lives.
  3. Social media users have long been aware of opaque content moderation systems and how they shape platform environments. On TikTok, creators increasingly use algospeak to circumvent unjust content restrictions; that is, they change or invent words to prevent TikTok’s content moderation algorithm from banning their videos (e.g., “le$bean” for “lesbian”). We interviewed 19 TikTok creators about their motivations and practices of using algospeak in relation to their experiences with TikTok’s content moderation. Participants largely anticipated how TikTok’s algorithm would read their videos, and used algospeak to evade unjustified content moderation while ensuring that target audiences could still find their videos. We identify non-contextuality, randomness, inaccuracy, and bias against marginalized communities as major issues regarding freedom of expression, equality of subjects, and support for communities of interest. Drawing on these algospeak practices, we argue for the need to improve contextually informed content moderation so as to valorize marginalized and tabooed audiovisual content on social media (see the first sketch after this list).
  4. Transparency matters a great deal to people who experience moderation on online platforms, and much CSCW research has viewed offering explanations as one of the primary ways to enhance moderation transparency. However, relatively little attention has been paid to unpacking what transparency entails in moderation design, especially for content creators. We interviewed 28 YouTubers to understand their moderation experiences and analyze the dimensions of moderation transparency. We identified four primary dimensions: participants desired the moderation system to present moderation decisions saliently, explain the decisions thoroughly, afford effective communication with users, and offer repair and learning opportunities. We discuss how these four dimensions are mutually constitutive and conditioned in the context of creator moderation, where the target of governance mechanisms extends beyond the content to creators’ careers. We then elaborate on how a dynamic transparency perspective could value content creators’ digital labor, how transparency design could support creators’ learning, and implications for transparency design on other creator platforms.
  5. The increasing harms caused by hate, harassment, and other forms of abuse online have motivated major platforms to explore hierarchical governance. The idea is to allow communities to have designated members take on moderation and leadership duties, while members can still escalate issues to the platform. But these promising approaches have only been explored in plaintext settings where community content is visible to the platform. It is unclear how to realize hierarchical governance in the huge and growing number of online communities that use end-to-end encrypted (E2EE) messaging for privacy. We propose the design of private, hierarchical governance systems. These should enable similar levels of community governance as in plaintext settings, while maintaining cryptographic privacy of content and of governance actions not reported to the platform. We design the first such system, taking a layered approach that adds governance logic on top of an encrypted messaging protocol; we show how an extension to the Messaging Layer Security (MLS) protocol suffices for achieving a rich set of governance policies. Our approach allows developers to rapidly prototype new governance features, taking inspiration from a plaintext system called PolicyKit. We report on an initial prototype encrypted messaging system called MlsGov that supports content-based community and platform moderation, elections of community moderators, votes to remove abusive users, and more (see the second sketch after this list).
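To make item 3's algospeak example concrete, here is a minimal sketch, assuming a toy exact-match keyword filter. The “le$bean”/“lesbian” pair comes from the abstract; the ALGOSPEAK mapping, the blocked-word set, and the normalization step are illustrative assumptions, not TikTok’s actual moderation pipeline.

```python
import re

# Toy illustration only: an exact-match keyword filter and the kind of
# character substitution ("algospeak") creators use to route around it.
# The "le$bean" -> "lesbian" pair comes from the abstract; everything else
# is a hypothetical stand-in for a real moderation system.

ALGOSPEAK = {"le$bean": "lesbian"}   # coded term -> plain term

def naive_filter(text: str, blocked: set[str]) -> bool:
    """Return True if exact keyword matching would suppress this text."""
    tokens = re.findall(r"[\w$]+", text.lower())
    return any(token in blocked for token in tokens)

def normalize(text: str) -> str:
    """Undo known algospeak substitutions before any contextual review."""
    for coded, plain in ALGOSPEAK.items():
        text = text.replace(coded, plain)
    return text

caption = "proud le$bean creator sharing my story"
print(naive_filter(caption, {"lesbian"}))             # False: the coded spelling evades the filter
print(naive_filter(normalize(caption), {"lesbian"}))  # True: normalization recovers the term, but
                                                      # judging context (identity vs. abuse) still
                                                      # requires contextually informed moderation
```

The sketch makes the abstract's point in miniature: exact-match filtering both misses coded terms and, even after normalization, cannot distinguish identity-affirming use from abuse without context.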
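For item 5, the following is a minimal sketch of the layered idea of carrying governance actions as ordinary application messages over an E2EE group channel. The action names, data shapes, and the send_encrypted callback are assumptions for illustration; they are not the MlsGov implementation or the MLS-extension API described in the paper.

```python
import json
from dataclasses import dataclass, asdict
from typing import Callable

@dataclass
class GovernanceAction:
    kind: str      # e.g., "nominate_moderator", "vote_remove_member", "report_to_platform"
    actor: str     # community member issuing the action
    payload: dict  # action-specific data (candidate, target member, reported message id, ...)

def encode(action: GovernanceAction) -> bytes:
    """Serialize a governance action so it can ride inside an E2EE application message."""
    return json.dumps(asdict(action)).encode()

def broadcast(action: GovernanceAction, send_encrypted: Callable[[bytes], None]) -> None:
    """Hand the encoded action to the underlying encrypted group channel.
    `send_encrypted` is a placeholder for whatever the MLS-based layer provides;
    the platform never sees the plaintext unless the action is a report to it."""
    send_encrypted(encode(action))

# Example: members vote to remove an abusive account; only group members can decrypt the vote.
broadcast(
    GovernanceAction(kind="vote_remove_member", actor="alice", payload={"target": "mallory"}),
    send_encrypted=lambda ciphertext: print(f"sent {len(ciphertext)} encrypted bytes"),  # stub transport
)
```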