Title: A Just and Comprehensive Strategy for Using NLP to Address Online Abuse
Online abusive behavior affects millions, and the NLP community has attempted to mitigate this problem by developing technologies to detect abuse. However, current methods have largely focused on a narrow definition of abuse, to the detriment of victims who seek both validation and solutions. In this position paper, we argue that the community needs to make three substantive changes: (1) expanding our scope of problems to tackle both more subtle and more serious forms of abuse, (2) developing proactive technologies that counter or inhibit abuse before it harms, and (3) reframing our efforts within a framework of justice to promote healthy communities.
Award ID(s):
1822228
PAR ID:
10128426
Author(s) / Creator(s):
; ;
Date Published:
Journal Name:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Mainstream platforms’ content moderation systems typically employ generalized “one-size-fits-all” approaches, intended to serve both general and marginalized users. Thus, transgender people must often create their own technologies and moderation systems to meet their specific needs. In our interview study of transgender technology creators (n=115), we found that creators face issues of transphobic abuse and disproportionate content moderation. Trans tech creators address these issues by carefully moderating and vetting their userbases, centering trans contexts in content moderation systems, and employing collective governance and community models. Based on these findings, we argue that trans tech creators’ approaches to moderation offer important insights into how to better design for trans users, and ultimately, marginalized users in the larger platform ecology. We introduce the concept of trans-centered moderation – content moderation that reviews and successfully vets transphobic users, appoints trans moderators to effectively moderate trans contexts, considers the limitations and constraints of technology for addressing social challenges, and employs collective governance and community models. Trans-centered moderation can help to improve platform design for trans users while reducing the harm faced by trans people and marginalized users more broadly. 
  2. Inaccurate assumptions about people who abuse technology can inhibit effective socio-technical interventions for at-risk populations, including IPV survivors. Our study aims to rectify this oversight through a synthesis of seven research projects on how 152 abusive partners (APs) discuss and understand their malicious use of technology in face-to-face interactions. AP accounts are rich sources of insight into technology abuse, but they reveal wide variation in APs' awareness of, choice to engage in, and ability to desist from technology abuse. To ensure immediate practical benefits for practitioner communities, we also engaged 20 facilitators of abusive partner intervention programs (APIPs) in focus group discussions to identify potential solutions for addressing technology abuse in their programming. Findings reveal that facilitators grapple with a complex set of challenges, stemming from concerns about teaching APs new abusive techniques in-session and from a lack of professional tools to investigate, evaluate, and resolve technology abuse attacks. Our work provides valuable insights into addressing technology abuse in the APIP ecosystem, offering targeted lessons for the CSCW community and stakeholders in violence prevention.
  3. Loader, B. (Ed.)
    When researchers invoke the term ‘last billion’ to refer to emerging ICT users, they often focus on network access as a ‘solution’ while neglecting important considerations such as local ownership or knowledge, both of which are essential to sustainable and empowering uses of these technologies in developing contexts. Research reveals that mere access to networks without active community involvement can fail to empower already marginalized and disenfranchised users. Building upon these findings, this article uses ethnographic methods to explore the meanings of ‘network sovereignty’ in rural, low-income communities in developing countries. It presents two case studies focused on local network initiatives in Oaxaca, Mexico, and Bunda, Tanzania, and then offers an assessment matrix to support future network sovereignty research based on five categories: community engagement; local cultures/ontologies; digital education and technological knowledge; economic ownership; and community empowerment. Our comparative research reveals that communities that are able to assert collective ownership over local infrastructure, embed network initiatives within local cultures, and prioritize digital education are much more likely to create and sustain local networks that support their economic, political, and cultural lives.
  4. User reporting is an essential component of content moderation on many online platforms, in particular on end-to-end encrypted (E2EE) messaging platforms where platform operators cannot proactively inspect message contents. However, users' privacy concerns when considering reporting may impede the effectiveness of this strategy in regulating online harassment. In this paper, we conduct interviews with 16 users of E2EE platforms to understand their mental models of how reporting works and their resulting privacy concerns and considerations surrounding reporting. We find that users expect platforms to store rich longitudinal reporting datasets, recognizing both their promise for better abuse mitigation and the privacy risk that platforms may exploit or fail to protect them. We also find that users have preconceptions about the respective capabilities and risks of moderators at the platform versus the community level: for instance, users trust platform moderators more not to abuse their power, but think community moderators have more time to attend to reports. These considerations, along with the perceived effectiveness of reporting and how to provide sufficient evidence while maintaining privacy, shape how users decide whether, to whom, and how much to report. We conclude with design implications for a more privacy-preserving reporting system on E2EE messaging platforms.
  5. As interest in metadata-hiding communication grows in both research and practice, a need exists for stronger abuse-reporting features on metadata-hiding platforms. While message franking has been deployed on major end-to-end encrypted platforms as a lightweight and effective abuse-reporting feature, there is no comparable technique for metadata-hiding platforms. Existing efforts to support abuse reporting in this setting, such as asymmetric message franking or the Hecate scheme, require order-of-magnitude increases in client and server computation or fundamental changes to the architecture of messaging systems. As a result, while metadata-hiding communication inches closer to practice, critical content-moderation concerns remain unaddressed. This paper demonstrates that, for broad classes of metadata-hiding schemes, lightweight abuse reporting can be deployed with minimal changes to the overall architecture of the system. Our insight is that much of the structure needed to support abuse reporting already exists in these schemes; by taking a non-generic approach, we can reuse this structure to achieve abuse reporting with minimal overhead. In particular, we show how to modify schemes based on secret-sharing user inputs to support a message franking-style protocol. Compared to prior work, our shared franking technique more than halves the time to prepare a franked message and gives order-of-magnitude reductions in server-side message processing time, as well as in the time to decrypt a message and verify a report.
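For readers unfamiliar with the baseline that the last abstract builds on, the sketch below illustrates classic message franking as deployed on single-provider E2EE messengers: the sender commits to the plaintext with a one-time franking key, the platform tags the commitment without ever seeing the message, and a report reveals the opening so the platform can verify it. This is only an illustrative Python sketch, not the paper's shared-franking construction; the function names, key handling, and metadata format are assumptions, and real deployments bind the commitment inside a committing AEAD scheme rather than computing it separately.

# Minimal sketch of classic (single-platform) message franking.
# NOT the shared-franking protocol from the abstract above; names, key
# handling, and the metadata encoding are illustrative assumptions.
import hmac, hashlib, json, os

def commit(franking_key: bytes, message: bytes) -> bytes:
    # Sender commits to the plaintext with a fresh, one-time franking key.
    return hmac.new(franking_key, message, hashlib.sha256).digest()

def platform_tag(platform_key: bytes, commitment: bytes, metadata: dict) -> bytes:
    # The platform never sees the plaintext: it MACs only the commitment plus
    # delivery metadata, attesting "this commitment crossed our servers."
    blob = commitment + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(platform_key, blob, hashlib.sha256).digest()

def verify_report(platform_key: bytes, message: bytes, franking_key: bytes,
                  commitment: bytes, metadata: dict, tag: bytes) -> bool:
    # On report, the recipient reveals (message, franking_key); the platform
    # checks both the commitment and its own tag before acting.
    ok_commit = hmac.compare_digest(commit(franking_key, message), commitment)
    ok_tag = hmac.compare_digest(platform_tag(platform_key, commitment, metadata), tag)
    return ok_commit and ok_tag

# Toy round trip (all values illustrative).
platform_key = os.urandom(32)            # held by the platform
franking_key = os.urandom(32)            # sent to the recipient inside the E2EE ciphertext
message = b"abusive message contents"
meta = {"sender": "alice", "ts": 1700000000}

c = commit(franking_key, message)        # travels alongside the ciphertext
t = platform_tag(platform_key, c, meta)  # attached by the platform on delivery
assert verify_report(platform_key, message, franking_key, c, meta, t)

In the metadata-hiding setting the abstract targets, no single server sees both the message and its sender, so a single platform tag like the one above is not directly available; per the abstract, the paper's shared franking instead reuses the secret-shared structure of client inputs already present in such schemes to obtain a franking-style reporting path with minimal overhead.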