This content will become publicly available on May 10, 2025
- Award ID(s):
- 2120497
- PAR ID:
- 10523198
- Publisher / Repository:
- IEEE Computer Society
- Date Published:
- Journal Name:
- Proceedings of the IEEE Symposium on Security and Privacy
- ISSN:
- 2375-1207
- Format(s):
- Medium: X
- Location:
- Los Alamitos, CA, USA
- Sponsoring Org:
- National Science Foundation
More Like this
-
User reporting is an essential component of content moderation on many online platforms, particularly on end-to-end encrypted (E2EE) messaging platforms where platform operators cannot proactively inspect message contents. However, users' privacy concerns when considering reporting may impede the effectiveness of this strategy in regulating online harassment. In this paper, we conduct interviews with 16 users of E2EE platforms to understand users' mental models of how reporting works and their resultant privacy concerns and considerations surrounding reporting. We find that users expect platforms to store rich longitudinal reporting datasets, recognizing both their promise for better abuse mitigation and the privacy risk that platforms may exploit or fail to protect them. We also find that users have preconceptions about the respective capabilities and risks of moderators at the platform versus community level: for instance, users trust platform moderators more not to abuse their power, but think community moderators have more time to attend to reports. These considerations, along with the perceived effectiveness of reporting and the question of how to provide sufficient evidence while maintaining privacy, shape how users decide whether, to whom, and how much to report. We conclude with design implications for a more privacy-preserving reporting system on E2EE messaging platforms.
-
This design project arose from the goal of intervening in the current landscape of content moderation. Our team's primary focus is community moderators, specifically volunteer moderators for online community spaces. Community moderators play a key role in upholding the guidelines and culture of online community spaces, as well as managing and protecting community members against harmful content online. Yet community moderators notably lack the official resources and training that their commercial moderator counterparts have. To address this, we present ModeratorHub, a knowledge sharing platform that focuses on community moderation. In our current design stage, we focused on two features: (1) moderation case documentation and (2) moderation case sharing. These are our team's initial building blocks of a larger intervention aimed at supporting moderators and promoting social support and collaboration among end users of online community ecosystems.
-
Many online communities rely on postpublication moderation, where contributors, even those perceived as risky, are allowed to publish material immediately and where moderation takes place after the fact. An alternative arrangement involves moderating content before publication. A range of communities have argued against prepublication moderation by suggesting that it makes contributing less enjoyable for new members and that it will distract established community members with extra moderation work. We present an empirical analysis of the effects of a prepublication moderation system called FlaggedRevs that was deployed by several Wikipedia language editions. We used panel data from 17 large Wikipedia editions to test a series of hypotheses related to the effect of the system on activity levels and contribution quality. We found that the system was very effective at keeping low-quality contributions from ever becoming visible. Although there is some evidence that the system discouraged participation among users without accounts, our analysis suggests that the system's effects on contribution volume and quality were moderate at most. Our findings imply that concerns regarding major negative effects of prepublication moderation systems on contribution quality and project productivity may be overstated.
-
In this paper, we propose SecT, a secure, lightweight, and thing-centered IoT communication system based on MQTT, in which a device (thing) authenticates users. Compared with a server-centered IoT system in which a cloud server authenticates users, a thing-centered system preserves user privacy, since the cloud server is primarily a relay between things and users and does not store or see user data in plaintext. The contributions of this work are threefold. First, we explicitly identify critical functionalities in bootstrapping a thing and design secure pairing and binding strategies. Second, we design a strategy for end-to-end encrypted communication between users and things for the sake of user privacy, so that even the server cannot see the communication content in plaintext. Third, we design a strong authentication system that can defeat known device scanning, brute-force, and device spoofing attacks against IoT. We implemented a prototype of SecT on a $10 Raspberry Pi Zero W and performed extensive experiments to validate its performance. The experimental results show that SecT is both cost-effective and practical. Although we designed SecT for the smart home application, it can be easily extended to other IoT application domains.
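The abstract above does not give SecT's concrete cryptographic construction, but the end-to-end property it describes (the cloud broker relays only opaque ciphertext between user and thing, with a key established at pairing) can be sketched in a few lines. The following Python toy uses an HMAC-SHA256 counter-mode keystream with an integrity tag purely for illustration; the cipher, function names, and message contents are assumptions for this sketch, not SecT's actual design, and this construction should not be used as production cryptography.

```python
import hashlib
import hmac
import secrets

def derive_keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Expand a shared key into a keystream via HMAC-SHA256 in counter mode.
    Toy construction for illustration only, not production cryptography."""
    stream = b""
    counter = 0
    while len(stream) < length:
        block = nonce + counter.to_bytes(4, "big")
        stream += hmac.new(key, block, hashlib.sha256).digest()
        counter += 1
    return stream[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Return nonce || tag || ciphertext; only the key holder can decrypt."""
    nonce = secrets.token_bytes(16)
    ks = derive_keystream(key, nonce, len(plaintext))
    ct = bytes(p ^ k for p, k in zip(plaintext, ks))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # integrity tag
    return nonce + tag + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, tag, ct = blob[:16], blob[16:48], blob[48:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    ks = derive_keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

# Pairing: user and thing derive a shared key; the relay broker never holds it.
shared_key = secrets.token_bytes(32)

# The user publishes an encrypted command; the broker relays opaque bytes only.
blob = encrypt(shared_key, b"unlock front door")
print(decrypt(shared_key, blob))  # the thing recovers the command
```

The point of the sketch is the trust boundary: because the key is established during pairing and never shared with the broker, the relay sees (and can store) only `blob`, which carries no plaintext.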
-
Most social media platforms implement content moderation to address interpersonal harms such as harassment. Content moderation relies on offender-centered, punitive approaches, e.g., bans and content removal. We consider an alternative justice framework, restorative justice, which aids victims in healing, supports offenders in repairing the harm, and engages community members in addressing the harm collectively. To assess the utility of restorative justice in addressing online harm, we interviewed 23 users from Overwatch gaming communities, including moderators, victims, and offenders; such communities are particularly susceptible to harm, with nearly three quarters of all online game players suffering from some form of online abuse. We study how the communities currently handle harm cases through the lens of restorative justice and examine their attitudes toward implementing restorative justice processes. Our analysis reveals that cultural, technical, and resource-related obstacles hinder the implementation of restorative justice within the existing punitive framework, despite online community needs and existing structures to support it. We discuss how current content moderation systems can embed restorative justice goals and practices and overcome these challenges.