This content will become publicly available on September 27, 2026

Title: Imago Obscura: An Image Privacy AI Co-pilot to Enable Identification and Mitigation of Risks
Users often struggle to navigate the privacy/publicity boundary when sharing images online: they may lack awareness of image privacy risks and/or the ability to apply effective mitigation strategies. To address this challenge, we introduce and evaluate Imago Obscura, an AI-powered, image-editing copilot that enables users to identify and mitigate privacy risks with images they intend to share. Driven by design requirements from a formative user study with 7 image-editing experts, Imago Obscura enables users to articulate their image-sharing intent and privacy concerns. The system uses these inputs to surface contextually pertinent privacy risks, and then recommends and facilitates application of a suite of obfuscation techniques found to be effective in prior literature, e.g., inpainting, blurring, and generative content replacement. We evaluated Imago Obscura with 15 end-users in a lab study and found that it greatly improved users’ awareness of image privacy risks and their ability to address those risks, allowing them to make more informed sharing decisions.
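To make the obfuscation techniques named in the abstract concrete, the following is a minimal sketch, not drawn from the paper's implementation, of blurring a sensitive image region using the Pillow library; the file names and box coordinates are hypothetical.

```python
# Minimal sketch of one obfuscation technique (region blurring), assuming Pillow.
# Not the paper's implementation; file names and coordinates are hypothetical.
from PIL import Image, ImageFilter

def blur_region(image_path, box, radius=12):
    """Return a copy of the image with the (left, top, right, bottom) box
    replaced by a Gaussian-blurred version of that region."""
    img = Image.open(image_path).convert("RGB")
    region = img.crop(box)                        # extract the sensitive region
    blurred = region.filter(ImageFilter.GaussianBlur(radius))
    img.paste(blurred, box)                       # write the blurred crop back
    return img

if __name__ == "__main__":
    # Hypothetical use: obscure a face detected at (120, 80)-(260, 220).
    blur_region("photo.jpg", (120, 80, 260, 220)).save("photo_obscured.jpg")
```

Inpainting and generative content replacement would follow the same select-a-region pattern, but rewrite the pixels with a generative model rather than a simple filter.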
Award ID(s):
2316294
PAR ID:
10659351
Author(s) / Creator(s):
Publisher / Repository:
ACM
Date Published:
Page Range / eLocation ID:
1 to 26
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Fitness trackers are an increasingly popular tool for tracking one’s health and physical activity. While research has evaluated the potential benefits of these devices for health and well-being, few studies have empirically evaluated users’ behaviors when sharing personal fitness information (PFI) and the privacy concerns that stem from the collection, aggregation, and sharing of PFI. In this study, we present findings from a survey of Fitbit and Jawbone users (N=361) to understand how concerns about privacy in general and user-generated data in particular affect users’ mental models of PFI privacy, tracking, and sharing. Findings highlight the complex relationship between users’ demographics, sharing behaviors, privacy concerns, and internet skills and how valuable and sensitive they rate their PFI. We conclude with a discussion of opportunities to increase user awareness of privacy and PFI.
  2. How do practitioners who develop consumer AI products scope, motivate, and conduct privacy work? Respecting privacy is a key principle for developing ethical, human-centered AI systems, but we cannot hope to better support practitioners without answers to that question. We interviewed 35 industry AI practitioners to bridge that gap. We found that practitioners viewed privacy as actions taken against pre-defined intrusions that can be exacerbated by the capabilities and requirements of AI, but few were aware of AI-specific privacy intrusions documented in prior literature. We found that their privacy work was rigidly defined and situated, guided by compliance with privacy regulations and policies, and generally demotivated beyond meeting minimum requirements. Finally, we found that the methods, tools, and resources they used in their privacy work generally did not help address the unique privacy risks introduced or exacerbated by their use of AI in their products. Collectively, these findings reveal the need and opportunity to create tools, resources, and support structures to improve practitioners’ awareness of AI-specific privacy risks, motivations to do AI privacy work, and ability to address privacy harms introduced or exacerbated by their use of AI in consumer products.
  3. Users’ perceptions of fitness tracking privacy are a subject of active study, but how do various aspects of social identity inform these perceptions? We conducted an online survey (N=322) that explores the influence of identity on fitness tracking privacy perceptions and practices, considering participants’ gender, race, age, and whether or not they identify as LGBTQ*. Participants reported how comfortable they felt sharing fitness data, commented on whether they believed their identity impacted this comfort, and brainstormed several data sharing risks and a possible mitigation for each risk. For each surveyed dimension of social identity, we find one or more reliable effects on participants’ level of comfort sharing fitness data, specifically when considering institutional groups like employers, insurers, and advertisers. Further, 64% of participants indicate at least one of their identity characteristics informs their comfort. We also find evidence that the perceived risks of sharing fitness data vary by identity, but do not find evidence of difference in the strategies used to manage these risks. This work highlights a path towards reasoning about the privacy challenges of fitness tracking with respect for the lived experiences of all users.
  4. Social media applications have benefited users in several ways, including ease of communication and quick access to information. However, they have also introduced several privacy and safety risks. These risks are particularly concerning in the context of interpersonal attacks, which are carried out by abusive friends, family members, intimate partners, co-workers, or even strangers. Evidence shows interpersonal attackers regularly exploit social media platforms to harass and spy on their targets. To help protect targets from such attacks, social media platforms have introduced several privacy and safety features. However, it is unclear how effective they are against interpersonal threats. In this work, we analyzed ten popular social media applications, identifying 100 unique privacy and safety features that provide controls across eight categories: discoverability, visibility, saving and sharing, interaction, self-censorship, content moderation, transparency, and reporting. We simulated 59 different attack actions by a persistent attacker, aimed at account discovery, information gathering, non-consensual sharing, and harassment, and found many were successful. Based on our findings, we proposed improvements to mitigate these risks.