Users face various privacy risks in smart homes, yet they have few ways to learn the details of those risks, such as the data practices of smart home devices and where their data flow. In this paper, we present Privacy Plumber, a system that enables a user to inspect and explore the privacy "leaks" in their home using an augmented reality tool. Because the leaks are visualized in augmented reality, Privacy Plumber lets the user see the volume of data leaving the home, and how that data may affect their privacy, in the same physical context as the devices in question. Privacy Plumber uses ARP spoofing to gather aggregate network traffic information and presents it through an overlay on top of the device in a smartphone app. The increased transparency aims to help the user make privacy decisions and mend potential privacy leaks, for example by instructing Privacy Plumber which devices to block and on what schedule (e.g., turning off Alexa while sleeping). Our initial user study with six participants shows increased awareness of privacy leaks in smart devices, which in turn informs participants' privacy decisions (e.g., which devices to block).
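The abstract names the technique (ARP spoofing plus per-device traffic aggregation) but not its implementation. As a rough, hedged illustration of that general approach, the Python sketch below (using scapy) poisons a device's ARP cache so its traffic transits the inspecting machine, then counts outbound bytes per device. The gateway and device addresses are hypothetical placeholders; this is a minimal sketch of the technique, not Privacy Plumber's actual code.

```python
# Minimal sketch of ARP-spoofing-based traffic metering (not Privacy Plumber's code).
# Assumes: run as root on the same LAN, scapy installed, IP forwarding enabled on the host.
# The gateway/device addresses below are hypothetical placeholders.
from collections import defaultdict
from scapy.all import ARP, IP, send, sniff

GATEWAY_IP = "192.168.1.1"      # hypothetical router address
DEVICE_IP  = "192.168.1.42"     # hypothetical smart-home device address

bytes_seen = defaultdict(int)   # per-device byte counters (what an AR overlay could display)

def poison_once():
    """Tell the device we are the gateway, and the gateway we are the device."""
    send(ARP(op=2, pdst=DEVICE_IP, psrc=GATEWAY_IP), verbose=False)
    send(ARP(op=2, pdst=GATEWAY_IP, psrc=DEVICE_IP), verbose=False)

def count(pkt):
    """Aggregate outbound traffic volume for the inspected device."""
    if IP in pkt and pkt[IP].src == DEVICE_IP:
        bytes_seen[DEVICE_IP] += len(pkt)

if __name__ == "__main__":
    poison_once()                         # a real tool would re-send this periodically
    sniff(filter="ip", prn=count, timeout=30)
    print(f"{DEVICE_IP} sent {bytes_seen[DEVICE_IP]} bytes in 30 s")
```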
DEVELOPING A PRIVACY-AWARE MAP-BASED CROSS-PLATFORM SOCIAL MEDIA DASHBOARD FOR MUNICIPAL DECISION-MAKING
Abstract. Social media (SM) has become a principal information source, and the vast amounts of generated data are increasingly used to inform various disciplines. Most platforms, such as Instagram, Twitter, or Flickr, offer the option to tag a post with a location or a precise coordinate, which also fosters geospatial applications of these data. Notwithstanding the many ways in which these data could be analyzed and applied, scandals such as Cambridge Analytica have also shown the risks to user privacy that seem inherently part of the data. Is it possible to mitigate these risks while maintaining the collective usability of these data for societal questions? We identify urban planning as a key field for socio-spatial justice and propose an open-source, map-based, cross-platform dashboard, fueled by geospatial SM data, as a supporting tool for municipal decision-makers and citizens alike. As a core part of this tool, we implement a novel privacy-aware data structure that allows for both a more transparent, encompassing data basis for municipalities and a reduced data-collection footprint, preventing misuse of the data or compromise of user privacy.
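The abstract does not describe the privacy-aware data structure itself. As one hedged illustration of how a "reduced data collection footprint" can be achieved for a map-based dashboard, the sketch below aggregates geotagged posts into coarse grid cells and publishes only thresholded distinct-user counts, discarding raw coordinates and identifiers. The grid size, threshold, and field names are assumptions made for the example, not the authors' design.

```python
# Illustrative sketch only: store aggregate counts per grid cell instead of raw,
# user-attributable posts. NOT the authors' data structure; the 100 m grid and
# k = 5 threshold are assumptions for the example.
from collections import defaultdict

GRID_DEG = 0.001   # roughly 100 m grid cells (assumption)
K_MIN    = 5       # suppress cells with fewer than k distinct users (assumption)

def cell(lat: float, lon: float) -> tuple:
    """Snap a coordinate to its grid cell."""
    return (round(lat / GRID_DEG), round(lon / GRID_DEG))

def aggregate(posts):
    """posts: iterable of (user_id, lat, lon). Returns cell -> distinct-user count,
    keeping no raw coordinates or user identifiers in the published result."""
    users_per_cell = defaultdict(set)
    for user_id, lat, lon in posts:
        users_per_cell[cell(lat, lon)].add(user_id)
    return {c: len(u) for c, u in users_per_cell.items() if len(u) >= K_MIN}

# Example: only cells visited by at least K_MIN distinct users appear on the map.
demo = [("u%d" % i, 51.05 + i * 1e-5, 13.74) for i in range(7)] + [("u99", 51.10, 13.80)]
print(aggregate(demo))
```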
- Award ID(s): 1737581
- PAR ID: 10355129
- Date Published:
- Journal Name: The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences
- Volume: XLVIII-4/W1-2022
- ISSN: 2194-9034
- Page Range / eLocation ID: 545 to 552
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
The integration of connected autonomous vehicles (CAVs) has significantly enhanced driving convenience, but it has also raised serious privacy concerns, particularly regarding the personally identifiable information (PII) stored on infotainment systems. Recent advances in connected and autonomous vehicle control, such as multi-agent system (MAS)-based hierarchical architectures and privacy-preserving strategies for mixed-autonomy platoon control, underscore the increasing complexity of privacy management within these environments. Rental cars with infotainment systems pose substantial challenges, as renters often fail to delete their data, leaving it accessible to subsequent renters. This study investigates the risks associated with PII in connected vehicles and emphasizes the necessity of automated solutions to ensure data privacy. We introduce the Vehicle Inactive Profile Remover (VIPR), an automated solution designed to identify and delete PII left on infotainment systems. The efficacy of VIPR is evaluated through surveys, hands-on experiments with rental vehicles, and a controlled laboratory environment. VIPR achieved a 99.5% success rate in removing user profiles, with an average deletion time of 4.8 s or less, demonstrating its effectiveness in mitigating privacy risks. These results highlight VIPR as a critical tool for enhancing privacy in connected vehicle environments and promoting a safer, more responsible use of connected vehicle technology.
-
Mobile and web apps increasingly rely on data generated or provided by users, such as uploaded documents and images. Unfortunately, those apps may raise significant user privacy concerns. Specifically, to train or adapt their models for accurately processing the huge amounts of data continuously collected from millions of app users, app or service providers have widely adopted crowdsourcing, recruiting crowd workers to manually annotate or transcribe samples of the ever-changing user data. However, when users' data are uploaded through apps and then become widely accessible to hundreds of thousands of anonymous crowd workers, many human-in-the-loop privacy questions arise concerning both the app user community and the crowd worker community. In this paper, we investigate the privacy risks brought by this significant trend of large-scale crowd-powered processing of app users' data generated in their daily activities. We consider the representative case of receipt scanning apps that have millions of users, and focus on the corresponding receipt transcription tasks that appear frequently on crowdsourcing platforms. We design and conduct an app user survey study (n=108) to explore how app users perceive privacy in the context of using receipt scanning apps. We also design and conduct a crowd worker survey study (n=102) to explore crowd workers' experiences with receipt and other types of transcription tasks, as well as their attitudes towards such tasks. Overall, we found that most app users and crowd workers expressed strong concerns about the potential privacy risks to receipt owners, and they also had a very high level of agreement with the need to protect receipt owners' privacy. Our work provides insights into app users' potential privacy risks in crowdsourcing and highlights the need and challenges for protecting third-party users' privacy on crowdsourcing platforms. We have responsibly disclosed our findings to the related crowdsourcing platform and app providers.
-
Data privacy, a critical human right, is gaining importance as new technologies are developed and old ones evolve. In mobile platforms such as Android, data privacy regulations require developers to communicate data access requests using privacy policy statements (PPS). This case study cross-examines the PPS of two popular social media (SM) apps, Facebook and Twitter, for language ambiguity, sensitive data requests, and whether the statements tally with the data requests made in the Manifest file. Subsequently, we conduct a comparative analysis of the PPS of these two apps to examine trends that may constitute a threat to user data privacy. (An illustrative sketch of checking Manifest permissions against policy text appears after this list.)
-
Mobile applications (apps) provide users with valuable benefits at the risk of exposing them to privacy harms. Improving privacy in mobile apps faces several challenges; in particular, many apps are developed by low-resourced software development teams, such as end-user programmers or startups. In addition, privacy risks are primarily known to users, which can make it difficult for developers to prioritize privacy for sensitive data. In this paper, we introduce a novel, lightweight method that allows app developers to elicit scenarios and privacy risk scores directly from users using only an app screenshot. The technique relies on named entity recognition (NER) to identify information types in user-authored scenarios, which are then fed in real time to a privacy risk survey that users complete. The best-performing NER model predicts information types with a weighted average precision of 0.70 and recall of 0.72, after post-processing to remove false positives. The model was trained on a labeled 300-scenario corpus and evaluated in an end-to-end evaluation using an additional 203 scenarios, yielding 2,338 user-provided privacy risk scores. Finally, we discuss how developers can use the risk scores to prioritize, select, and apply privacy design strategies in the context of four user-authored scenarios. (An illustrative NER sketch appears after this list.)
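Relating to the Facebook/Twitter privacy policy case study above: that abstract describes checking whether policy statements tally with the data requests in an app's Manifest file, without giving tooling details. The following hedged Python sketch shows one way such a comparison could look, listing `uses-permission` entries from a decoded AndroidManifest.xml and flagging those never hinted at in the policy text. The file paths and the permission-to-phrase map are assumptions for illustration, not the study's actual method.

```python
# Illustrative sketch (not the case study's tooling): compare permissions declared in a
# decoded AndroidManifest.xml with the wording of the app's privacy policy statements.
# File paths and the keyword map are assumptions made for this example.
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"

# Rough mapping from permissions to phrases one would expect in a policy (assumed).
EXPECTED_PHRASES = {
    "android.permission.ACCESS_FINE_LOCATION": ["location"],
    "android.permission.READ_CONTACTS": ["contact"],
    "android.permission.CAMERA": ["camera", "photo"],
    "android.permission.RECORD_AUDIO": ["microphone", "audio"],
}

def declared_permissions(manifest_path: str) -> set:
    """Extract <uses-permission android:name="..."> entries from a decoded manifest."""
    root = ET.parse(manifest_path).getroot()
    return {e.get(f"{ANDROID_NS}name") for e in root.iter("uses-permission")}

def undisclosed(manifest_path: str, policy_path: str) -> list:
    """Permissions requested in the manifest but never hinted at in the policy text."""
    policy = open(policy_path, encoding="utf-8").read().lower()
    flagged = []
    for perm in declared_permissions(manifest_path) & EXPECTED_PHRASES.keys():
        if not any(phrase in policy for phrase in EXPECTED_PHRASES[perm]):
            flagged.append(perm)
    return flagged

if __name__ == "__main__":
    # Hypothetical paths to a decoded manifest and the app's privacy policy text.
    print(undisclosed("AndroidManifest.xml", "privacy_policy.txt"))
```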
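Relating to the screenshot-based scenario elicitation abstract above: the paper's pipeline feeds NER-detected information types into a privacy-risk survey, but the custom model itself is not reproduced here. The sketch below uses spaCy's pretrained English pipeline as a stand-in for the authors' custom-trained NER model, with an assumed mapping from generic entity labels to coarse information types; it illustrates the shape of the step, not the paper's model or label set.

```python
# Illustrative sketch (not the paper's model): run an off-the-shelf NER pipeline over a
# user-authored scenario and map entity labels to coarse information types that could
# seed a privacy-risk survey. spaCy's pretrained model and the label map are stand-ins.
import spacy

# Assumed mapping from generic spaCy entity labels to information types.
INFO_TYPES = {
    "PERSON": "contact information",
    "GPE": "location",
    "LOC": "location",
    "DATE": "temporal information",
    "MONEY": "financial information",
    "ORG": "affiliation",
}

def information_types(scenario: str) -> set:
    """Return the set of information types detected in a scenario description."""
    nlp = spacy.load("en_core_web_sm")   # pretrained English pipeline (stand-in model)
    doc = nlp(scenario)
    return {INFO_TYPES[ent.label_] for ent in doc.ents if ent.label_ in INFO_TYPES}

if __name__ == "__main__":
    scenario = ("On March 3rd I used the app to send $25 to Alice "
                "while I was at the airport in Chicago.")
    # The detected types would then be fed into the privacy-risk survey in real time.
    print(information_types(scenario))
```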