Privacy scholarship has shown that norms of appropriate information flow, and the processes that regulate information, vary by environment and change as that environment changes, including through the introduction of new technologies. This paper describes findings from a qualitative research study that examines practices and perceptions of privacy in Cambodia as the population rapidly moves into an online environment (specifically Facebook, the most popular Internet tool in Cambodia today). We empirically demonstrate how the concept of privacy differs across cultures and show how the Facebook platform, as it becomes popular worldwide, catalyzes change in norms of information regulation. We discuss how the localization of transnational technology platforms provides a key site in which to investigate changing cultural ideas about privacy and to discover misalignments between different expectations for information flow. Finally, we explore ways that insufficient localization effort by transnational technology companies puts some of the most marginalized users at disproportionate risk of information disclosure when using new Internet tools, and we offer pragmatic suggestions for how such companies could improve privacy tools for users who are far, geographically or culturally, from where the tools are designed.
How Language Formality in Security and Privacy Interfaces Impacts Intended Compliance
Strong end-user security practices benefit both the user and hosting platform, but it is not well understood how companies communicate with their users to encourage these practices. This paper explores whether web companies and their platforms use different levels of language formality in these communications and tests the hypothesis that higher language formality leads to users’ increased intention to comply. We contribute a dataset and systematic analysis of 1,817 English language strings in web security and privacy interfaces across 13 web platforms, showing strong variations in language. An online study with 512 participants further demonstrated that people perceive differences in the language formality across platforms and that a higher language formality is associated with higher self-reported intention to comply. Our findings suggest that formality can be an important factor in designing effective security and privacy prompts. We discuss implications of these results, including how to balance formality with platform language style. In addition to being the first piece of work to analyze language formality in user security, these findings provide valuable insights into how platforms can best communicate with users about account security.
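To make "language formality" concrete, the sketch below scores candidate prompt strings with the Heylighen-Dewaele F-score, a standard part-of-speech-based formality heuristic. The abstract does not say which measure the authors used, so the metric, the NLTK tooling, and the example prompts are illustrative assumptions rather than the paper's method.

```python
# Illustrative only: scores strings with the Heylighen-Dewaele F-score,
# a part-of-speech-based formality heuristic. Requires `pip install nltk`
# plus the tokenizer and tagger models, e.g. nltk.download("punkt") and
# nltk.download("averaged_perceptron_tagger").
import nltk

FORMAL_TAGS = ("NN", "JJ", "IN", "DT")     # nouns, adjectives, prepositions, articles
INFORMAL_TAGS = ("PRP", "VB", "RB", "UH")  # pronouns, verbs, adverbs, interjections

def f_score(text: str) -> float:
    """Return a formality score in [0, 100]; higher values read as more formal."""
    tags = [tag for _, tag in nltk.pos_tag(nltk.word_tokenize(text))]
    if not tags:
        return 50.0
    formal = sum(t.startswith(FORMAL_TAGS) for t in tags)
    informal = sum(t.startswith(INFORMAL_TAGS) for t in tags)
    return ((formal - informal) / len(tags) * 100 + 100) / 2

# Hypothetical security-prompt strings, not taken from the studied platforms.
prompts = [
    "Hey! Looks like you're signing in from a new device.",
    "An unrecognized device attempted to access your account. Please verify this activity.",
]
for prompt in prompts:
    print(f"{f_score(prompt):5.1f}  {prompt}")
```

Under this heuristic, noun- and preposition-heavy, context-independent phrasing scores higher than pronoun- and verb-heavy conversational phrasing, matching the intuition that the second prompt reads as more formal than the first.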
- Award ID(s): 2006104
- PAR ID: 10423708
- Date Published:
- Journal Name: Human factors in computing systems
- ISSN: 1062-9432
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- In recent years, gig work platforms have gained popularity as a way for individuals to earn money; as of 2021, 16% of Americans have at some point earned money from such platforms. Despite their popularity, and despite a history of unfair data collection practices and worker-safety concerns, little is known about the data these platforms collect from workers (and users) or about the privacy dark pattern designs present in their apps. This paper presents an empirical measurement of 16 gig work platforms' data practices in the U.S. We analyze what data is collected by these platforms and how it is shared and used. Finally, we consider how these practices constitute privacy dark patterns. To that end, we develop a novel combination of methods to address gig-worker-specific challenges in experimentation and data collection, enabling the largest in-depth study of such platforms to date. We find extensive data collection and sharing with 60 third parties, including sharing reversible hashes of worker Social Security Numbers (SSNs), along with dark patterns that subject workers to greater privacy risk and opportunistically use collected data to nag workers in off-platform messages. We conclude this paper with proposed interdisciplinary mitigations for improving gig worker privacy protections. After we disclosed our SSN-related findings to the affected platforms, the platforms confirmed that the issue had been mitigated. This is consistent with our independent audit of the affected platforms. Analysis code and redacted datasets will be made available to those who wish to reproduce our findings.
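The "reversible hashes" finding has a simple explanation: an SSN has at most 10^9 possible values, so an unsalted digest can be inverted by exhaustive search. The sketch below assumes plain SHA-256 purely for illustration; the abstract does not specify the hashing scheme the platforms actually used.

```python
# Illustrative only: shows why a plain (unsalted) hash of a Social Security
# Number is effectively reversible. SHA-256 is assumed here for the example;
# the platforms' actual hashing scheme is not described in the abstract.
import hashlib

def hash_ssn(ssn: str) -> str:
    return hashlib.sha256(ssn.encode()).hexdigest()

def invert(target_digest: str) -> str | None:
    """Exhaust the 9-digit SSN space (at most 10^9 candidates) until the digest matches."""
    for candidate in range(1_000_000_000):
        ssn = f"{candidate:09d}"
        if hash_ssn(ssn) == target_digest:
            return ssn
    return None

if __name__ == "__main__":
    # 078-05-1120 is the well-known "Woolworth wallet card" SSN, used as a stand-in.
    leaked_digest = hash_ssn("078051120")
    # Slow in pure Python (minutes to hours); trivial with optimized or GPU-based hashing.
    print(invert(leaked_digest))
```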
- Mobile and web apps increasingly rely on data generated or provided by users, such as their uploaded documents and images. Unfortunately, those apps may raise significant user privacy concerns. Specifically, to train or adapt their models to accurately process the huge amounts of data continuously collected from millions of app users, app and service providers have widely adopted crowdsourcing, recruiting crowd workers to manually annotate or transcribe samples of ever-changing user data. However, when users' data are uploaded through apps and then become widely accessible to hundreds of thousands of anonymous crowd workers, many human-in-the-loop privacy questions arise concerning both the app user community and the crowd worker community. In this paper, we investigate the privacy risks brought by this significant trend of large-scale crowd-powered processing of app users' data generated in their daily activities. We consider the representative case of receipt scanning apps that have millions of users, and we focus on the corresponding receipt transcription tasks that appear frequently on crowdsourcing platforms. We design and conduct an app user survey study (n=108) to explore how app users perceive privacy in the context of using receipt scanning apps. We also design and conduct a crowd worker survey study (n=102) to explore crowd workers' experiences with receipt and other types of transcription tasks, as well as their attitudes towards such tasks. Overall, we found that most app users and crowd workers expressed strong concerns about the potential privacy risks to receipt owners, and they also had a very high level of agreement with the need to protect receipt owners' privacy. Our work provides insights into app users' potential privacy risks in crowdsourcing and highlights the need, and the challenges, of protecting third-party users' privacy on crowdsourcing platforms. We have responsibly disclosed our findings to the related crowdsourcing platform and app providers.
- The cross-platform application-development paradigm alleviates a major challenge of native application development, namely the need to re-implement the codebase for each target platform, and streamlines the deployment of applications to different platforms. Essentially, cross-platform application development relies on migrating web application code and repackaging it as a native application. In other words, code that was designed and developed to execute within the confines of a browser, with all the security checks and safeguards that this entails, is now deployed within a completely different execution environment. In this paper, we explore the inherent security and privacy risks that arise from this migration due to the fundamental differences between these two execution environments, which we refer to as security lacunae. To that end, we establish a differential analysis workflow and develop a set of customized tests designed to uncover divergent behaviors of web code executed within a browser and as an Electron cross-platform application. Guided by the findings from our empirical exploration, we retrofit part of the Web Platform Tests (WPT) suite to apply to the Electron framework, and we systematically assess mechanisms that relate to isolation and access control, as well as critical security policies and headers. Our research uncovers semantic gaps between the two execution environments that affect the enforcement of critical security mechanisms, exposing users to severe risks. This can lead to privacy issues, such as the exposure of sensitive data over unencrypted connections or unregulated third-party access to the local filesystem, and security issues, such as the incorrect enforcement of CSP script-execution directives. We demonstrate that directly migrating web application code to a cross-platform application, without refactoring the code and implementing additional safeguards to address the conceptual and behavioral mismatches between the two execution environments, can significantly affect the application's security and privacy posture.
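As a rough illustration of the differential-analysis idea, the sketch below compares the outcomes of the same security tests recorded once in a browser and once in an Electron build, and reports every divergence. The file layout, test names, and outcome labels are hypothetical; the paper's actual harness and retrofitted WPT tests are not described in the abstract.

```python
# Illustrative only: the comparison step of a differential-analysis workflow.
# Assumes each run wrote its test outcomes to a JSON file mapping test name
# to an observed result such as "blocked" or "allowed".
import json

def load_outcomes(path: str) -> dict[str, str]:
    with open(path) as f:
        return json.load(f)

def divergences(browser: dict[str, str], electron: dict[str, str]) -> list[tuple[str, str, str]]:
    """Tests run in both environments whose observed outcomes differ."""
    shared = browser.keys() & electron.keys()
    return [(t, browser[t], electron[t]) for t in sorted(shared) if browser[t] != electron[t]]

if __name__ == "__main__":
    browser_run = load_outcomes("browser_results.json")    # hypothetical capture
    electron_run = load_outcomes("electron_results.json")  # hypothetical capture
    for test, in_browser, in_electron in divergences(browser_run, electron_run):
        # e.g. "csp_blocks_inline_script: browser=blocked, electron=allowed"
        print(f"{test}: browser={in_browser}, electron={in_electron}")
```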
- Children's and adolescents' online data privacy is regulated by laws such as the Children's Online Privacy Protection Act (COPPA) and the California Consumer Privacy Act (CCPA). Online services that are directed towards general audiences (i.e., including children, adolescents, and adults) must comply with these laws. In this paper, we first present DiffAudit, a platform-agnostic privacy auditing methodology for general audience services. DiffAudit performs differential analysis of network traffic data flows to compare data processing practices (i) between child, adolescent, and adult users and (ii) before and after consent is given and user age is disclosed. We also present a data type classification method that utilizes GPT-4 and our data type ontology based on COPPA and CCPA, allowing us to identify considerably more data types than prior work. Second, we apply DiffAudit to a set of popular general audience mobile and web services and observe a rich set of behaviors across over 440K outgoing requests, from which we extracted and classified 3,968 unique data types. We reveal problematic data processing practices prior to consent and age disclosure, a lack of differentiation between age-specific data flows, inconsistent privacy policy disclosures, and sharing of linkable data with third parties, including advertising and tracking services.
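A minimal sketch of the differential comparison such an audit relies on: group observed (destination, data type) pairs by user profile and consent phase, then flag pre-consent transmissions and identical child/adult flows. The flow records and field values are invented for illustration and are not DiffAudit's actual pipeline or data.

```python
# Illustrative only: differential comparison of observed data flows across
# age profiles and consent phases. All records below are made-up examples.
from collections import defaultdict

# Each observed flow: (profile, phase, destination domain, classified data type)
flows = [
    ("child", "pre-consent",  "ads.tracker.example", "advertising identifier"),
    ("child", "post-consent", "ads.tracker.example", "advertising identifier"),
    ("adult", "post-consent", "ads.tracker.example", "advertising identifier"),
    ("child", "post-consent", "api.service.example", "email address"),
]

by_condition: dict[tuple[str, str], set] = defaultdict(set)
for profile, phase, destination, data_type in flows:
    by_condition[(profile, phase)].add((destination, data_type))

# Anything sent before consent or age disclosure is immediately suspect.
print("Pre-consent flows (child profile):", by_condition[("child", "pre-consent")])

# Lack of age differentiation: flows identical for child and adult profiles.
overlap = by_condition[("child", "post-consent")] & by_condition[("adult", "post-consent")]
print("Identical child/adult flows:", overlap)
```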