Title: Toward User Control over Information Access: A Sociotechnical Approach
We study the relationship between Web users and service providers, taking a sociotechnical approach and focusing particularly (but not exclusively) on privacy and security of personal data. Much conventional Web-security practice seeks to protect benevolent parties, both individuals and organizations, against purely malevolent adversaries in an effort to prevent catastrophic events such as data breaches, ransomware attacks, and denial of service. By contrast, we highlight the dynamics among the parties that much conventional security technology seeks to protect. We regard most interactions between users and providers as implicit negotiations that, like the interactions between buyers and sellers in a marketplace, have both adversarial and cooperative aspects. Our goal is to rebalance these negotiations in order to give more power to users; toward that end we advocate the adoption of two techniques, one technical and one organizational. Technically, we introduce the Platform for Untrusted Resource Evaluation (PURE), a content-labeling framework that empowers users to make informed decisions about service providers, reduces the ability of providers to induce behaviors that benefit them more than users, and requires minimal time and effort to use. On the organizational side, we concur with Gordon-Tapiero et al. [19] that a collective approach is necessary to rebalance the power dynamics between users and providers; in particular, we suggest that the data co-op, an organizational form suggested by Ligett and Nissim [25] and Pentland and Hardjono [28], is a natural setting in which to deploy PURE and similar tools.
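The abstract does not describe PURE's label format, so the following is only an illustrative sketch of what a content label attached to a provider resource could look like, together with a client-side check that decides when to warn the user; every type, field, and threshold here is an assumption, not PURE's actual design.

```rust
/// Hypothetical label attached to a provider resource (e.g., a signup page);
/// the field set is an assumption, not PURE's actual schema.
#[derive(Debug, Clone)]
struct ResourceLabel {
    resource_url: String,
    evaluator: String,           // who produced the label, e.g., a data co-op
    data_requested: Vec<String>, // categories of personal data the resource asks for
    nudge_flags: Vec<String>,    // provider-favoring behaviors flagged by the evaluator
    summary: String,             // short plain-language verdict shown to the user
}

/// Client-side policy: surface a warning if the evaluator flagged any nudges
/// or the resource asks for many categories of personal data.
fn needs_warning(label: &ResourceLabel) -> bool {
    !label.nudge_flags.is_empty() || label.data_requested.len() > 3
}

fn main() {
    let label = ResourceLabel {
        resource_url: "https://provider.example/signup".to_string(),
        evaluator: "example data co-op".to_string(),
        data_requested: vec!["email".to_string(), "location".to_string()],
        nudge_flags: vec!["pre-checked data-sharing consent".to_string()],
        summary: "Requests more data than the service plausibly needs".to_string(),
    };
    if needs_warning(&label) {
        println!("[{}] {}: {}", label.evaluator, label.resource_url, label.summary);
    }
}
```

In a real deployment such labels would presumably be produced, signed, and distributed by evaluators (for instance, a data co-op) rather than constructed locally as in this toy example.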
Award ID(s):
2131541 2131356
PAR ID:
10471383
Author(s) / Creator(s):
Publisher / Repository:
ACM
Date Published:
Journal Name:
Proceedings of the 2022 New Security Paradigms Workshop
ISBN:
9781450398664
Page Range / eLocation ID:
117 to 129
Format(s):
Medium: X
Location:
North Conway NH USA
Sponsoring Org:
National Science Foundation
More Like this
  1. Joe Calandrino and Carmela Troncoso (Ed.)
    As service providers move to the cloud, users are forced to provision sensitive data to the cloud. Confidential computing leverages hardware Trusted Execution Environments (TEEs) to protect data in use, removing the need for users to trust the cloud. The emerging service model, Confidential Computing as a Service (CCaaS), is adopted by service providers to offer services in a manner similar to Function-as-a-Service. However, privacy concerns arise in CCaaS, especially in multi-user scenarios: CCaaS must assure data providers that the service does not leak their private data to any unauthorized parties and clears their data after the service. To address these privacy concerns with security guarantees, we first formally define the security objective, Proof of Being Forgotten (PoBF), and prove under which security constraints PoBF can be satisfied. These constraints then serve as guidelines for the implementation of the PoBF-compliant Framework (PoCF). PoCF consists of a generic library for different hardware TEEs, CCaaS prototype enclaves, and a verifier to prove PoBF-compliance. PoCF leverages Rust's robust type system and security features to construct a verified state machine with privacy-preserving contracts. Finally, the experimental results show that the protections introduced by PoCF incur only minor runtime performance overhead.
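PoCF's verified state machine is far richer than this, but a minimal typestate sketch in Rust (the language the paper uses) can illustrate the underlying idea: encode the enclave's lifecycle in types so that the compiler rejects any code path that releases a result before the user's data has been cleared. All names below are invented for illustration and are not taken from PoCF.

```rust
// A deliberately tiny typestate sketch: `Forgotten` can only be obtained by
// calling `forget()`, so code that tries to release a result without first
// clearing the user's data fails to compile. Names are illustrative, not PoCF's.

struct UserData(Vec<u8>);

/// Enclave state while user data is provisioned; results cannot leave yet.
struct Serving {
    data: UserData,
    result: Vec<u8>,
}

/// Enclave state after the user's data has been cleared.
struct Forgotten {
    result: Vec<u8>,
}

impl Serving {
    fn new(data: UserData) -> Self {
        Serving { data, result: Vec::new() }
    }

    /// Placeholder for the confidential computation over the provisioned data.
    fn compute(mut self) -> Self {
        self.result = self.data.0.iter().map(|b| b.wrapping_add(1)).collect();
        self
    }

    /// Consumes the serving state, clears the data, and transitions to
    /// `Forgotten`; this is the only way to construct a `Forgotten` value.
    fn forget(mut self) -> Forgotten {
        for b in self.data.0.iter_mut() {
            *b = 0; // real code would use a dedicated zeroization crate
        }
        Forgotten { result: self.result }
    }
}

impl Forgotten {
    /// Only a `Forgotten` enclave may release output to the caller.
    fn release(self) -> Vec<u8> {
        self.result
    }
}

fn main() {
    let output = Serving::new(UserData(vec![1, 2, 3])).compute().forget().release();
    println!("{:?}", output); // [2, 3, 4]
    // Serving::new(UserData(vec![1])).compute().release(); // would not compile
}
```

Real zeroization needs more care (plain writes can be optimized away) and the computation would run inside a hardware TEE; the point of the sketch is only that the "forgotten" transition is unavoidable by construction.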
  2. Privacy technologies support the provision of online services while protecting user privacy. Cryptography lies at the heart of many such technologies, creating remarkable possibilities in terms of functionality while offering robust guarantees of data confidentiality. The cryptography literature and discourse often represent that these technologies eliminate the need to trust service providers, i.e., they enable users to protect their privacy even against untrusted service providers. Despite their apparent promise, privacy technologies have seen limited adoption in practice, and the most successful ones have been implemented by the very service providers these technologies purportedly protect users from. The adoption of privacy technologies by supposedly adversarial service providers highlights a mismatch between traditional models of trust in cryptography and the trust relationships that underlie deployed technologies in practice. Yet this mismatch, while well known to the cryptography and privacy communities, remains relatively poorly documented and examined in the academic literature, let alone broader media. This paper aims to fill that gap. Firstly, we review how the deployment of cryptographic technologies relies on a chain of trust relationships embedded in the modern computing ecosystem, from the development of software to the provision of online services, that is not fully captured by traditional models of trust in cryptography. Secondly, we turn to two case studies, web search and encrypted messaging, to illustrate how, rather than removing trust in service providers, cryptographic privacy technologies shift trust to a broader community of security and privacy experts and others, which in turn enables service providers to implicitly build and reinforce their trust relationship with users. Finally, concluding that the trust models inherent in the traditional cryptographic paradigm elide certain key trust relationships underlying deployed cryptographic systems, we highlight the need for organizational, policy, and legal safeguards to address that mismatch, and suggest some directions for future work.
  3. Anonymity can enable both healthy online interactions like support-seeking and toxic behaviors like hate speech. How do online service providers balance these threats and opportunities? This two-part qualitative study examines the challenges perceived by open collaboration service providers in allowing anonymous contributions to their projects. We interviewed eleven people familiar with organizational decisions related to privacy and security at five open collaboration projects and followed up with an analysis of public discussions about anonymous contribution to Wikipedia. We contrast our findings with prior work on threats perceived by project volunteers and explore misalignment between policies aiming to serve contributors and the privacy practices of contributors themselves. 
  4. Network security devices intercept, analyze, and act on the traffic moving through the network to enforce security policies. They can have an adverse impact on the performance, functionality, and privacy provided by the network. To address this issue, we propose a new approach to network security based on the concept of short-term, on-demand security exceptions. The basic idea is to bring network providers and (trusted) users together by (1) implementing coarse-grained security policies in the traditional way, using conventional in-band security approaches, and (2) handling special cases (policy exceptions) in the control plane using user- or application-supplied information. By divulging their intent to network providers, trusted users can receive better service. By allowing security exceptions, network providers can focus inspections on general (untrusted) traffic. We describe the design of an on-demand security exception mechanism and demonstrate its utility using a prototype implementation that enables high-speed big-data transfers across campus networks. Our experiments show that the security exception mechanism can significantly improve the throughput of flows from trusted users.
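As a hedged illustration only (this abstract does not describe the prototype's internals), a control-plane component managing short-term exceptions might perform a lookup like the following before letting a flow bypass in-band inspection; the record layout, prefix matching, and expiry handling are all assumptions.

```rust
// Illustrative control-plane lookup for short-term security exceptions; not
// taken from the paper's implementation.

use std::time::{Duration, Instant};

/// A short-term exception granted to a trusted user for a specific kind of flow.
struct Exception {
    user: String,
    src_prefix: String, // e.g., "10.1.2."
    dst_prefix: String, // e.g., "192.168.5."
    dst_port: u16,      // e.g., the port used by a big-data transfer tool
    expires_at: Instant,
}

/// Decide whether a new flow may bypass in-band inspection.
fn bypass_inspection(exceptions: &[Exception], src: &str, dst: &str, port: u16) -> bool {
    let now = Instant::now();
    exceptions.iter().any(|e| {
        now < e.expires_at
            && src.starts_with(e.src_prefix.as_str())
            && dst.starts_with(e.dst_prefix.as_str())
            && port == e.dst_port
    })
}

fn main() {
    let granted = vec![Exception {
        user: "trusted-researcher".to_string(),
        src_prefix: "10.1.2.".to_string(),
        dst_prefix: "192.168.5.".to_string(),
        dst_port: 2811,
        expires_at: Instant::now() + Duration::from_secs(3600), // one-hour exception
    }];
    let allowed = bypass_inspection(&granted, "10.1.2.7", "192.168.5.20", 2811);
    println!("exception for {} active: {}", granted[0].user, allowed);
}
```

A production mechanism would also authenticate the requesting user and install the granted exception in the data plane (for example, as a flow rule) rather than checking it in software per flow.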
  5. Recent data protection regulations (notably, GDPR and CCPA) grant consumers various rights, including the right to access, modify or delete any personal information collected about them (and retained) by a service provider. To exercise these rights, one must submit a verifiable consumer request proving that the collected data indeed pertains to them. This action is straightforward for consumers with active accounts with a service provider at the time of data collection, since they can use standard (e.g., password-based) means of authentication to validate their requests. However, a major conundrum arises from the need to support consumers without accounts to exercise their rights. To this end, some service providers began requiring such accountless consumers to reveal and prove their identities (e.g., using government-issued documents, utility bills, or credit card numbers) as part of issuing a verifiable consumer request. While understandable as a short-term fix, this approach is cumbersome and expensive for service providers as well as privacy-invasive for consumers. Consequently, there is a strong need to provide better means of authenticating requests from accountless consumers. To achieve this, we propose VICEROY, a privacy-preserving and scalable framework for producing proofs of data ownership, which form a basis for verifiable consumer requests. Building upon existing web techniques and features, VICEROY allows accountless consumers to interact with service providers, and later prove that they are the same person in a privacy-preserving manner, while requiring minimal changes for both parties. We design and implement VICEROY with emphasis on security/privacy, deployability and usability. We also assess its practicality via extensive experiments. 
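VICEROY's actual construction is not detailed in this abstract and is more involved than this; purely as an illustration of how an accountless consumer could later prove a link to previously collected data, the sketch below uses a simple hash commitment held by the browser. It assumes the sha2 crate, and all names are hypothetical.

```rust
// Needs the `sha2` crate (e.g., sha2 = "0.10") in Cargo.toml.
use sha2::{Digest, Sha256};

/// What the provider stores alongside data collected from an accountless visitor.
struct CollectionRecord {
    commitment: Vec<u8>, // H(secret) received at collection time
    data: String,        // the personal data associated with that visit
}

/// Provider-side check of a later consumer request that reveals the secret.
fn verify_request(record: &CollectionRecord, revealed_secret: &[u8]) -> bool {
    Sha256::digest(revealed_secret).to_vec() == record.commitment
}

fn main() {
    // Client side, at collection time: keep `secret`, transmit only its hash.
    let secret = b"browser-held random value (use a real CSPRNG in practice)";
    let commitment = Sha256::digest(secret).to_vec();

    // Provider side: store the commitment next to the collected data.
    let record = CollectionRecord {
        commitment,
        data: "browsing history of an accountless visitor".to_string(),
    };

    // Later: the consumer reveals the secret in a verifiable consumer request.
    println!("request verified: {} (covers: {})",
             verify_request(&record, secret), record.data);
}
```

A client-held random secret lets the consumer prove continuity with the original visit without revealing an identity document, which is the property a verifiable consumer request from an accountless consumer needs; the deployed system would of course handle key storage, replay, and multiple visits more carefully.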