Title: Cryptography, Trust and Privacy: It's Complicated
Privacy technologies support the provision of online services while protecting user privacy. Cryptography lies at the heart of many such technologies, creating remarkable possibilities in terms of functionality while offering robust guarantees of data confidentiality. The cryptography literature and discourse often represent these technologies as eliminating the need to trust service providers, i.e., as enabling users to protect their privacy even against untrusted service providers. Despite their apparent promise, privacy technologies have seen limited adoption in practice, and the most successful ones have been implemented by the very service providers these technologies purportedly protect users from. The adoption of privacy technologies by supposedly adversarial service providers highlights a mismatch between traditional models of trust in cryptography and the trust relationships that underlie deployed technologies in practice. Yet this mismatch, while well known to the cryptography and privacy communities, remains relatively poorly documented and examined in the academic literature, let alone the broader media. This paper aims to fill that gap. Firstly, we review how the deployment of cryptographic technologies relies on a chain of trust relationships embedded in the modern computing ecosystem, from the development of software to the provision of online services, that is not fully captured by traditional models of trust in cryptography. Secondly, we turn to two case studies (web search and encrypted messaging) to illustrate how, rather than removing trust in service providers, cryptographic privacy technologies shift trust to a broader community of security and privacy experts and others, which in turn enables service providers to implicitly build and reinforce their trust relationship with users. Finally, concluding that the trust models inherent in the traditional cryptographic paradigm elide certain key trust relationships underlying deployed cryptographic systems, we highlight the need for organizational, policy, and legal safeguards to address that mismatch, and suggest some directions for future work.
Award ID(s):
1704527
PAR ID:
10437692
Journal Name:
2nd ACM Symposium on Computer Science and Law.
Page Range / eLocation ID:
167 to 179
Sponsoring Org:
National Science Foundation
More Like this
  1. Joe Calandrino and Carmela Troncoso (Eds.)
    As service providers move to the cloud, users are forced to provision sensitive data to the cloud. Confidential computing leverages hardware Trusted Execution Environments (TEEs) to protect data in use, so that users no longer need to trust the cloud provider. The emerging service model, Confidential Computing as a Service (CCaaS), is adopted by service providers to offer services in a Function-as-a-Service manner. However, privacy concerns arise in CCaaS, especially in multi-user scenarios. CCaaS needs to assure data providers that the service does not leak their private data to any unauthorized parties and clears their data after the service. To address such privacy concerns with security guarantees, we first formally define the security objective, Proof of Being Forgotten (PoBF), and prove under which security constraints PoBF can be satisfied. Then, these constraints serve as guidelines in the implementation of the PoBF-compliant Framework (PoCF). PoCF consists of a generic library for different hardware TEEs, CCaaS prototype enclaves, and a verifier to prove PoBF-compliance. PoCF leverages Rust's robust type system and security features to construct a verified state machine with privacy-preserving contracts. Finally, the experimental results show that the protections introduced by PoCF incur minor runtime performance overhead.
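To give a feel for the "clear their data after the service" constraint, here is a minimal Python sketch of a runtime analogue: a context manager that guarantees a user's buffer is zeroized when the session ends, even on exceptions. PoCF enforces this property statically through Rust's type system and a verifier; this dynamic version and all names in it are illustrative, not part of PoCF's API.

```python
# Runtime analogue of the PoBF "data cleared after service" constraint.
# All class and method names are hypothetical, for illustration only.

class ConfidentialSession:
    """Holds a user's sensitive input; wipes it when the session ends."""

    def __init__(self, secret: bytes):
        self._buf = bytearray(secret)  # mutable so it can be zeroized in place

    def process(self) -> int:
        # Stand-in for the enclave's actual computation over the secret.
        return sum(self._buf) % 256

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Zeroize before the buffer is released, even if process() raised.
        for i in range(len(self._buf)):
            self._buf[i] = 0
        return False  # do not swallow exceptions

with ConfidentialSession(b"\x01\x02\x03") as session:
    result = session.process()

print(result)  # prints 6; the secret bytes are already zeroed at this point
```

The key design point mirrored here is that clearing is tied to session teardown rather than left to caller discipline; PoCF goes further by making the "cleared" state a statically checked precondition for ending the service.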
  2. This paper investigates how trust towards service providers and the adoption of privacy controls belonging to two specific purposes (control over "sharing" vs. "usage" of data) vary based on users' technical literacy. Towards that, we chose Google as the context and conducted an online survey of 209 Google users. Our results suggest that integrity and benevolence perceptions toward Google are significantly lower among technical participants than non-technical participants. While trust perceptions differ between non-technical adopters and non-adopters of privacy controls, no such difference is found among their technical counterparts. Notably, among the non-technical participants, the direction of trust affecting privacy control adoption is observed to be reversed based on the purpose of the controls. Using qualitative analysis, we extract trust-enhancing and trust-dampening factors contributing to users' trusting beliefs towards Google's protection of user privacy. The implications of our findings for the design and promotion of privacy controls are discussed in the paper.
  3. We study the relationship between Web users and service providers, taking a sociotechnical approach and focusing particularly (but not exclusively) on privacy and security of personal data. Much conventional Web-security practice seeks to protect benevolent parties, both individuals and organizations, against purely malevolent adversaries in an effort to prevent catastrophic events such as data breaches, ransomware attacks, and denial of service. By contrast, we highlight the dynamics among the parties that much conventional security technology seeks to protect. We regard most interactions between users and providers as implicit negotiations that, like the interactions between buyers and sellers in a marketplace, have both adversarial and cooperative aspects. Our goal is to rebalance these negotiations in order to give more power to users; toward that end we advocate the adoption of two techniques, one technical and one organizational. Technically, we introduce the Platform for Untrusted Resource Evaluation (PURE), a content-labeling framework that empowers users to make informed decisions about service providers, reduces the ability of providers to induce behaviors that benefit them more than users, and requires minimal time and effort to use. On the organizational side, we concur with Gordon-Tapiero et al. [19] that a collective approach is necessary to rebalance the power dynamics between users and providers; in particular, we suggest that the data co-op, an organizational form suggested by Ligett and Nissim [25] and Pentland and Hardjono [28], is a natural setting in which to deploy PURE and similar tools.
  4.
    Many companies provide neural network prediction services to users for a wide range of applications. However, current prediction systems compromise one party's privacy: either the user has to send sensitive inputs to the service provider for classification, or the service provider must store its proprietary neural networks on the user's device. The former harms the personal privacy of the user, while the latter reveals the service provider's proprietary model. We design, implement, and evaluate Delphi, a secure prediction system that allows two parties to execute neural network inference without revealing either party's data. Delphi approaches the problem by simultaneously co-designing cryptography and machine learning. We first design a hybrid cryptographic protocol that improves upon the communication and computation costs over prior work. Second, we develop a planner that automatically generates neural network architecture configurations that navigate the performance-accuracy trade-offs of our hybrid protocol. Together, these techniques allow us to achieve a 22x improvement in online prediction latency compared to the state-of-the-art prior work. 
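Delphi's actual protocol is considerably more involved (preprocessing, garbled circuits for the nonlinear layers, and a planner over accuracy trade-offs), but the core idea behind its linear layers can be sketched with additive secret sharing: each party holds one share of the client's input, applies the linear layer locally, and the shares recombine to the true result without either party seeing the other's data. The sketch below uses public weights for simplicity; Delphi additionally keeps the model private. All names and the choice of modulus are illustrative.

```python
# Minimal sketch of additive secret sharing for a linear layer, in the
# spirit of (but much simpler than) Delphi's hybrid protocol.
import random

P = 2**61 - 1  # a large prime modulus for the arithmetic shares (illustrative)

def share(x):
    """Split x into two additive shares that sum to x mod P."""
    r = random.randrange(P)
    return r, (x - r) % P

def linear_layer(weights, x_share):
    """Apply a linear layer to one share; linearity makes this share-local."""
    return [sum(w * xi for w, xi in zip(row, x_share)) % P for row in weights]

# The client secret-shares its input vector; each party keeps one share.
x = [3, 5]
shares = [share(v) for v in x]
client_share = [s[0] for s in shares]
server_share = [s[1] for s in shares]

W = [[2, 1], [0, 4]]  # linear-layer weights (public in this toy version)

# Each party computes on its own share; neither learns x from its share alone.
y_client = linear_layer(W, client_share)
y_server = linear_layer(W, server_share)

# Recombining the output shares yields W·x exactly.
y = [(a + b) % P for a, b in zip(y_client, y_server)]
print(y)  # [11, 20]
```

Because matrix multiplication is linear, it distributes over the shares for free; it is the nonlinear activations (e.g., ReLU) that force Delphi into heavier machinery such as garbled circuits, which is where its protocol/architecture co-design pays off.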
  5. In recent years, we have witnessed a rise in the popularity of networked hospitality services (NHSs), an online marketplace for short-term peer-to-peer accommodations. Such systems, however, raise significant privacy concerns, because service providers such as Airbnb and 9flats can easily collect the precise and personal information of millions of participating hosts and guests through their centralized online platforms. In this paper, we propose PrivateNH, a privacy-enhancing and practical solution that offers anonymity and accountability for NHS users without relying on any trusted third party. PrivateNH leverages recent progress in Bitcoin techniques such as Colored Coins and CoinShuffle to generate and maintain anonymous credentials for NHS participants. The credential holders (NHS hosts or guests) can then lease or rent short-term lodging and interact with the service provider in an anonymous and accountable manner. An anonymous and secure reputation system is also introduced to establish trust between unfamiliar hosts and guests in a peer-to-peer fashion. The proposed scheme is compatible with the current Bitcoin blockchain system, and its effectiveness and feasibility in the NHS scenario are demonstrated by security analysis and performance evaluation.