Title: Quantum Pufferfish Privacy: A Flexible Privacy Framework for Quantum Systems
We propose a versatile privacy framework for quantum systems, termed quantum pufferfish privacy (QPP). Inspired by classical pufferfish privacy, our formulation generalizes and addresses limitations of quantum differential privacy by offering flexibility in specifying private information, feasible measurements, and domain knowledge. We show that QPP can be equivalently formulated in terms of the Datta–Leditzky information spectrum divergence, thus providing the first operational interpretation thereof. We reformulate this divergence as a semi-definite program and derive several properties of it, which are then used to prove convexity, composability, and post-processing of QPP mechanisms. Parameters that guarantee QPP of the depolarization mechanism are also derived. We analyze the privacy-utility tradeoff of general QPP mechanisms and, again, study the depolarization mechanism as an explicit instance. The QPP framework is then applied to privacy auditing for identifying privacy violations via a hypothesis testing pipeline that leverages quantum algorithms. Connections to quantum fairness and other quantum divergences are also explored and several variants of QPP are examined.
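The depolarization mechanism mentioned in the abstract can be illustrated numerically. The sketch below (pure Python, with a density matrix as a nested list) applies the standard depolarizing channel ρ ↦ (1 − p)ρ + p·I/d; the parameter choices that actually guarantee QPP are derived in the paper and are not reproduced here.

```python
def depolarize(rho, p):
    """Depolarizing channel: rho -> (1 - p) * rho + p * I / d.

    `rho` is a d x d density matrix given as a nested list; `p` in [0, 1]
    is the depolarization strength (illustrative only -- the values of p
    that guarantee QPP are derived in the paper).
    """
    d = len(rho)
    return [[(1.0 - p) * rho[i][j] + (p / d if i == j else 0.0)
             for j in range(d)]
            for i in range(d)]

# Pure qubit state |0><0|: full depolarization gives the maximally mixed state.
pure = [[1.0, 0.0], [0.0, 0.0]]
print(depolarize(pure, 1.0))   # [[0.5, 0.0], [0.0, 0.5]]
```

Trace is preserved for any p, and larger p moves every input state closer to I/d, which is the intuition behind its privacy-utility tradeoff.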
Award ID(s):
2315398 2046018
PAR ID:
10534452
Author(s) / Creator(s):
; ;
Publisher / Repository:
IEEE
Date Published:
Journal Name:
IEEE Transactions on Information Theory
Volume:
70
Issue:
8
ISSN:
0018-9448
Page Range / eLocation ID:
5731 to 5762
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. As network services progress and mobile and IoT environments expand, numerous security concerns have surfaced for spectrum access systems (SASs). The omnipresent risk of Denial-of-Service (DoS) attacks and rising concerns about user privacy (e.g., location privacy, anonymity) are among such cyber threats. These security and privacy risks increase due to the threat of quantum computers that can compromise long-term security by circumventing conventional cryptosystems and increasing the cost of countermeasures. While some defense mechanisms exist against these threats in isolation, there is a significant gap in the state of the art on a holistic solution against DoS attacks with privacy and anonymity for spectrum management systems, especially when post-quantum (PQ) security is in mind. In this paper, we propose a new cybersecurity framework, PACDoSQ, which is the first to offer location privacy and anonymity for spectrum management with counter-DoS protection and PQ security simultaneously. Our solution introduces the private spectrum bastion concept to exploit existing architectural features of SASs and then synergizes them with multi-server private information retrieval and PQ-secure Tor to guarantee a location-private and anonymous acquisition of spectrum information, together with hash-based client-server puzzles to counter DoS. We prove that PACDoSQ achieves its security objectives and show its feasibility via a comprehensive performance evaluation.
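PACDoSQ's exact puzzle construction is specified in the paper; as a rough illustration of the general hash-based client-server puzzle idea it builds on, here is a minimal sketch. SHA-256, the 16-byte nonce, and the difficulty parameter are assumptions for illustration, not the paper's choices.

```python
import hashlib
import os

def issue_puzzle(difficulty_bits):
    """Server side: hand the client a fresh random nonce and a difficulty."""
    return os.urandom(16), difficulty_bits

def solve_puzzle(nonce, difficulty_bits):
    """Client side: brute-force a counter until SHA-256(nonce || counter)
    starts with `difficulty_bits` zero bits. Expected work is about
    2**difficulty_bits hashes, so the server can price each request."""
    counter = 0
    while True:
        digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0:
            return counter
        counter += 1

def verify_puzzle(nonce, difficulty_bits, counter):
    """Server side: a single hash suffices to check the client's work."""
    digest = hashlib.sha256(nonce + counter.to_bytes(8, "big")).digest()
    return int.from_bytes(digest, "big") >> (256 - difficulty_bits) == 0
```

The asymmetry is the counter-DoS lever: each extra difficulty bit doubles the client's expected work, while server-side verification stays one hash.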
  2. Differential privacy (DP) is a widely used notion for reasoning about privacy when publishing aggregate data. In this paper, we observe that certain DP mechanisms are amenable to a posteriori privacy analysis that exploits the fact that some outputs leak less information about the input database than others. To exploit this phenomenon, we introduce output differential privacy (ODP) and a new composition experiment, and leverage these new constructs to obtain significant privacy budget savings and improved privacy–utility tradeoffs under composition. All of this comes at no cost in terms of privacy; we do not weaken the privacy guarantee. To demonstrate the applicability of our a posteriori privacy analysis techniques, we analyze two well-known mechanisms: the Sparse Vector Technique and the Propose-Test-Release framework. We then show how our techniques can be used to save privacy budget in more general contexts: when a differentially private iterative mechanism terminates before its maximal number of iterations is reached, and when the output of a DP mechanism provides unsatisfactory utility. Examples of the former include iterative optimization algorithms, whereas examples of the latter include training a machine learning model with a large generalization error. Our techniques can be applied beyond the current paper to refine the analysis of existing DP mechanisms or guide the design of future mechanisms. 
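One of the two mechanisms the abstract analyzes, the Sparse Vector Technique, can be sketched in its classic AboveThreshold form. This is the textbook mechanism with the standard budget split between threshold and query noise, not the paper's ODP refinement.

```python
import math
import random

def _laplace(rng, scale):
    """Sample Laplace(0, scale) via inverse-CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def above_threshold(query_values, threshold, epsilon, sensitivity=1.0, seed=0):
    """Classic AboveThreshold: release the index of the first query whose
    noisy value exceeds a noisy threshold. One epsilon is spent in total,
    no matter how many queries are scanned before the first hit."""
    rng = random.Random(seed)  # seeded only to make this sketch reproducible
    noisy_threshold = threshold + _laplace(rng, 2.0 * sensitivity / epsilon)
    for i, q in enumerate(query_values):
        if q + _laplace(rng, 4.0 * sensitivity / epsilon) >= noisy_threshold:
            return i  # the only released output
    return None  # nothing exceeded the threshold
```

The a posteriori idea in the abstract is that different outputs of a mechanism like this can leak different amounts about the database, which the ODP analysis (not shown here) exploits to save budget under composition.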
  3. This paper introduces a differentially private (DP) mechanism to protect the information exchanged during the coordination of sequential and interdependent markets. This coordination represents a classic Stackelberg game and relies on the exchange of sensitive information between the system agents. The paper is motivated by the observation that the perturbation introduced by traditional DP mechanisms fundamentally changes the underlying optimization problem and can even lead to unsatisfiable instances. To remedy this limitation, the paper introduces the Privacy-Preserving Stackelberg Mechanism (PPSM), a framework that enforces the notions of feasibility and fidelity (i.e., near-optimality) of the privacy-preserving information with respect to the original problem objective. PPSM complies with the notion of differential privacy and ensures that the outcomes of the privacy-preserving coordination mechanism are close to optimal for each agent. Experimental results on several gas and electricity market benchmarks based on a real case study demonstrate the effectiveness of the proposed approach. A full version of this paper [Fioretto et al., 2020b] contains complete proofs and additional discussion of the motivating application.
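PPSM's feasibility and fidelity restoration is an optimization-based construction detailed in the paper; the toy sketch below only illustrates the underlying tension it addresses: raw Laplace noise can push an exchanged quantity outside its feasible range, and projecting it back onto that range is DP-preserving post-processing. The function name, bounds, and parameters are illustrative assumptions.

```python
import math
import random

def _laplace(rng, scale):
    """Sample Laplace(0, scale) via inverse-CDF."""
    u = rng.random() - 0.5
    return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

def release_quantity(value, lower, upper, sensitivity, epsilon, seed=0):
    """Perturb an exchanged market quantity with the Laplace mechanism, then
    project it onto its feasible interval [lower, upper]. Projection is
    post-processing, so the epsilon-DP guarantee of the noisy value carries
    over, while the released number is always feasible."""
    rng = random.Random(seed)
    noisy = value + _laplace(rng, sensitivity / epsilon)
    return min(max(noisy, lower), upper)
```

The unclamped noisy value could be negative or exceed a line or pipeline capacity, which is exactly the unsatisfiability problem the abstract points to; PPSM goes further than this interval clamp by also restoring near-optimality of the coordinated solution.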
  4. Weller, Adrian (Ed.)
    While differential privacy (DP) offers strong theoretical privacy guarantees, implementations of DP mechanisms may be vulnerable to side-channel attacks, such as timing attacks. When sampling methods such as MCMC or rejection sampling are used to implement a privacy mechanism, the runtime can leak private information. We characterize the additional privacy cost due to the runtime of a rejection sampler in terms of both (ε, δ)-DP as well as f-DP. We also show that unless the acceptance probability is constant across databases, the runtime of a rejection sampler does not satisfy ε-DP for any ε. We show that there is a similar breakdown in privacy with adaptive rejection samplers. We propose three modifications to the rejection sampling algorithm, with varying assumptions, to protect against timing attacks by making the runtime independent of the data. The modification with the weakest assumptions is an approximate sampler, introducing a small increase in the privacy cost, whereas the other modifications give perfect samplers. We also use our techniques to develop an adaptive rejection sampler for log-Hölder densities, which also has data-independent runtime. We give several examples of DP mechanisms that fit the assumptions of our methods and can thus be implemented using our samplers.
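The timing leak and the fixed-runtime fix can be sketched as follows. This is a simplified uniform-proposal sampler with a caller-supplied acceptance function; the paper's three modifications are more refined than this fixed-budget variant, which is only meant to show why a data-independent iteration count closes the side channel.

```python
import random

def rejection_sample(accept_prob, rng):
    """Vanilla rejection sampler over uniform proposals on [0, 1). The loop
    runs until a proposal is accepted, so the iteration count (and hence the
    runtime) depends on the acceptance probability: a timing side channel
    when that probability depends on the private data."""
    while True:
        x = rng.random()
        if rng.random() < accept_prob(x):
            return x

def fixed_budget_sample(accept_prob, rng, budget=64):
    """Approximate timing-safe variant: always perform exactly `budget`
    proposal/accept rounds so the runtime is data-independent. Returns the
    first accepted proposal, or None (with small probability) if every
    proposal was rejected -- the small extra privacy cost of an
    approximate sampler that the abstract mentions."""
    result = None
    for _ in range(budget):
        x = rng.random()
        accepted = rng.random() < accept_prob(x)
        if result is None and accepted:
            result = x
    return result
```

Note that `fixed_budget_sample` keeps drawing and testing proposals even after one is accepted, so every call performs the same number of random draws regardless of the data.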