Graphs are the dominant formalism for modeling multi-agent systems. The algebraic connectivity of a graph is particularly important because it determines the convergence rate of the consensus algorithms that underlie many multi-agent control and optimization techniques. However, sharing the value of algebraic connectivity can inadvertently reveal sensitive information about the topology of a graph, such as connections in social networks. Therefore, in this work we present a method to release a graph's algebraic connectivity under a graph-theoretic form of differential privacy called edge differential privacy. Edge differential privacy obfuscates differences among graphs' edge sets and thus conceals the absence or presence of sensitive connections therein. We provide privacy with bounded Laplace noise, which improves accuracy relative to conventional unbounded noise. The private algebraic connectivity values are analytically shown to provide accurate estimates of consensus convergence rates, as well as accurate bounds on the diameter of a graph and the mean distance between its nodes. Simulation results confirm the utility of private algebraic connectivity in these contexts.
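To make the released quantity concrete, the sketch below computes the algebraic connectivity (the second-smallest Laplacian eigenvalue) of a small graph and releases it with Laplace noise. It is an illustration only: the function names, the epsilon value, and the sensitivity value are placeholder assumptions, and it uses plain rather than bounded Laplace noise, so it is not the mechanism analyzed in the work.

```python
# Illustrative sketch only (not the paper's mechanism): compute the
# algebraic connectivity lambda_2 of an undirected graph and release it
# with plain Laplace noise. `epsilon` and `sensitivity` are placeholder
# assumptions; the paper uses bounded Laplace noise with its own calibration.
import numpy as np

def algebraic_connectivity(adjacency: np.ndarray) -> float:
    """Second-smallest eigenvalue of the graph Laplacian L = D - A."""
    degrees = adjacency.sum(axis=1)
    laplacian = np.diag(degrees) - adjacency
    eigenvalues = np.linalg.eigvalsh(laplacian)  # sorted ascending
    return float(eigenvalues[1])

def private_algebraic_connectivity(adjacency, epsilon=1.0, sensitivity=2.0,
                                   rng=None):
    """Release lambda_2 perturbed by Laplace noise of scale sensitivity/epsilon."""
    rng = rng or np.random.default_rng()
    noise = rng.laplace(scale=sensitivity / epsilon)
    return algebraic_connectivity(adjacency) + noise

# 4-cycle example: the true lambda_2 is 2.0.
A = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])
print(algebraic_connectivity(A), private_algebraic_connectivity(A))
```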
Node and Edge Differential Privacy for Graph Laplacian Spectra: Mechanisms and Scaling Laws
This paper develops a framework for privatizing the spectrum of the Laplacian of an undirected graph using differential privacy. We consider two privacy formulations: the first obfuscates the presence of edges in the graph and the second obfuscates the presence of nodes. We compare these two privacy formulations and show that the formulation that considers edges is better suited to most engineering applications. We use the bounded Laplace mechanism to provide (epsilon, delta)-differential privacy to the eigenvalues of a graph Laplacian, and we pay special attention to the algebraic connectivity, which is the Laplacian's second-smallest eigenvalue. Analytical bounds are presented on the accuracy of the mechanisms and on certain graph properties computed with private spectra. A suite of numerical examples confirms the accuracy of private spectra in practice.
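One common way to realize a bounded Laplace mechanism is rejection sampling: noise is redrawn until the perturbed value lies in a valid output interval, as in the minimal sketch below. The interval and noise scale here are illustrative assumptions; obtaining a specific (epsilon, delta) guarantee requires the scale calibration derived in the paper.

```python
# Minimal sketch of a bounded Laplace mechanism via rejection sampling.
# The output interval and noise scale are illustrative assumptions;
# achieving a stated (epsilon, delta) guarantee requires the calibration
# derived in the paper rather than this fixed scale.
import numpy as np

def bounded_laplace(value, scale, lower, upper, rng=None):
    """Perturb `value` with Laplace noise, resampling until the result
    lies inside [lower, upper]."""
    rng = rng or np.random.default_rng()
    if not (lower <= value <= upper):
        raise ValueError("true value must lie inside the output interval")
    while True:
        candidate = value + rng.laplace(scale=scale)
        if lower <= candidate <= upper:
            return candidate

# Every eigenvalue of an n-node graph Laplacian lies in [0, n], so that
# interval is a natural choice when privatizing a single eigenvalue.
n = 10
print(bounded_laplace(value=1.7, scale=2.0, lower=0.0, upper=float(n)))
```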
- Award ID(s): 1943275
- PAR ID: 10494459
- Publisher / Repository: IEEE Xplore
- Date Published:
- Journal Name: IEEE Transactions on Network Science and Engineering
- Volume: 11
- Issue: 2
- ISSN: 2334-329X
- Page Range / eLocation ID: 1690 to 1701
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- In this paper, we develop distributed computation algorithms for Nash equilibria of linear quadratic network games with proven differential privacy guarantees. In a network game in which each player's payoff is a quadratic function, the dependencies among decisions in the payoff function naturally encode a network structure governing the players' interpersonal influences. This social influence structure and the players' individual marginal payoffs reflect economic spillovers and individual preferences, and are therefore subject to privacy concerns. For distributed computation of the Nash equilibrium, the players are interconnected by a public communication graph, over which dynamical states are shared among neighboring nodes. When the players' marginal payoffs are considered private knowledge, we propose a distributed randomized gradient descent algorithm in which each player adds Laplace random noise to her marginal payoff in the recursive updates. It is proven that the algorithm guarantees differential privacy and convergence in expectation to the Nash equilibrium of the network game at each player's state. Moreover, the mean-square error between the players' states and the Nash equilibrium is shown to be bounded by a constant related to the differential privacy level. Next, when both the players' marginal payoffs and the influence graph are private information, we propose two distributed algorithms based on randomized communication and randomized projection, respectively, for privacy preservation. Differential privacy and convergence guarantees are also established for these algorithms. (A minimal sketch of this noisy-update pattern appears after this list.)
- There has been considerable controversy regarding the accuracy and privacy of de-identification mechanisms used in the U.S. Decennial Census. We theoretically and experimentally analyze two such classes of mechanisms, swapping and differential privacy, especially examining their effects on ethnoracial minority groups. We first prove that the expected error of queries made on swapped demographic datasets is greater in sub-populations whose racial distributions differ more from the racial distribution of the global population. We also prove that the probability that m unique entries exist in a sub-population shrinks exponentially as the sub-population size grows. These properties suggest that swapping, which prioritizes unique entries, will produce poor accuracy for minority groups. We then empirically analyze the impact of swapping and differential privacy on the accuracy and privacy of a demographic dataset. We evaluate accuracy in several ways, including methods that stress the effect on minority groups. We evaluate privacy by counting the number of re-identified entries in a simulated linkage attack. Finally, we explore the disproportionate presence of minority groups in identified entries. Our empirical findings corroborate our theoretical results: for minority representation, the utility of differential privacy is comparable to the utility of swapping, while providing a stronger privacy guarantee. Swapping places a disproportionate privacy burden on minority groups, whereas an ϵ-differentially private mechanism is ϵ-differentially private for all subgroups.
- Differential privacy has emerged as a gold standard in privacy-preserving data analysis. A popular variant is local differential privacy, where the data holder is the trusted curator. A major barrier to wider adoption of this model, however, is that it offers a poor privacy-utility tradeoff. In this work, we address this problem by introducing a new variant of local privacy called profile-based privacy. The central idea is that the problem setting comes with a graph G of data-generating distributions, whose edges encode sensitive pairs of distributions that should be made indistinguishable. This provides higher utility because, unlike local differential privacy, we no longer need to make every pair of private values in the domain indistinguishable, and instead only protect the identity of the underlying distribution. We establish privacy properties of the profile-based privacy definition, such as post-processing invariance and graceful composition. Finally, we provide mechanisms that are private in this framework, and show via simulations that they achieve higher utility than the corresponding local differential privacy mechanisms.
- Machine Learning (ML) algorithms have shown promising applications in smart meter data analytics, enabling intelligent energy management systems for the Advanced Metering Infrastructure (AMI). One of the major challenges in developing ML applications for the AMI is preserving user privacy while allowing active end-user participation. This paper addresses this challenge and proposes Differential Privacy-enabled AMI with Federated Learning (DP-AMI-FL), a framework for ML-based applications in the AMI. The framework provides two layers of privacy protection: first, it keeps consumers' raw data on the edge devices (smart meters) that host the ML applications, using Federated Learning (FL); second, it obfuscates the ML models using Differential Privacy (DP) to avoid privacy leakage from inference attacks on the models. The framework is evaluated by analyzing its performance on a use case aimed at improving Short-Term Load Forecasting (STLF) for residential consumers with smart meters and home energy management systems. Extensive experiments demonstrate that the framework, when used with Long Short-Term Memory (LSTM) recurrent neural network models, achieves high forecasting accuracy while preserving users' data privacy.
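As referenced in the first related item above, the sketch below illustrates the general noisy-update pattern that item describes, not that paper's exact algorithm: each player's update uses a freshly Laplace-perturbed copy of her private marginal payoff. The quadratic payoff model, function name, step size, and noise scale are assumptions for illustration.

```python
# Illustrative sketch (not the cited paper's exact algorithm): noisy
# best-response dynamics for a linear-quadratic network game whose Nash
# equilibrium solves x = W x + a. Each update perturbs the private
# marginal payoffs `a` with fresh Laplace noise; step size and noise
# scale are placeholder assumptions.
import numpy as np

def noisy_network_game_dynamics(W, a, epsilon=1.0, sensitivity=1.0,
                                step=0.1, iters=2000, rng=None):
    rng = rng or np.random.default_rng(0)
    x = np.zeros(len(a))
    scale = sensitivity / epsilon
    for _ in range(iters):
        a_noisy = a + rng.laplace(scale=scale, size=len(a))  # privatized payoffs
        x = (1 - step) * x + step * (W @ x + a_noisy)        # damped noisy update
    return x

# Three players on a path graph with weak mutual influence.
W = 0.2 * np.array([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])
a = np.array([1.0, 0.5, 1.5])
print(noisy_network_game_dynamics(W, a))
print(np.linalg.solve(np.eye(3) - W, a))  # exact Nash equilibrium for comparison
```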