
Search for: All records

Creators/Authors contains: "Hawkins, Calvin"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. This paper develops a framework for privatizing the spectrum of the Laplacian of an undirected graph using differential privacy. We consider two privacy formulations. The first obfuscates the presence of edges in the graph and the second obfuscates the presence of nodes. We compare these two privacy formulations and show that the formulation that considers edges is better suited to most engineering applications. We use the bounded Laplace mechanism to provide (epsilon, delta)-differential privacy to the eigenvalues of a graph Laplacian, and we pay special attention to the algebraic connectivity, which is the Laplacian's second-smallest eigenvalue. Analytical bounds are presented on the accuracy of the mechanisms and on certain graph properties computed with private spectra. A suite of numerical examples confirms the accuracy of private spectra in practice.
    Free, publicly-accessible full text available March 1, 2025
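The bounded Laplace mechanism described above can be sketched as a rejection sampler: Laplace noise is resampled until the perturbed eigenvalue lies in a valid interval. The sketch below is illustrative only; the graph, the noise scale, and the support interval are stand-in choices, and the calibration of the scale to a specific (epsilon, delta) pair is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplacian_eigenvalues(adj):
    """Eigenvalues of the graph Laplacian L = D - A, sorted ascending."""
    deg = np.diag(adj.sum(axis=1))
    return np.sort(np.linalg.eigvalsh(deg - adj))

def bounded_laplace(value, scale, lo, hi, rng):
    """Laplace noise centered at `value`, resampled until the result lies
    in [lo, hi] -- a rejection-sampling view of the bounded Laplace
    mechanism (the scale-to-epsilon calibration is not shown here)."""
    while True:
        sample = value + rng.laplace(scale=scale)
        if lo <= sample <= hi:
            return sample

# Path graph on 4 nodes (illustrative example)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

eigs = laplacian_eigenvalues(A)
lambda2 = eigs[1]  # algebraic connectivity

# Every Laplacian eigenvalue of an n-node graph lies in [0, n], so
# [0, n] is a natural support for the bounded mechanism.
private_lambda2 = bounded_laplace(lambda2, scale=0.5, lo=0.0, hi=len(A), rng=rng)
```

Because the output is confined to the eigenvalue's feasible range, the private value never violates basic spectral constraints, which is one reason bounded noise improves accuracy over unbounded noise.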
  2. Stochastic matrices are commonly used to analyze Markov chains, but revealing them can leak sensitive information. Therefore, in this paper we introduce a technique to privatize stochastic matrices in a way that (i) conceals the probabilities they contain, and (ii) still allows for accurate analyses of Markov chains. Specifically, we use differential privacy, which is a statistical framework for protecting sensitive data. To implement it, we introduce the Matrix Dirichlet Mechanism, which is a probabilistic mapping that perturbs a stochastic matrix to provide privacy. We prove that this mechanism provides differential privacy, and we quantify the error induced in private stochastic matrices as a function of the strength of privacy being provided. We then bound the distance between the stationary distribution of the underlying, sensitive stochastic matrix and the stationary distribution of its privatized form. Numerical results show that, under typical conditions, privacy introduces error as low as 5.05% in the stationary distribution of a stochastic matrix. 
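The idea of a Dirichlet-based perturbation can be illustrated with a row-wise sketch: each row of a stochastic matrix is replaced by a Dirichlet draw centered on it, so the output stays row-stochastic. This is a stand-in for the paper's Matrix Dirichlet Mechanism, not its exact construction; the matrix, the concentration parameter `k`, and its relation to a privacy level are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def dirichlet_perturb(P, k, rng):
    """Replace each row of a strictly positive stochastic matrix with a
    draw from Dirichlet(k * row). The result is again row-stochastic;
    larger k concentrates the draw near the original row (more accuracy,
    weaker perturbation). Illustrative sketch only."""
    return np.vstack([rng.dirichlet(k * row) for row in P])

def stationary(P, iters=5000):
    """Stationary distribution via power iteration on the left eigenvector."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(iters):
        pi = pi @ P
    return pi

P = np.array([[0.8, 0.2],
              [0.3, 0.7]])
P_priv = dirichlet_perturb(P, k=200.0, rng=rng)

pi_true = stationary(P)       # [0.6, 0.4] for this chain
pi_priv = stationary(P_priv)  # close to pi_true for large k
```

Comparing `pi_true` and `pi_priv` mirrors the paper's analysis: the distance between the two stationary distributions shrinks as the perturbation concentrates, at the cost of weaker privacy.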
  3. Graphs are the dominant formalism for modeling multi-agent systems. The algebraic connectivity of a graph is particularly important because it provides the convergence rates of consensus algorithms that underlie many multi-agent control and optimization techniques. However, sharing the value of algebraic connectivity can inadvertently reveal sensitive information about the topology of a graph, such as connections in social networks. Therefore, in this work we present a method to release a graph’s algebraic connectivity under a graph-theoretic form of differential privacy, called edge differential privacy. Edge differential privacy obfuscates differences among graphs’ edge sets and thus conceals the absence or presence of sensitive connections therein. We provide privacy with bounded Laplace noise, which improves accuracy relative to conventional unbounded noise. The private algebraic connectivity values are analytically shown to provide accurate estimates of consensus convergence rates, as well as accurate bounds on the diameter of a graph and the mean distance between its nodes. Simulation results confirm the utility of private algebraic connectivity in these contexts. 
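The link between algebraic connectivity and consensus convergence can be checked numerically: in discrete-time consensus, the disagreement contracts by roughly 1 - alpha*lambda2 per step when the step size alpha is small, so even a private estimate of lambda2 predicts the rate. The graph, step size, and horizon below are illustrative choices.

```python
import numpy as np

def consensus_rate_demo(adj, alpha=0.1, steps=200):
    """Run discrete-time consensus x <- x - alpha * L x and return the
    measured per-step contraction of the disagreement vector, which
    approaches 1 - alpha * lambda2 once the slowest mode dominates."""
    n = adj.shape[0]
    L = np.diag(adj.sum(axis=1)) - adj
    x = np.arange(n, dtype=float)          # arbitrary initial states
    for _ in range(steps):
        x = x - alpha * (L @ x)
    d1 = x - x.mean()                      # disagreement after `steps`
    x2 = x - alpha * (L @ x)               # one more step
    d2 = x2 - x2.mean()
    return np.linalg.norm(d2) / np.linalg.norm(d1)

# Path graph on 4 nodes (illustrative example)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

lambda2 = np.sort(np.linalg.eigvalsh(np.diag(A.sum(axis=1)) - A))[1]
rate = consensus_rate_demo(A)
# rate is close to 1 - 0.1 * lambda2 for this graph
```

The measured contraction matching 1 - alpha*lambda2 is exactly why an accurate private release of lambda2 preserves the utility of the convergence-rate estimate.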
    As multi-agent systems proliferate, there is increasing demand for coordination protocols that protect agents’ sensitive information while allowing them to collaborate. To help address this need, this paper presents a differentially private formation control framework. Agents’ state trajectories are protected using differential privacy, which is a statistical notion of privacy that protects data by adding noise to it. We provide a private formation control implementation and analyze the impact of privacy upon the system. Specifically, we quantify tradeoffs between privacy level, system performance, and connectedness of the network’s communication topology. These tradeoffs are used to develop guidelines for calibrating privacy in terms of control theoretic quantities, such as steady-state error, without requiring in-depth knowledge of differential privacy. Additional guidelines are also developed for treating privacy levels and network topologies as design parameters to tune the network’s performance. Simulation results illustrate these tradeoffs and show that strict privacy is inherently compatible with strong system performance.
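The privacy/performance tradeoff in formation control can be sketched with input perturbation: each agent broadcasts a Laplace-perturbed copy of its state, and noisier broadcasts yield larger steady-state error. This is an illustrative toy, not the paper's implementation; the graph, offsets, noise scale `b`, and gain `alpha` are stand-in choices, and the mapping from `b` to a privacy level is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def private_formation(adj, offsets, b, alpha=0.1, steps=500, rng=rng):
    """Consensus-style formation control where each agent reports a
    Laplace-perturbed deviation from its desired offset. With b = 0 the
    formation is reached exactly; b > 0 trades steady-state error for
    privacy. Illustrative sketch under simplifying assumptions."""
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    x = rng.normal(size=n)                 # random initial positions
    for _ in range(steps):
        y = x - offsets                    # deviation from desired offsets
        y_reported = y + rng.laplace(scale=b, size=n) if b > 0 else y
        x = x + alpha * (adj @ y_reported - deg * y)
    return x - offsets                     # equal entries = formation achieved

# Path graph on 4 nodes with a line formation (illustrative example)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
offsets = np.array([0.0, 1.0, 2.0, 3.0])

y_exact = private_formation(A, offsets, b=0.0)   # converges to formation
y_noisy = private_formation(A, offsets, b=0.5)   # residual disagreement
```

The spread of `y_noisy` relative to `y_exact` is a simple proxy for the steady-state error that the paper's guidelines calibrate against the chosen privacy level and topology.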