Title: Projected federated averaging with heterogeneous differential privacy
Federated Learning (FL) is a promising framework for multiple clients to learn a joint model without directly sharing the data. In addition to high utility of the joint model, rigorous privacy protection of the data and communication efficiency are important design goals. Many existing efforts achieve rigorous privacy by ensuring differential privacy for intermediate model parameters; however, they assume a uniform privacy parameter for all the clients. In practice, different clients may have different privacy requirements due to varying policies or preferences. In this paper, we focus on explicitly modeling and leveraging the heterogeneous privacy requirements of different clients and study how to optimize utility for the joint model while minimizing communication cost. As differentially private perturbations affect the model utility, a natural idea is to make better use of information submitted by the clients with higher privacy budgets (referred to as "public" clients, and the opposite as "private" clients). The challenge is how to use such information without biasing the joint model. We propose Projected Federated Averaging (PFA), which extracts the top singular subspace of the model updates submitted by "public" clients and uses it to project the model updates of "private" clients before aggregating them. We then propose communication-efficient PFA+, which allows "private" clients to upload projected model updates instead of the original ones. Our experiments verify the utility boost of both algorithms compared to the baseline methods, with PFA+ additionally achieving over 99% uplink communication reduction for "private" clients.
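The projection step at the heart of PFA can be made concrete with a few lines of linear algebra. Below is a minimal sketch, assuming NumPy and toy dimensions; the variable names, the rank k, and the way updates are stacked are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the PFA projection step (illustrative; not the paper's code).
import numpy as np

rng = np.random.default_rng(0)
d, n_public, k = 1000, 10, 5      # model dim, "public" clients, subspace rank (assumed)

# Stack the "public" clients' model updates as columns of a d x n_public matrix.
public_updates = rng.normal(size=(d, n_public))

# Orthonormal basis for the top-k singular subspace of the public updates.
U, _, _ = np.linalg.svd(public_updates, full_matrices=False)
V_k = U[:, :k]                    # d x k

def project(update):
    """Project a 'private' client's update onto the public subspace."""
    return V_k @ (V_k.T @ update)

private_update = rng.normal(size=d)   # toy stand-in for a DP-perturbed update
aggregand = project(private_update)   # aggregated in place of the raw update

# PFA+ intuition: a "private" client uploads only the k projection
# coefficients (k << d), which is the source of the uplink reduction.
coeffs = V_k.T @ private_update
```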
Award ID(s): 2125530, 1952192, 2124104, 1838200
PAR ID: 10332814
Date Published:
Journal Name: Proceedings of the VLDB Endowment
Volume: 15
Issue: 4
ISSN: 2150-8097
Page Range / eLocation ID: 828 to 840
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Providing privacy protection has been one of the primary motivations of Federated Learning (FL). Recently, there has been a line of work on incorporating the formal privacy notion of differential privacy with FL. To guarantee client-level differential privacy in FL algorithms, the clients' transmitted model updates have to be clipped before privacy noise is added. This clipping operation is substantially different from its counterpart, gradient clipping, in centralized differentially private SGD, and has not been well understood. In this paper, we first empirically demonstrate that clipped FedAvg can perform surprisingly well even with substantial data heterogeneity when training neural networks, partly because the clients' updates become similar for several popular deep architectures. Based on this key observation, we provide a convergence analysis of a differentially private (DP) FedAvg algorithm and highlight the relationship between the clipping bias and the distribution of the clients' updates. To the best of our knowledge, this is the first work that rigorously investigates theoretical and empirical issues regarding the clipping operation in FL algorithms.
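To make the clipping operation concrete, here is a hedged sketch of the clip-then-perturb step in a DP-FedAvg round; the clip norm C, noise multiplier, and client count are illustrative placeholders, not values from the paper.

```python
# Clip-then-perturb step of DP-FedAvg (illustrative parameters).
import numpy as np

rng = np.random.default_rng(1)
d, n_clients = 100, 10
C, noise_multiplier = 1.0, 0.8          # clip norm and noise scale (assumed)

client_updates = [rng.normal(size=d) for _ in range(n_clients)]

def clip(update, C):
    """Scale the whole model update so its L2 norm is at most C.

    Unlike per-example gradient clipping in centralized DP-SGD, the entire
    client update is clipped, which is the operation studied above."""
    norm = np.linalg.norm(update)
    return update * min(1.0, C / max(norm, 1e-12))

clipped = [clip(u, C) for u in client_updates]
# Gaussian noise calibrated to the clip norm gives client-level DP.
noisy_sum = sum(clipped) + rng.normal(scale=C * noise_multiplier, size=d)
global_update = noisy_sum / n_clients   # the server's averaged model update
```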
  2. A status updating system is considered in which a source updates a destination over an erasure channel. The utility of the updates is measured through a function of their age-of-information (AoI), which assesses their freshness. Correlated with the status updates is another process that needs to be kept private from the destination. Privacy is measured through a leakage function that depends on the amount and time of the status updates received: stale updates are more private than fresh ones. Different from most of the current AoI literature, a post-sampling waiting time is introduced in order to provide a privacy cover at the expense of AoI. More importantly, it is also shown that, depending on the leakage budget and the channel statistics, it can be useful to retransmit stale status updates following erasure events without resampling fresh ones. 
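The trade-off described here (waiting, and retransmitting stale updates, to buy privacy at the cost of age) can be illustrated with a toy simulation. The unit slot times, fixed waiting time, and geometric erasure model below are simplifying assumptions, not the paper's model.

```python
# Toy AoI simulation: a post-sampling wait plus retransmission of the
# same (stale) update after erasures, without resampling a fresh one.
import numpy as np

rng = np.random.default_rng(2)
p_erasure, wait = 0.3, 2.0       # erasure probability and waiting time (assumed)

t, ages = 0.0, []
for _ in range(10_000):
    gen_time = t                 # sample a fresh update
    t += wait                    # post-sampling waiting (the privacy cover)
    while rng.random() < p_erasure:
        t += 1.0                 # erased: retransmit the stale update
    t += 1.0                     # successful transmission slot
    ages.append(t - gen_time)    # age of the update on delivery

print(f"mean age at delivery: {np.mean(ages):.2f}")
```

Increasing `wait` makes every delivered update staler (more private) while directly inflating the age-of-information, which is the tension the abstract analyzes.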
  3. We perform a rigorous study of private matrix analysis when only the last W updates to the matrices are considered useful for analysis. We show that the existing framework in the non-private setting is not robust to the noise required for privacy. We then propose a noise-robust framework and use it to give the first efficient o(W)-space differentially private algorithms for spectral approximation, principal component analysis (PCA), multi-response linear regression, sparse PCA, and non-negative PCA. Prior to our work, no such result was known for sparse and non-negative differentially private PCA, even in the static data setting. We also give a lower bound that demonstrates the cost of privacy in the sliding window model.
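For intuition, the sketch below keeps a sliding window of the last W updates and perturbs the empirical covariance with symmetric Gaussian noise before PCA. Note that this naive baseline stores the full window and thus uses O(W) space; the paper's contribution is precisely algorithms that avoid this (o(W) space), so this is background machinery, not the proposed method.

```python
# Naive O(W)-space sliding-window DP-PCA baseline (illustrative only).
import numpy as np
from collections import deque

rng = np.random.default_rng(3)
d, W, sigma = 10, 100, 0.5        # dimension, window length, noise scale (assumed)
window = deque(maxlen=W)          # keeps only the last W row updates

def dp_top_eigvec(window):
    """Top principal direction of the windowed covariance, with Gaussian noise."""
    A = np.vstack(window)
    cov = A.T @ A
    noise = rng.normal(scale=sigma, size=(d, d))
    noise = (noise + noise.T) / 2               # symmetrize the perturbation
    vals, vecs = np.linalg.eigh(cov + noise)
    return vecs[:, -1]

for _ in range(250):
    window.append(rng.normal(size=d))           # stream of row updates
v = dp_top_eigvec(window)
```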
  4. Increasingly large trip demands have strained urban transportation capacity, leading to traffic congestion and rapid growth of greenhouse gas emissions. In this work, we focus on achieving sustainable transportation by incentivizing passengers to switch from private cars to public transport. We address the following challenges. First, passengers incur inconvenience costs when changing their transit behaviors due to delay and discomfort, and thus need to be reimbursed. Second, this inconvenience cost is unknown to the government when choosing the incentives. Furthermore, changing transit behaviors raises privacy concerns from passengers: an adversary could infer personal information (e.g., daily routine, region of interest, and wealth) by observing the decisions made by the government, which are known to the public. We adopt the concept of differential privacy and propose privacy-preserving incentive designs under two settings, denoted two-way communication and one-way communication. Under two-way communication, passengers submit bids and the government then determines the incentives, whereas under one-way communication, the government simply sets a price without acquiring information from the passengers. We formulate the problem under two-way communication as a mixed integer linear program and propose a polynomial-time approximation algorithm. We show that the proposed approach achieves truthfulness, individual rationality, social optimality, and differential privacy. Under one-way communication, we focus on how the government should design the incentives without revealing passengers' inconvenience costs while still preserving differential privacy. We formulate the problem as a convex program and propose a differentially private and near-optimal solution algorithm. A numerical case study using the Caltrans Performance Measurement System (PeMS) data source is presented for evaluation. The results show that the proposed approaches achieve a win-win situation in which both the government and passengers obtain non-negative utilities.
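As a toy illustration of the one-way setting, the sketch below computes a price from aggregate cost estimates and perturbs it with Laplace noise before publication; the percentile objective, sensitivity, and budget are illustrative stand-ins for the paper's convex program, not its algorithm.

```python
# Output-perturbed price setting (illustrative stand-in, not the paper's method).
import numpy as np

rng = np.random.default_rng(4)
epsilon, sensitivity = 1.0, 0.1       # privacy budget and price sensitivity (assumed)

# Prior estimates of passengers' inconvenience costs (synthetic data).
est_costs = rng.uniform(1.0, 5.0, size=500)

# Stand-in for solving the convex program: price covering most passengers' costs.
optimal_price = np.quantile(est_costs, 0.8)

# Laplace noise on the published price limits what an adversary can infer
# about any individual's cost from the government's public decision.
dp_price = optimal_price + rng.laplace(scale=sensitivity / epsilon)
```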
  5. We present Secrecy, a system for privacy-preserving collaborative analytics as a service. Secrecy allows multiple data holders to contribute their data towards a joint analysis in the cloud while keeping the data siloed even from the cloud providers. At the same time, it enables cloud providers to offer their services to clients who would otherwise have refused to perform a computation altogether or insisted that it be done on private infrastructure. Secrecy ensures no information leakage and provides provable security guarantees by employing cryptographically secure Multi-Party Computation (MPC). In Secrecy, we take a novel approach to optimizing MPC execution by co-designing multiple layers of the system stack and exposing the MPC costs to the query engine. To achieve practical performance, Secrecy applies physical optimizations that amortize the inherent MPC overheads, along with logical optimizations that dramatically reduce the computation, communication, and space requirements during query execution. Our multi-cloud experiments demonstrate that Secrecy improves query performance by over 1000x compared to existing approaches and computes complex analytics on millions of data records with modest use of resources.
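The core MPC idea that such a system builds on can be illustrated with additive secret sharing: no single party sees any input, yet the parties can jointly compute an aggregate. This is a minimal sketch of the sharing primitive only; Secrecy itself employs a full cryptographic MPC protocol plus the query optimizations described above.

```python
# Additive secret sharing over a prime field (primitive only, not Secrecy's protocol).
import random

P = 2**61 - 1                  # prime modulus (assumed)

def share(x, n=3):
    """Split x into n additive shares mod P; any n-1 shares reveal nothing."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

values = [42, 7, 100]          # each data holder's private input

# Party i holds the i-th share of every input and sums its shares locally.
party_shares = list(zip(*(share(v) for v in values)))
partial_sums = [sum(s) % P for s in party_shares]

# Combining the parties' partial sums reconstructs only the total, not the inputs.
assert sum(partial_sums) % P == sum(values) % P
```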