Title: How to Make Your Approximation Algorithm Private: A Black-Box Differentially-Private Transformation for Tunable Approximation Algorithms of Functions with Low Sensitivity.
Award ID(s):
1910411 1910659 2228814
PAR ID:
10474808
Author(s) / Creator(s):
Publisher / Repository:
Schloss Dagstuhl - Leibniz-Zentrum für Informatik
Date Published:
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
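The title above refers to a general recipe: a black-box transformation that takes a tunable approximation algorithm for a function of low sensitivity and releases its output with differential privacy. As a rough illustration of the underlying idea only, and not the paper's actual construction, the sketch below passes the approximate output through the Laplace mechanism with the noise scale inflated to absorb a worst-case additive approximation error. The helper names (`private_approximate_release`, `approx_mean`), the sampling-based estimator, and the `sensitivity + 2*alpha` calibration are illustrative assumptions; they presume the error bound alpha holds deterministically.

```python
import numpy as np


def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release a numeric value with epsilon-DP via the Laplace mechanism."""
    rng = rng if rng is not None else np.random.default_rng()
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)


def private_approximate_release(approx_alg, data, sensitivity, alpha, epsilon):
    """Hypothetical wrapper: run a tunable approximation algorithm, then add noise.

    Assumes approx_alg(data, alpha) returns an estimate within additive error
    alpha of a function f(data) whose sensitivity (max change of f between
    neighboring datasets) is `sensitivity`, and that this error bound holds
    deterministically.  Under that assumption, the approximate output changes
    by at most sensitivity + 2*alpha between neighboring datasets, so Laplace
    noise at that scale suffices for epsilon-DP when approx_alg is
    deterministic (or run with fixed coins).
    """
    estimate = approx_alg(data, alpha)
    return laplace_mechanism(estimate, sensitivity + 2 * alpha, epsilon)


def approx_mean(data, alpha, rng=np.random.default_rng(0)):
    """Toy tunable estimator: subsample to target additive error roughly alpha.

    For illustration only; its own sampling randomness is ignored in the
    privacy accounting above.
    """
    m = min(len(data), max(1, int(1.0 / alpha ** 2)))
    sample = rng.choice(data, size=m, replace=False)
    return float(np.mean(sample))


if __name__ == "__main__":
    data = np.random.default_rng(1).uniform(0.0, 1.0, size=10_000)
    # Mean of n values in [0, 1] has sensitivity 1/n (changing one record).
    release = private_approximate_release(
        approx_mean, data, sensitivity=1.0 / len(data), alpha=0.01, epsilon=0.5
    )
    print(f"private approximate mean: {release:.4f}")
```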
More Like this
  1. In collaborative learning, multiple parties contribute their datasets to jointly deduce global machine learning models for numerous predictive tasks. Despite its efficacy, this learning paradigm fails to encompass critical application domains that involve highly sensitive data, such as healthcare and security analytics, where privacy risks limit entities to training models individually, using only their own datasets. In this work, we target privacy-preserving collaborative hierarchical clustering. We introduce a formal security definition that aims to strike a balance between utility and privacy, and present a two-party protocol that provably satisfies it. We then extend our protocol with (i) an optimized version for single-linkage clustering, and (ii) scalable approximation variants. We implement all our schemes and experimentally evaluate their performance and accuracy on synthetic and real datasets, obtaining very encouraging results. For example, end-to-end execution of our secure approximate protocol on over 1M 10-dimensional data samples requires 35 sec of computation and achieves 97.09% accuracy.
  2. The celebrated model of auctions with interdependent valuations, introduced by Milgrom and Weber in 1982, has been studied almost exclusively under private signals $s_1, \ldots, s_n$ of the $n$ bidders and public valuation functions $v_i(s_1, \ldots, s_n)$. Recent work in TCS has shown that this setting admits a constant approximation to the optimal social welfare if the valuations satisfy a natural property called submodularity over signals (SOS). More recently, Eden et al. (2022) extended the analysis of interdependent valuations to settings with private signals and private valuations, and established an $O(\log^2 n)$-approximation for SOS valuations. In this paper we show that this setting admits a constant-factor approximation, settling the open question raised by Eden et al. (2022).