- Free, publicly-accessible full text available July 1, 2026.
- Oh, A; Naumann, T; Globerson, A; Saenko, K; Hardt, M; Levine, S (Ed.) Given a set of points of interest, a volumetric spanner is a subset of the points in terms of which all the points can be expressed using “small” coefficients (measured in an appropriate norm). This notion, which has also been referred to as a well-conditioned basis, has found several applications, including bandit linear optimization, determinant maximization, and matrix low-rank approximation. In this paper, we give almost optimal bounds on the size of volumetric spanners for all L_p norms, and show that they can be constructed using a simple local search procedure. We then show applications of our result to other tasks, in particular the problem of finding coresets for the Minimum Volume Enclosing Ellipsoid (MVEE) problem. (A small verification sketch of the spanner condition follows this list.)
- Similarity search is a fundamental algorithmic primitive, widely used in many computer science disciplines. Given a set of points S and a radius parameter r > 0, the r-near neighbor (r-NN) problem asks for a data structure that, given any query point q, returns a point p within distance at most r from q. In this paper, we study the r-NN problem in the light of individual fairness and providing equal opportunities: all points that are within distance r from the query should have the same probability of being returned. The problem is of special interest in high dimensions, where Locality Sensitive Hashing (LSH), the theoretically leading approach to similarity search, does not provide any fairness guarantee. In this work, we show that LSH-based algorithms can be made fair without a significant loss in efficiency. We propose several efficient data structures for the exact and approximate variants of the fair NN problem. Our approach works more generally for sampling uniformly from a sub-collection of sets of a given collection and can be used in a few other applications (a sketch of this sampling primitive also follows this list). We also carried out an experimental evaluation that highlights the inherent unfairness of existing NN data structures.
- Buchin, Kevin; Colin de Verdiere, Eric (Ed.) In this paper, we prove a two-sided variant of the Kirszbraun theorem. Consider an arbitrary subset X of Euclidean space and its superset Y. Let f be a 1-Lipschitz map from X to ℝ^m. The Kirszbraun theorem states that the map f can be extended to a 1-Lipschitz map f̃ from Y to ℝ^m. While the extension f̃ does not increase distances between points, there is no guarantee that it does not decrease distances significantly. In fact, f̃ may even map distinct points to the same point (that is, it can infinitely decrease some distances). However, we prove that there exists a (1 + ε)-Lipschitz outer extension f̃: Y → ℝ^{m'} that does not decrease distances more than “necessary”. Namely, ‖f̃(x) - f̃(y)‖ ≥ c√ε · min(‖x-y‖, inf_{a,b ∈ X} (‖x - a‖ + ‖f(a) - f(b)‖ + ‖b - y‖)) for some absolute constant c > 0. This bound is asymptotically optimal, since no L-Lipschitz extension g can have ‖g(x) - g(y)‖ > L · min(‖x-y‖, inf_{a,b ∈ X} (‖x - a‖ + ‖f(a) - f(b)‖ + ‖b - y‖)) even for a single pair of points x and y. In some applications, one is interested in the distances ‖f̃(x) - f̃(y)‖ between images of points x, y ∈ Y rather than in the map f̃ itself. The standard Kirszbraun theorem does not provide any method of computing these distances without computing the entire map f̃ first. In contrast, our theorem provides a simple approximate formula for the distances ‖f̃(x) - f̃(y)‖ (a small numerical sketch of that formula follows this list).
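For the volumetric-spanner abstract above, the following is a minimal sketch, assuming the common ℓ_2 formulation (every point expressible over the spanner with a coefficient vector of norm at most 1), of how one might verify the spanner condition for a candidate subset. It is not the paper's local-search construction; the function name, the `bound` parameter, and the use of the pseudoinverse to obtain minimum-norm coefficients are illustrative choices.

```python
import numpy as np

def is_volumetric_spanner(X, spanner_idx, bound=1.0, tol=1e-8):
    """Check whether X[spanner_idx] is an ell_2 volumetric spanner of X, i.e.
    every point of X is a combination of the subset with coefficient norm <= bound."""
    A = X[spanner_idx].T                 # d x |S| matrix, spanner points as columns
    A_pinv = np.linalg.pinv(A)           # yields the minimum ell_2-norm solution of A a = x
    for x in X:
        alpha = A_pinv @ x
        if np.linalg.norm(A @ alpha - x) > tol:   # x is not even in the span of the subset
            return False
        if np.linalg.norm(alpha) > bound + tol:   # coefficients are too large
            return False
    return True

# Trivial sanity check: a set is always a spanner of itself with bound 1,
# since each point can be written with a single unit coefficient.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
assert is_volumetric_spanner(X, np.arange(20), bound=1.0)
```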
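The fair near-neighbor abstract above reduces, in part, to sampling uniformly from a sub-collection of sets. The sketch below shows the generic rejection-sampling idea for drawing a uniform element from the union of a collection of sets; in an LSH-based structure the sets would play the role of the hash buckets relevant to a query. This is an illustration of that generic primitive under stated assumptions, not the paper's data structure, and the function name is hypothetical.

```python
import random

def sample_uniform_from_union(sets):
    """Return an element distributed uniformly over the union of `sets`
    (a list of Python sets whose union is assumed non-empty)."""
    sizes = [len(s) for s in sets]
    # degree of an element = number of sets containing it
    degree = {}
    for s in sets:
        for x in s:
            degree[x] = degree.get(x, 0) + 1
    members = [list(s) for s in sets]
    while True:
        # Pick a set proportionally to its size, then a uniform element of it;
        # an element that lies in k sets is proposed with probability k / total_size.
        i = random.choices(range(len(sets)), weights=sizes)[0]
        x = random.choice(members[i])
        # Accepting with probability 1/degree(x) cancels that bias,
        # so every element of the union is returned with equal probability.
        if random.random() < 1.0 / degree[x]:
            return x
```

The expected number of rejection rounds is the total size of the sets divided by the size of their union.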
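Finally, the Kirszbraun abstract above quotes an explicit expression that approximates the image distances ‖f̃(x) - f̃(y)‖. The sketch below simply evaluates that expression when X is a finite set and f is given by its values on X; the function name and the finite-X restriction are illustrative assumptions, not part of the paper's statement.

```python
import numpy as np

def outer_extension_distance(x, y, X, fX):
    """Evaluate min(||x - y||, min_{a,b in X} ||x - a|| + ||f(a) - f(b)|| + ||b - y||).

    x, y : query points in R^d (1-D arrays)
    X    : (n, d) array of points where f is known
    fX   : (n, m) array with fX[i] = f(X[i])
    """
    direct = np.linalg.norm(x - y)
    dx = np.linalg.norm(X - x, axis=1)                              # ||x - a|| for every a in X
    dy = np.linalg.norm(X - y, axis=1)                              # ||b - y|| for every b in X
    df = np.linalg.norm(fX[:, None, :] - fX[None, :, :], axis=2)    # ||f(a) - f(b)|| for all pairs
    detour = np.min(dx[:, None] + df + dy[None, :])                 # best detour through X
    return min(direct, detour)
```

Per the bound quoted in the abstract, ‖f̃(x) - f̃(y)‖ is at least c√ε times the returned quantity.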