Title: Stochastic Minimum Vertex Cover in General Graphs: A 3/2-Approximation
We study the stochastic vertex cover problem. In this problem, G = (V, E) is an arbitrary known graph, and G⋆ is an unknown random subgraph of G in which each edge e is realized independently with probability p. Edges of G⋆ can only be verified using edge queries. The goal is to find a minimum vertex cover of G⋆ using a small number of queries. Our main result is an algorithm that returns a vertex cover of G⋆ of size at most (3/2 + ε) times the expected size of the minimum vertex cover, using only O(n/(εp)) non-adaptive queries. This improves over the best-known 2-approximation algorithm of Behnezhad, Blum, and Derakhshan [SODA '22], who also show that Ω(n/p) queries are necessary to achieve any constant approximation. Our guarantees also extend to instances where edge realizations are not fully independent. We complement this upper bound with a tight 3/2-approximation lower bound for stochastic graphs whose edge realizations exhibit mild correlations.
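The sketch below (Python, for illustration only) simulates the query model from the abstract: a realization G⋆ is drawn, a non-adaptive set of edges is queried, and a cover is built that remains valid no matter how the unqueried edges turn out. It uses the classic maximal-matching 2-approximation as the covering subroutine; it is not the paper's (3/2 + ε)-approximation algorithm, and the function names are illustrative.

```python
import random

def realize(edges, p, rng):
    """Simulate G*: each edge of G is realized independently with probability p."""
    return {e for e in edges if rng.random() < p}

def greedy_vertex_cover(edges):
    """Classic 2-approximation: take both endpoints of a greedily built maximal matching."""
    cover, matched = set(), set()
    for (u, v) in edges:
        if u not in matched and v not in matched:
            matched.update((u, v))
            cover.update((u, v))
    return cover

def nonadaptive_baseline(edges, realized, queried):
    """Return a cover valid for G* regardless of unqueried realizations:
    cover every queried-and-realized edge and every unqueried edge."""
    must_cover = [e for e in queried if e in realized] + [e for e in edges if e not in queried]
    return greedy_vertex_cover(must_cover)

if __name__ == "__main__":
    rng = random.Random(0)
    n, p = 20, 0.5
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < 0.2]
    realized = realize(edges, p, rng)
    # Non-adaptive budget of roughly n/p edges, chosen before seeing any realization.
    queried = set(rng.sample(edges, min(len(edges), int(n / p))))
    cover = nonadaptive_baseline(edges, realized, queried)
    assert all(u in cover or v in cover for (u, v) in realized)
    print(len(cover), "vertices cover the realized subgraph")
```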
Award ID(s):
2145898
PAR ID:
10482192
Publisher / Repository:
ACM
ISBN:
9781450399135
Page Range / eLocation ID:
242 to 253
Format(s):
Medium: X
Location:
Orlando FL USA
Sponsoring Org:
National Science Foundation
More Like this
  1. Meka, Raghu (Ed.)
    We study the stochastic minimum vertex cover problem for general graphs. In this problem, we are given a graph G = (V, E) and an existence probability p_e for each edge e ∈ E. Edges of G are realized (or exist) independently with these probabilities, forming the realized subgraph H. The existence of an edge in H can only be verified using edge queries. The goal of this problem is to find a near-optimal vertex cover of H using a small number of queries. Previous work by Derakhshan, Durvasula, and Haghtalab [STOC 2023] established the existence of (1.5 + ε)-approximation algorithms for this problem with O(n/ε) queries. They also show that, under mild correlation among edge realizations, beating this approximation ratio requires querying a subgraph of size Ω(n ⋅ RS(n)). Here, RS(n) refers to Ruzsa–Szemerédi graphs and denotes the largest number of induced edge-disjoint matchings of size Θ(n) in an n-vertex graph. In this work, we design a simple algorithm for finding a (1 + ε)-approximate vertex cover by querying a subgraph of size O(n ⋅ RS(n)) for any absolute constant ε > 0. Our algorithm can tolerate up to O(n ⋅ RS(n)) correlated edges, hence effectively completing our understanding of the problem under mild correlation.
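A minimal sketch of the per-edge probability model used in the abstract above: each edge e exists independently with its own probability p_e, and a candidate vertex cover can only be checked against the realized subgraph H. The names realize_with_probs and is_vertex_cover are illustrative, not from the paper.

```python
import random

def realize_with_probs(prob, rng):
    """Draw the realized subgraph H: edge e exists independently with probability p_e.
    `prob` maps each edge (u, v) to p_e; returns H as a set of edges."""
    return {e for e, p_e in prob.items() if rng.random() < p_e}

def is_vertex_cover(cover, edges):
    """Check that every given edge has at least one endpoint in `cover`."""
    return all(u in cover or v in cover for (u, v) in edges)

if __name__ == "__main__":
    rng = random.Random(1)
    prob = {(0, 1): 0.9, (1, 2): 0.5, (2, 3): 0.1, (0, 3): 0.7}
    H = realize_with_probs(prob, rng)
    print(H, is_vertex_cover({1, 3}, H))  # {1, 3} covers every edge of this 4-cycle
```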
  2. We study the fully dynamic All-Pairs Shortest Paths (APSP) problem in undirected edge-weighted graphs. Given an n-vertex graph G with non-negative edge lengths that undergoes an online sequence of edge insertions and deletions, the goal is to support approximate distance queries and shortest-path queries. We provide a deterministic algorithm for this problem that, for a given precision parameter ε, achieves approximation factor (log log n)^{2^{O(1/ε^3)}}, and has amortized update time O(n^ε log L) per operation, where L is the ratio of the longest to the shortest edge length. Query time for a distance query is O(2^{O(1/ε)} · log n · log log L), and query time for a shortest-path query is O(|E(P)| + 2^{O(1/ε)} · log n · log log L), where P is the path that the algorithm returns. To the best of our knowledge, even allowing any o(n)-approximation factor, no adaptive-update algorithms with better than Θ(m) amortized update time and better than Θ(n) query time were known prior to this work. We also note that our guarantees are stronger than the best current guarantees for APSP in decremental graphs in the adaptive-adversary setting. In order to obtain these results, we consider an intermediate problem, called Recursive Dynamic Neighborhood Cover (RecDynNC), that was formally introduced in [Chuzhoy, STOC '21]. At a high level, given an undirected edge-weighted graph G undergoing an online sequence of edge deletions, together with a distance parameter D, the goal is to maintain a sparse D-neighborhood cover of G, with some additional technical requirements. Our main technical contribution is twofold. First, we provide a black-box reduction from APSP in fully dynamic graphs to the RecDynNC problem. Second, we provide a new deterministic algorithm for the RecDynNC problem that, for a given precision parameter ε, achieves approximation factor (log log m)^{2^{O(1/ε^2)}}, with total update time O(m^{1+ε}), where m is the total number of edges ever present in G. This improves over the previous algorithm of [Chuzhoy, STOC '21], which achieved approximation factor (log m)^{2^{O(1/ε)}} with similar total update time. Combining these two results immediately leads to the deterministic algorithm for fully dynamic APSP with the guarantees stated above.
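For contrast with the amortized bounds described above, here is a hedged baseline sketch of the fully dynamic APSP interface: edge insertions and deletions in O(1), with every distance query answered by a fresh Dijkstra run in O(m log n) time. The class and method names are illustrative; the paper's data structure is far more involved.

```python
import heapq

class NaiveDynamicAPSP:
    """Baseline: store the weighted graph and answer each query with a fresh Dijkstra run.
    Updates are O(1), but each query costs O(m log n); the paper improves this trade-off."""

    def __init__(self):
        self.adj = {}  # vertex -> {neighbor: weight}

    def insert(self, u, v, w):
        self.adj.setdefault(u, {})[v] = w
        self.adj.setdefault(v, {})[u] = w

    def delete(self, u, v):
        self.adj.get(u, {}).pop(v, None)
        self.adj.get(v, {}).pop(u, None)

    def distance(self, s, t):
        # Standard Dijkstra with lazy deletion of stale heap entries.
        dist = {s: 0.0}
        heap = [(0.0, s)]
        while heap:
            d, x = heapq.heappop(heap)
            if x == t:
                return d
            if d > dist.get(x, float("inf")):
                continue
            for y, w in self.adj.get(x, {}).items():
                nd = d + w
                if nd < dist.get(y, float("inf")):
                    dist[y] = nd
                    heapq.heappush(heap, (nd, y))
        return float("inf")

if __name__ == "__main__":
    g = NaiveDynamicAPSP()
    g.insert("a", "b", 2.0); g.insert("b", "c", 3.0); g.insert("a", "c", 10.0)
    print(g.distance("a", "c"))   # 5.0, via b
    g.delete("b", "c")
    print(g.distance("a", "c"))   # 10.0 after the deletion
```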
  3. We consider the classical Minimum Crossing Number problem: given an n-vertex graph G, compute a drawing of G in the plane, while minimizing the number of crossings between the images of its edges. This is a fundamental and extensively studied problem, whose approximability status is widely open. In all currently known approximation algorithms, the approximation factor depends polynomially on Δ, the maximum vertex degree in G. The best current approximation algorithm achieves an O(n^{1/2−ε} · poly(Δ · log n))-approximation, for a small fixed constant ε, while the best negative result is APX-hardness, leaving a large gap in our understanding of this basic problem. In this paper we design a randomized O(2^{O((log n)^{7/8} log log n)} · poly(Δ))-approximation algorithm for Minimum Crossing Number. This is the first approximation algorithm for the problem that achieves a subpolynomial in n approximation factor (albeit only in graphs whose maximum vertex degree is subpolynomial in n). In order to achieve this approximation factor, we design a new algorithm for a closely related problem called Crossing Number with Rotation System, in which, for every vertex v ∈ V(G), the circular ordering in which the images of the edges incident to v must enter the image of v in the drawing is fixed as part of the input. Combining this result with the recent reduction of [Chuzhoy, Mahabadi, Tan '20] immediately yields the improved approximation algorithm for Minimum Crossing Number.
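The sketch below evaluates the objective discussed above: given a straight-line drawing (a position for every vertex), it counts the pairs of edges whose images cross. It only measures a drawing's cost and does not implement the approximation algorithm; the function names are illustrative.

```python
from itertools import combinations

def _orient(a, b, c):
    """Sign of the cross product (b - a) x (c - a): >0 left turn, <0 right turn, 0 collinear."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, p3, p4):
    """True iff segments p1p2 and p3p4 cross at an interior point (general position assumed)."""
    d1, d2 = _orient(p3, p4, p1), _orient(p3, p4, p2)
    d3, d4 = _orient(p1, p2, p3), _orient(p1, p2, p4)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def count_crossings(pos, edges):
    """Number of crossing pairs among straight-line edges; edges sharing a vertex never count."""
    crossings = 0
    for (a, b), (c, d) in combinations(edges, 2):
        if {a, b} & {c, d}:
            continue  # edges incident to a common vertex do not cross in a proper drawing
        if segments_cross(pos[a], pos[b], pos[c], pos[d]):
            crossings += 1
    return crossings

if __name__ == "__main__":
    # K4 drawn on a square with both diagonals has exactly 1 crossing; K4 is planar,
    # so a better drawing (one vertex inside the triangle) achieves cr(K4) = 0.
    square = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1)}
    k4_edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2), (1, 3)]
    print(count_crossings(square, k4_edges))  # 1 crossing: the two diagonals
```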
  4.
    The problem of sparsifying a graph or a hypergraph while approximately preserving its cut structure has been extensively studied and has many applications. In a seminal work, Benczúr and Karger (1996) showed that given any n-vertex undirected weighted graph G and a parameter ε ∈ (0,1), there is a near-linear time algorithm that outputs a weighted subgraph G' of G of size Õ(n/ε²) such that the weight of every cut in G is preserved to within a (1 ± ε)-factor in G'. The graph G' is referred to as a (1 ± ε)-approximate cut sparsifier of G. Subsequent recent work has obtained a similar result for the more general problem of hypergraph cut sparsifiers. However, all known sparsification algorithms require Ω(n + m) time, where n denotes the number of vertices and m denotes the number of hyperedges in the hypergraph. Since m can be exponentially large in n, a natural question is whether it is possible to create a hypergraph cut sparsifier in time polynomial in n, independent of the number of hyperedges. We resolve this question in the affirmative, giving the first sublinear time algorithm for this problem, given appropriate query access to the hypergraph. Specifically, we design an algorithm that constructs a (1 ± ε)-approximate cut sparsifier of a hypergraph H(V,E) in polynomial time in n, independent of the number of hyperedges, when given access to the hypergraph using the following two queries: 1) given any cut (S, S̄), return the size |δ_E(S)| (cut value queries); and 2) given any cut (S, S̄), return a uniformly random edge crossing the cut (cut edge sample queries). Our algorithm outputs a sparsifier with Õ(n/ε²) edges, which is essentially optimal. We then extend our results to show that cut value and cut edge sample queries can also be used to construct hypergraph spectral sparsifiers in poly(n) time, independent of the number of hyperedges. We complement the algorithmic results above by showing that any algorithm that has access to only one of the above two types of queries cannot give a hypergraph cut sparsifier in time that is polynomial in n. Finally, we show that our algorithmic results also hold if we replace the cut edge sample queries with a pair neighbor sample query that, for any pair of vertices, returns a random edge incident on them. In contrast, we show that having access only to cut value queries and queries that return a random edge incident on a given single vertex is not sufficient.
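Below is a hedged sketch of the two oracles the abstract above relies on, implemented naively over an explicit hypergraph: a cut value query returning |δ_E(S)| and a cut edge sample query returning a uniformly random hyperedge crossing (S, S̄). In the paper's setting the hypergraph is accessed only through such queries; the explicit edge list and the class name HypergraphOracle here are purely for illustration.

```python
import random

class HypergraphOracle:
    """Naive oracles over an explicit hypergraph, mimicking the query access in the abstract:
    cut_value(S) returns |delta_E(S)|, and cut_edge_sample(S) returns a uniformly random
    hyperedge crossing the cut (S, V \\ S)."""

    def __init__(self, vertices, hyperedges, rng=None):
        self.vertices = set(vertices)
        self.hyperedges = [frozenset(e) for e in hyperedges]
        self.rng = rng or random.Random()

    def _crossing(self, S):
        S = set(S)
        # A hyperedge crosses the cut iff it has at least one vertex on each side.
        return [e for e in self.hyperedges if e & S and e - S]

    def cut_value(self, S):
        return len(self._crossing(S))

    def cut_edge_sample(self, S):
        crossing = self._crossing(S)
        return self.rng.choice(crossing) if crossing else None

if __name__ == "__main__":
    oracle = HypergraphOracle(
        vertices=range(5),
        hyperedges=[{0, 1, 2}, {2, 3}, {3, 4}, {0, 4}],
        rng=random.Random(2),
    )
    S = {0, 1}
    print(oracle.cut_value(S))        # hyperedges {0,1,2} and {0,4} cross the cut -> 2
    print(oracle.cut_edge_sample(S))  # one of those two, chosen uniformly at random
```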
  5. We give an algorithm to find a minimum cut in an edge-weighted directed graph with n vertices and m edges in Õ(n · max{m^{2/3}, n}) time. This improves on the 30-year-old bound of Õ(nm) obtained by Hao and Orlin for this problem. Using similar techniques, we also obtain Õ(n^2/ε^2)-time (1+ε)-approximation algorithms for both the minimum edge and minimum vertex cuts in directed graphs, for any fixed ε. Before our work, no (1+ε)-approximation algorithm better than the exact runtime of Õ(nm) was known for either problem. Our algorithms follow a two-step template. In the first step, we employ a partial sparsification of the input graph to preserve a critical subset of cut values approximately. In the second step, we design algorithms to find the (edge/vertex) mincut among the preserved cuts from the first step. For edge mincut, we give a new reduction to Õ(min{n/m^{1/3}, √n}) calls of any maxflow subroutine, via packing arborescences in the sparsifier. For vertex mincut, we develop new local flow algorithms to identify small unbalanced cuts in the sparsified graph.
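As a point of reference for the "calls to a maxflow subroutine" framing above, the sketch below finds a global edge min cut in a directed graph with 2(n − 1) max-flow computations (the minimum over t ≠ s of the s→t and t→s cut values), using a simple Edmonds–Karp routine. It is a textbook baseline, not the paper's partial-sparsification algorithm; the function names are illustrative.

```python
from collections import defaultdict, deque

def max_flow(edges, s, t):
    """Edmonds-Karp max flow; `edges` is a list of (u, v, capacity) triples."""
    cap = defaultdict(int)
    adj = defaultdict(set)
    for u, v, c in edges:
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)  # residual arc
    flow = 0
    while True:
        # BFS for an augmenting path with positive residual capacity.
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            x = queue.popleft()
            for y in adj[x]:
                if y not in parent and cap[(x, y)] > 0:
                    parent[y] = x
                    queue.append(y)
        if t not in parent:
            return flow
        # Bottleneck capacity along the path, then push that much flow.
        bottleneck, y = float("inf"), t
        while parent[y] is not None:
            bottleneck = min(bottleneck, cap[(parent[y], y)])
            y = parent[y]
        y = t
        while parent[y] is not None:
            cap[(parent[y], y)] -= bottleneck
            cap[(y, parent[y])] += bottleneck
            y = parent[y]
        flow += bottleneck

def global_edge_mincut(n, edges):
    """Global edge min cut of a directed graph on vertices 0..n-1: fix s = 0 and take
    the minimum of the s->t and t->s cut values over all t != s (2(n-1) max-flow calls)."""
    s, best = 0, float("inf")
    for t in range(1, n):
        best = min(best, max_flow(edges, s, t), max_flow(edges, t, s))
    return best

if __name__ == "__main__":
    # Directed cycle 0->1->2->3->0 with an extra arc 0->2; the global edge min cut is 1.
    arcs = [(0, 1, 1), (1, 2, 1), (2, 3, 1), (3, 0, 1), (0, 2, 1)]
    print(global_edge_mincut(4, arcs))  # prints 1
```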