Current flow closeness centrality (CFCC) has better discriminating ability than the ordinary closeness centrality based on shortest paths. In this paper, we extend this notion to a group of vertices in a weighted graph, and then study the problem of finding a subset S of k vertices to maximize its CFCC C(S), both theoretically and experimentally. We show that the problem is NP-hard, but propose two greedy algorithms for minimizing the reciprocal of C(S) with provable guarantees, using its monotonicity and supermodularity. The first is a deterministic algorithm with an approximation factor (1 − (k/(k−1)) · (1/e)) and cubic running time, while the second is a randomized algorithm with a (1 − (k/(k−1)) · (1/e) − ε)-approximation and nearly-linear running time for any ε > 0. Extensive experiments on model and real networks demonstrate that our algorithms are effective and efficient, with the second algorithm being scalable to massive networks with more than a million vertices.
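The deterministic greedy scheme described above can be sketched as a plain selection loop. This is a minimal illustration only: group_cfcc(G, S) is a hypothetical helper returning C(S), which the paper derives from electrical (Laplacian-based) quantities that are not reproduced here.

```python
# Minimal sketch of the deterministic greedy loop, assuming a hypothetical
# helper group_cfcc(G, S) that returns the current flow closeness
# centrality C(S) of vertex set S. Maximizing C(S) at each step is
# equivalent to minimizing its reciprocal, the supermodular objective.

def greedy_group_cfcc(G, k, group_cfcc):
    """Greedily pick k vertices of G (any container of vertices)."""
    S = set()
    vertices = set(G)
    for _ in range(k):
        # Add the vertex whose inclusion yields the largest C(S).
        best = max(vertices - S, key=lambda v: group_cfcc(G, S | {v}))
        S.add(best)
    return S
```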
Sampling and Reconstruction of Graph Signals via Weak Submodularity and Semidefinite Relaxation
We study the problem of sampling a bandlimited graph signal in the presence of noise, where the objective is to select a node subset of prescribed cardinality that minimizes the signal reconstruction mean squared error (MSE). To that end, we formulate the task at hand as the minimization of MSE subject to binary constraints, and approximate the resulting NP-hard problem via semidefinite programming (SDP) relaxation. Moreover, we provide an alternative formulation based on maximizing a monotone weak submodular function and propose a randomized greedy algorithm to find a sub-optimal subset. We then derive a worst-case performance guarantee on the MSE returned by the randomized greedy algorithm for general non-stationary graph signals. The efficacy of the proposed methods is illustrated through numerical simulations on synthetic and real-world graphs. Notably, the randomized greedy algorithm yields an order-of-magnitude speedup over state-of-the-art greedy sampling schemes, while incurring only a marginal MSE performance loss.
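A minimal sketch of the randomized-greedy selection pattern follows. The objective here is a generic placeholder standing in for the paper's weak submodular surrogate of the MSE reduction, and the per-step sample size follows the standard stochastic-greedy choice of (n/k)·log(1/ε), an assumption rather than a detail taken from the abstract.

```python
import math
import random

# Hedged sketch of randomized-greedy subset selection: at each of the k
# steps, evaluate the objective only on a small random pool of remaining
# candidates instead of all of them.

def randomized_greedy(candidates, k, objective, eps=0.1):
    """objective: placeholder monotone set function to maximize."""
    S = set()
    remaining = set(candidates)
    n = len(remaining)
    # Standard stochastic-greedy pool size per step: (n/k) * log(1/eps).
    r = max(1, int(math.ceil((n / k) * math.log(1.0 / eps))))
    for _ in range(k):
        pool = random.sample(sorted(remaining), min(r, len(remaining)))
        best = max(pool, key=lambda v: objective(S | {v}))
        S.add(best)
        remaining.remove(best)
    return S
```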
- Award ID(s): 1750428
- PAR ID: 10087938
- Date Published:
- Journal Name: 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- Page Range / eLocation ID: 4179 to 4183
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
We describe a 3/2-approximation algorithm, LSE, for computing a b-edge cover of minimum weight in a graph with weights on the edges. The b-edge cover problem is a generalization of the better-known Edge Cover problem in graphs, where the objective is to choose a subset C of edges in the graph such that at least a specified number b(v) of edges in C are incident on each vertex v. In the weighted b-edge cover problem, we minimize the sum of the weights of the edges in C. We prove that the Locally Subdominant Edge (LSE) algorithm computes the same b-edge cover as the one obtained by the Greedy algorithm for the problem. However, the Greedy algorithm requires edges to be sorted by their effective weights, and these weights need to be updated after each iteration. These requirements make the Greedy algorithm sequential and impractical for massive graphs. The LSE algorithm avoids the sorting step and is amenable to parallelization. We implement the algorithm on a serial machine and compare its performance against a collection of approximation algorithms for the b-edge cover problem. Our results show that the algorithm is 3 to 5 times faster than the Greedy algorithm on a serial processor. The approximate edge covers obtained by the LSE algorithm have weights greater by at most 17% of the optimal weight for problems where we could compute the latter. We also investigate the relationship between the b-edge cover and the b-matching problems, show that the latter has a faster implementation since edge weights are static in this algorithm, and obtain a heuristic solution for the former from the latter.
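For concreteness, here is a hedged sketch of the Greedy baseline that LSE is compared against: repeatedly take the edge of smallest effective weight, defined as w(e) divided by the number of endpoints still needing coverage, refreshing stale effective weights lazily via a heap. The paper's tie-breaking and data structures are not reproduced.

```python
import heapq

# Hedged sketch of the Greedy baseline for weighted b-edge cover.

def greedy_b_edge_cover(edges, b):
    """edges: iterable of (u, v, w) with w > 0; b: dict vertex -> demand."""
    need = dict(b)                      # remaining coverage demand
    total = sum(need.values())

    def deficiency(u, v):
        return (need[u] > 0) + (need[v] > 0)

    heap = []
    for (u, v, w) in edges:
        d = deficiency(u, v)
        if d:
            heapq.heappush(heap, (w / d, d, u, v, w))

    cover = []
    while heap and total > 0:
        eff, d, u, v, w = heapq.heappop(heap)
        cur = deficiency(u, v)
        if cur == 0:
            continue                    # edge no longer covers anyone
        if cur != d:                    # stale entry: refresh effective weight
            heapq.heappush(heap, (w / cur, cur, u, v, w))
            continue
        cover.append((u, v))
        for x in (u, v):
            if need[x] > 0:
                need[x] -= 1
                total -= 1
    return cover
```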
In this work, we present a novel method for constructing a topological map of biological hotspots in an aquatic environment using a Fast Marching-based Voronoi segmentation. Using this topological map, we develop a closed-form solution to the scheduling problem for any single path through the graph. Searching over the space of all paths allows us to compute a maximally informative path that traverses a subset of the hotspots, given some budget. Using a greedy-coverage algorithm, we can then compute an informative path. We evaluate our method in a set of simulated trials, both with randomly generated environments and a real-world environment. In these trials, we show that our method produces a topological graph which more accurately captures features in the environment than standard thresholding techniques. Additionally, we show that our method can improve the performance of a greedy-coverage algorithm in the informative path planning problem by guiding it to different informative areas to help it escape from local maxima.
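As an illustration of a budgeted greedy-coverage step, the sketch below repeatedly visits the hotspot with the best marginal information gain per unit of travel cost that still fits the remaining budget. The info and cost callables are hypothetical stand-ins for the paper's information map and topological-graph path costs.

```python
# Hedged sketch of a budgeted greedy-coverage loop. `info(visited)` and
# `cost(a, b)` are hypothetical stand-ins; costs are assumed positive.

def greedy_informative_path(start, hotspots, info, cost, budget):
    path, visited, here = [start], {start}, start
    while True:
        feasible = [h for h in hotspots
                    if h not in visited and cost(here, h) <= budget]
        if not feasible:
            return path
        # Pick the hotspot with the best marginal information per unit cost.
        nxt = max(feasible,
                  key=lambda h: (info(visited | {h}) - info(visited))
                                / cost(here, h))
        budget -= cost(here, nxt)
        path.append(nxt)
        visited.add(nxt)
        here = nxt
```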
We study the problem of sampling and reconstruction of bandlimited graph signals, where the objective is to select a node subset of prescribed cardinality that ensures interpolation of the original signal with the lowest reconstruction error. We propose an efficient iterative selection sampling approach and show that in the noiseless case the original signal is exactly recovered from the set of selected nodes. In the case of noisy measurements, a bound on the reconstruction error of the proposed algorithm is established. We further address support identification of a bandlimited signal with unknown support and show that, under a pragmatic sufficient condition, the proposed framework requires a minimal number of samples to perfectly identify the support. The efficacy of the proposed methods is illustrated through numerical simulations on synthetic and real-world graphs.
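The reconstruction step for a K-bandlimited signal can be sketched as a least-squares fit in the graph spectral domain. This is a generic illustration, not the paper's iterative selection rule: it assumes the signal lies in the span of the K Laplacian eigenvectors with smallest eigenvalues, and that the basis restricted to the sampled nodes has full column rank.

```python
import numpy as np

# Hedged sketch of least-squares reconstruction of a K-bandlimited graph
# signal from samples taken on a node subset S.

def reconstruct_bandlimited(L, samples, S, K):
    """L: graph Laplacian (n x n); samples: signal values on nodes S."""
    # Frequency basis: eigenvectors of L for the K smallest eigenvalues.
    _, U = np.linalg.eigh(L)
    U_K = U[:, :K]
    # Least-squares fit of the K spectral coefficients from the samples.
    coeffs, *_ = np.linalg.lstsq(U_K[S, :], samples, rcond=None)
    return U_K @ coeffs          # reconstructed signal on all nodes
```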
Motivated by the study of matrix elimination orderings in combinatorial scientific computing, we utilize graph sketching and local sampling to give a data structure that provides access to approximate fill degrees of a matrix undergoing elimination in O(polylog(n)) time per elimination and query. We then study the problem of using this data structure in the minimum degree algorithm, which is a widely used heuristic for producing elimination orderings for sparse matrices by repeatedly eliminating the vertex with (approximate) minimum fill degree. This leads to a nearly-linear time algorithm for generating approximate greedy minimum degree orderings. Despite extensive studies of algorithms for elimination orderings in combinatorial scientific computing, our result is the first rigorous incorporation of randomized tools in this setting, as well as the first nearly-linear time algorithm for producing elimination orderings with provable approximation guarantees. While our sketching data structure readily works in the oblivious adversary model, by repeatedly querying and greedily updating itself, it enters the adaptive adversarial model, where the underlying sketches become prone to failure due to dependency issues with their internal randomness. We show how to use an additional sampling procedure to circumvent this problem and to create an independent access sequence. Our technique for decorrelating the interleaved queries and updates to this randomized data structure may be of independent interest.
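For reference, the classical minimum degree heuristic that the paper accelerates can be sketched in a few lines. This exact version costs far more than O(polylog(n)) per elimination; none of the sketching or decorrelation machinery is reproduced.

```python
# Hedged sketch of the exact minimum degree heuristic: repeatedly eliminate
# a vertex of minimum degree and add fill edges that turn its remaining
# neighborhood into a clique.

def min_degree_ordering(adj):
    """adj: dict vertex -> set of neighbors; consumed destructively."""
    order = []
    while adj:
        # Vertex of minimum degree in the partially eliminated graph.
        v = min(adj, key=lambda u: len(adj[u]))
        nbrs = adj.pop(v)
        # Fill-in: neighbors of v become pairwise adjacent after elimination.
        for u in nbrs:
            adj[u] |= nbrs - {u}
            adj[u].discard(v)
        order.append(v)
    return order
```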