Sparse Recovery over Graph Incidence Matrices
Classical results in sparse recovery guarantee the exact reconstruction of s-sparse signals under assumptions on the dictionary that are either too strong or NP-hard to check. Moreover, such results may be pessimistic in practice since they are based on a worst-case analysis. In this paper, we consider the sparse recovery of signals defined over a graph, for which the dictionary takes the form of an incidence matrix. We derive necessary and sufficient conditions for sparse recovery, which depend on properties of the cycles of the graph that can be checked in polynomial time. We also derive support-dependent conditions for sparse recovery that depend only on the intersection of the cycles of the graph with the support of the signal. Finally, we exploit sparsity properties on the measurements and the structure of incidence matrices to propose a specialized subgraph-based recovery algorithm that outperforms the standard l1-minimization approach.
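As a concrete illustration of the l1-minimization baseline the abstract refers to (not the paper's specialized subgraph-based algorithm), the sketch below recovers a 1-sparse edge signal from node measurements y = Ax, where A is the incidence matrix of a small graph with a cycle. The graph, edge orientations, and signal are all invented for illustration; basis pursuit is posed as a linear program via the standard split x = u - v.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy graph: 4 nodes, 5 directed edges
# e0=(0,1), e1=(1,2), e2=(2,3), e3=(3,0), e4=(0,2).
# Incidence matrix A: +1 at the head of each edge, -1 at the tail.
A = np.array([
    [-1,  0,  0,  1, -1],
    [ 1, -1,  0,  0,  0],
    [ 0,  1, -1,  0,  1],
    [ 0,  0,  1, -1,  0],
], dtype=float)

x_true = np.array([0.0, 0.0, 2.0, 0.0, 0.0])  # 1-sparse signal on the edges
y = A @ x_true                                 # measurements at the nodes

# Basis pursuit: min ||x||_1 s.t. Ax = y, as an LP with x = u - v, u, v >= 0.
n = A.shape[1]
res = linprog(c=np.ones(2 * n),
              A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]
print(np.round(x_hat, 6))  # ≈ [0, 0, 2, 0, 0]
```

The null space of an incidence matrix is spanned by the cycle vectors of the graph, which is why, as the paper argues, recoverability hinges on how the support of x intersects those cycles: any alternative solution must redistribute mass along a cycle through the support, which here strictly increases the l1 norm.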
 Award ID(s):
 1736448
 NSFPAR ID:
 10106016
 Date Published:
 Journal Name:
 Proceedings of the IEEE Conference on Decision & Control
 Page Range / eLocation ID:
 364 to 371
 Format(s):
 Medium: X
 Sponsoring Org:
 National Science Foundation
More Like this


We develop online graph learning algorithms from streaming network data. Our goal is to track the (possibly) time-varying network topology, and effect memory and computational savings by processing the data on-the-fly as they are acquired. The setup entails observations modeled as stationary graph signals generated by local diffusion dynamics on the unknown network. Moreover, we may have a priori information on the presence or absence of a few edges as in the link prediction problem. The stationarity assumption implies that the observations' covariance matrix and the so-called graph shift operator (GSO, a matrix encoding the graph topology) commute under mild requirements. This motivates formulating the topology inference task as an inverse problem, whereby one searches for a sparse GSO that is structurally admissible and approximately commutes with the observations' empirical covariance matrix. For streaming data, said covariance can be updated recursively, and we show online proximal gradient iterations can be brought to bear to efficiently track the time-varying solution of the inverse problem with quantifiable guarantees. Specifically, we derive conditions under which the GSO recovery cost is strongly convex and use this property to prove that the online algorithm converges to within a neighborhood of the optimal time-varying batch solution. Numerical tests illustrate the effectiveness of the proposed graph learning approach in adapting to streaming information and tracking changes in the sought dynamic network.

We leverage proximal gradient iterations to develop an online graph learning algorithm from streaming network data. Our goal is to track the (possibly) time-varying network topology, and effect memory and computational savings by processing the data on-the-fly as they are acquired. The setup entails observations modeled as stationary graph signals generated by local diffusion dynamics on the unknown network. Moreover, we may have a priori information on the presence or absence of a few edges as in the link prediction problem. The stationarity assumption implies that the observations' covariance matrix and the so-called graph shift operator (GSO, a matrix encoding the graph topology) commute under mild requirements. This motivates formulating the topology inference task as an inverse problem, whereby one searches for a (e.g., sparse) GSO that is structurally admissible and approximately commutes with the observations' empirical covariance matrix. For streaming data, said covariance can be updated recursively, and we show online proximal gradient iterations can be brought to bear to efficiently track the time-varying solution of the inverse problem with quantifiable guarantees. Specifically, we derive conditions under which the GSO recovery cost is strongly convex and use this property to prove that the online algorithm converges to within a neighborhood of the optimal time-varying batch solution. Preliminary numerical tests illustrate the effectiveness of the proposed graph learning approach in adapting to streaming information and tracking changes in the sought dynamic network.
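A minimal sketch of the online update loop described in the two abstracts above, under assumptions of my own: a fixed 4-node graph, a diffusion filter that is a polynomial in the true GSO (so the signal covariance commutes with it), invented step sizes, and one a priori known edge used as a link-prediction prior to pin the scale. This illustrates the recursive covariance update and the proximal (soft-thresholding) gradient step, not the authors' exact formulation or guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth GSO (adjacency of a 4-node cycle graph).
S_true = np.array([[0, 1, 0, 1],
                   [1, 0, 1, 0],
                   [0, 1, 0, 1],
                   [1, 0, 1, 0]], dtype=float)
# Diffusion filter: a polynomial in S_true, so cov(x) commutes with S_true.
H = 0.5 * np.eye(4) + 0.3 * S_true + 0.1 * S_true @ S_true

def soft(X, tau):
    """Entrywise soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

S = np.zeros((4, 4))   # running GSO estimate
C = np.zeros((4, 4))   # recursively updated empirical covariance
step, lam = 0.02, 0.01  # invented constants for this toy instance

for t in range(1, 2001):
    x = H @ rng.standard_normal(4)           # one streaming stationary signal
    C += (np.outer(x, x) - C) / t            # recursive covariance update
    E = S @ C - C @ S                        # commutation residual SC - CS
    G = E @ C - C @ E                        # gradient of 0.5*||SC - CS||_F^2
    S = soft(S - step * G, step * lam)       # proximal gradient (l1) step
    S = 0.5 * (S + S.T)                      # structural admissibility: symmetric
    np.fill_diagonal(S, 0.0)                 # structural admissibility: hollow
    S[0, 1] = S[1, 0] = 1.0                  # a priori known edge (link prior)
```

The anchor on the known edge rules out the trivial all-zero solution that the l1 penalty would otherwise favor; in the papers this role is played by the structural admissibility constraints of the GSO set.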

Inferring graph structure from observations on the nodes is an important and popular network science task. Departing from the more common inference of a single graph, we study the problem of jointly inferring multiple graphs from the observation of signals at their nodes (graph signals), which are assumed to be stationary in the sought graphs. Graph stationarity implies that the mapping between the covariance of the signals and the sparse matrix representing the underlying graph is given by a matrix polynomial. A prominent example is that of Markov random fields, where the inverse of the covariance yields the sparse matrix of interest. From a modeling perspective, stationary graph signals can be used to model linear network processes evolving on a set of (not necessarily known) networks. Leveraging that matrix polynomials commute, a convex optimization method along with sufficient conditions that guarantee the recovery of the true graphs are provided when perfect covariance information is available. Particularly important from an empirical viewpoint, we provide high-probability bounds on the recovery error as a function of the number of signals observed and other key problem parameters. Numerical experiments demonstrate the effectiveness of the proposed method with perfect covariance information as well as its robustness in the noisy regime.
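The key structural fact this abstract leverages is easy to verify numerically: if the observations are stationary on a graph, their covariance is a matrix polynomial in the graph matrix and therefore commutes with it. A quick check on a small hypothetical graph (the graph and filter coefficients are invented for illustration):

```python
import numpy as np

# Small hypothetical graph (adjacency matrix of 4 nodes).
S = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# Diffusion filter H: a matrix polynomial in S (symmetric since S is).
H = 0.7 * np.eye(4) + 0.2 * S - 0.05 * S @ S

# Covariance of x = H * (unit-variance white noise): C = H H^T,
# itself a polynomial in S, hence it commutes with S.
C = H @ H.T
print(np.allclose(S @ C, C @ S))  # True
```

Since polynomials in the same matrix commute, each sought graph must (approximately) commute with its signals' covariance; the joint inference method searches for sparse matrices satisfying this commutation constraint simultaneously.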

We use a multidimensional signal representation that integrates diffusion Magnetic Resonance Imaging (dMRI) and tractography (brain connections) using sparse tensor decomposition. The representation encodes brain connections (fibers) into a very large, but sparse, core tensor and allows us to predict dMRI measurements based on a dictionary of diffusion signals. We propose an algorithm to learn the constituent parts of the model from a dataset. The algorithm assumes a tractography model (support of the core tensor) and iteratively minimizes the Frobenius norm of the error as a function of the dictionary atoms, the values of nonzero entries in the sparse core tensor, and the fiber weights. We use a nonparametric dictionary learning (DL) approach to estimate signal atoms. Moreover, the algorithm is able to learn multiple dictionaries associated with different brain locations (voxels), allowing for the mapping of distinctive tissue types. We illustrate the algorithm through results obtained on a large in-vivo high-resolution dataset.
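The alternating-minimization idea in this abstract can be sketched in miniature: with the sparsity pattern (the "tractography model") fixed, alternately solve least squares for the nonzero coefficients and for the dictionary atoms, so the Frobenius error never increases. This is a generic matrix-factorization stand-in, not the authors' tensor model; all dimensions and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic miniature: measurements Y (signals x voxels) ≈ dictionary D of
# diffusion atoms times a sparse coefficient matrix B with fixed support.
m, k, v = 30, 5, 20                  # measurements, atoms, voxels
D_true = rng.standard_normal((m, k))
support = rng.random((k, v)) < 0.3   # fixed sparsity pattern (assumed known)
B_true = np.where(support, rng.standard_normal((k, v)), 0.0)
Y = D_true @ B_true

D = rng.standard_normal((m, k))      # random initialization
B = np.where(support, rng.standard_normal((k, v)), 0.0)
err0 = np.linalg.norm(Y - D @ B)     # initial Frobenius error

for it in range(200):
    # Update nonzero coefficients voxel-by-voxel: least squares on the support.
    for j in range(v):
        s = support[:, j]
        if s.any():
            B[s, j] = np.linalg.lstsq(D[:, s], Y[:, j], rcond=None)[0]
    # Update dictionary atoms: D = Y B^T (B B^T)^+, the least-squares solution.
    D = Y @ B.T @ np.linalg.pinv(B @ B.T)

err = np.linalg.norm(Y - D @ B)      # final error; each substep is an exact
print(err <= err0)                   # least-squares solve, so True
```

Because each substep solves its subproblem exactly, the objective is monotonically nonincreasing, though, as with alternating least squares generally, convergence to the global minimum is not guaranteed.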