Title: Message passing on networks with loops
Message passing is a fundamental technique for performing calculations on networks and graphs with applications in physics, computer science, statistics, and machine learning, including Bayesian inference, spin models, satisfiability, graph partitioning, network epidemiology, and the calculation of matrix eigenvalues. Despite its wide use, however, it has long been recognized that the method has a fundamental flaw: It works poorly on networks that contain short loops. Loops introduce correlations that can cause the method to give inaccurate answers or to fail completely in the worst cases. Unfortunately, most real-world networks contain many short loops, which limits the usefulness of the message-passing approach. In this paper we demonstrate how to rectify this shortcoming and create message-passing methods that work on any network. We give 2 example applications, one to the percolation properties of networks and the other to the calculation of the spectra of sparse matrices.
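
For orientation, the sketch below implements the standard message-passing calculation for bond percolation on a locally tree-like network, which is the baseline formulation that the paper extends to networks with short loops. The function name, the convergence parameters, and the use of networkx to build an example graph are illustrative choices rather than anything specified in the paper.

```python
# A minimal sketch, assuming the standard (locally tree-like) message-passing
# equations for bond percolation with edge occupation probability p:
#
#     u[(i, j)] = 1 - p + p * prod_{k in N(j), k != i} u[(j, k)]
#
# where u[(i, j)] is the probability that node i is NOT connected to the
# giant cluster through its edge to neighbour j.  At the fixed point,
# node i belongs to the giant cluster with probability
#     S_i = 1 - prod_{j in N(i)} u[(i, j)].

import math
import networkx as nx


def percolation_messages(G, p, n_iter=200, tol=1e-10):
    """Return {node: probability of belonging to the giant cluster}."""
    # Start the messages below 1 so the iteration can leave the trivial
    # fixed point u = 1 (no giant cluster).
    u = {(i, j): 0.5 for i in G for j in G[i]}
    for _ in range(n_iter):
        delta = 0.0
        for i, j in u:
            prod = 1.0
            for k in G[j]:
                if k != i:
                    prod *= u[(j, k)]
            new = 1.0 - p + p * prod
            delta = max(delta, abs(new - u[(i, j)]))
            u[(i, j)] = new
        if delta < tol:
            break
    return {i: 1.0 - math.prod(u[(i, j)] for j in G[i]) for i in G}


if __name__ == "__main__":
    G = nx.erdos_renyi_graph(1000, 3 / 999, seed=1)  # mean degree roughly 3
    S = percolation_messages(G, p=0.6)
    print("mean giant-cluster probability:", sum(S.values()) / len(S))
```

Each directed edge carries a single message, and the product over neighbours is exactly the independence assumption that breaks down when short loops are present, which is the shortcoming the paper addresses.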
Award ID(s):
1710848
NSF-PAR ID:
10184781
Author(s) / Creator(s):
;
Date Published:
Journal Name:
Proceedings of the National Academy of Sciences
Volume:
116
Issue:
47
ISSN:
0027-8424
Page Range / eLocation ID:
23398 to 23403
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Belief propagation is a widely used message passing method for the solution of probabilistic models on networks such as epidemic models, spin models, and Bayesian graphical models, but it suffers from the serious shortcoming that it works poorly in the common case of networks that contain short loops. Here, we provide a solution to this long-standing problem, deriving a belief propagation method that allows for fast calculation of probability distributions in systems with short loops, potentially with high density, as well as giving expressions for the entropy and partition function, which are notoriously difficult quantities to compute. Using the Ising model as an example, we show that our approach gives excellent results on both real and synthetic networks, improving substantially on standard message passing methods. We also discuss potential applications of our method to a variety of other problems. (A minimal sketch of the standard Ising-model belief propagation update, the baseline this work improves on, appears after this list.)
  2. Networks and network computations have become a primary mathematical tool for analyzing the structure of many kinds of complex systems, ranging from the Internet and transportation networks to biochemical interactions and social networks. A common task in network analysis is the calculation of quantities that reside on the nodes of a network, such as centrality measures, probabilities, or model states. In this perspective article we discuss message passing methods, a family of techniques for performing such calculations, based on the propagation of information between the nodes of a network. We introduce the message passing approach with a series of examples, give some illustrative applications and results, and discuss the deep connections between message passing and phase transitions in networks. We also point out some limitations of the message passing approach and describe some recently introduced methods that address these limitations.
  3. Demeniconi, C.; Davidson, I. (Eds.)
    Many irregular domains such as social networks, financial transactions, neuron connections, and natural language constructs are represented using graph structures. In recent years, a variety of graph neural networks (GNNs) have been successfully applied for representation learning and prediction on such graphs. In many real-world applications, however, the underlying graph changes over time, and most existing GNNs are inadequate for handling such dynamic graphs. In this paper we propose a novel technique for learning embeddings of dynamic graphs using a tensor algebra framework. Our method extends the popular graph convolutional network (GCN) to learning representations of dynamic graphs using the recently proposed tensor M-product technique. The theoretical results presented establish a connection between the proposed tensor approach and spectral convolution of tensors. The proposed method, TM-GCN, is consistent with the Message Passing Neural Network (MPNN) framework, accounting for both spatial and temporal message passing. Numerical experiments on real-world datasets demonstrate the performance of the proposed method for edge classification and link prediction tasks on dynamic graphs. We also consider an application related to the COVID-19 pandemic and show how our method can be used for early detection of infected individuals from contact-tracing data.
  4. The globalization of the IC supply chain has raised many security threats, especially when untrusted parties are involved, creating a demand for a dependable logic obfuscation solution to combat these threats. Among the wide range of threats and countermeasures on logic obfuscation in the 2010s, the Boolean satisfiability (SAT) attack, or one of its derivatives, could break almost all state-of-the-art logic obfuscation countermeasures. However, in some cases, particularly when the logic-locked circuits contain complex structures such as big multipliers, large routing networks, or big tree structures, the circuit is hard for the SAT attack to solve. Using these structures for obfuscation may therefore provide a strong defense, as many SAT solvers fail to handle such complexity. In this paper, we propose a neural-network-guided SAT attack (NNgSAT), in which we examine the capability and effectiveness of a message-passing neural network (MPNN) for solving these complex structures (SAT-hard instances). In NNgSAT, the neural network is first trained as a classifier to predict SAT/UNSAT on a SAT problem, so that it acts as an approximate SAT solver; it is then used to guide the actual SAT solver toward finding the SAT assignment(s). By training the network on conjunctive normal forms (CNFs) corresponding to a dataset of logic-locked circuits and fine-tuning the confidence rate of its predictions, NNgSAT could solve 93.5% of the logic-locked circuits containing complex structures within a reasonable time in our experiments, whereas the existing SAT attack cannot complete the attack flow on them.
  5. Abstract

    The global connectivities in very large protein similarity networks contain traces of evolution among the proteins that can be used to detect remote evolutionary relations or structural similarities. A key limitation in investigating how well a protein network captures this evolutionary information is the intensive computation of the pairwise sequence similarities needed to construct very large protein networks. In this article, we introduce label propagation on low-rank kernel approximation (LP-LOKA) for searching massively large protein networks. LP-LOKA propagates initial protein similarities in a low-rank graph obtained by Nyström approximation, without computing all pairwise similarities. With scalable parallel implementations based on distributed memory using the message-passing interface and on Apache Hadoop/Spark in the cloud, LP-LOKA can search protein networks with one million proteins or more. In experiments on Swiss-Prot/ADDA/CASP data, LP-LOKA significantly improved protein ranking over the widely used HMM-HMM and profile-sequence alignment methods by utilizing large protein networks. We observed that the larger the protein similarity network, the better the performance, especially on relatively small protein superfamilies and folds. The results suggest that computing massively large protein networks is necessary to meet the growing need for annotating proteins from newly sequenced species, and that LP-LOKA is both scalable and accurate for searching such networks.
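
The two ingredients named in item 5, a Nyström low-rank approximation of a similarity kernel and label propagation over the resulting graph, can be sketched generically as follows. This is an illustration of the general technique under assumed choices (an RBF similarity, landmarks taken by simple subsampling, a damping factor alpha), not the LP-LOKA implementation or its distributed MPI/Spark machinery.

```python
# Generic sketch: Nystrom low-rank kernel approximation followed by label
# propagation, never forming the full n x n similarity matrix.  The
# similarity function, landmark choice, and parameters below are placeholders.

import numpy as np


def rbf(a, b, gamma=0.1):
    """Gaussian (RBF) similarity between two feature vectors."""
    return float(np.exp(-gamma * np.sum((a - b) ** 2)))


def nystrom_factor(similarity, items, landmarks):
    """Return L such that the kernel K is approximated by L @ L.T."""
    C = np.array([[similarity(x, z) for z in landmarks] for x in items])
    W = np.array([[similarity(z1, z2) for z2 in landmarks] for z1 in landmarks])
    vals, vecs = np.linalg.eigh(W)               # W is symmetric
    keep = vals > 1e-10                          # drop near-null directions
    W_inv_sqrt = vecs[:, keep] @ np.diag(vals[keep] ** -0.5) @ vecs[:, keep].T
    return C @ W_inv_sqrt                        # shape: n x (num landmarks)


def label_propagation(L, Y, alpha=0.9, n_iter=100):
    """Propagate seed scores Y (n x c) over the implicit graph K ~= L @ L.T."""
    deg = L @ (L.T @ np.ones(L.shape[0]))        # row sums of K, via the factor
    deg[deg <= 0] = 1.0
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        F = alpha * (L @ (L.T @ F)) / deg[:, None] + (1 - alpha) * Y
    return F


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    points = rng.normal(size=(200, 5))           # stand-in for item features
    L = nystrom_factor(rbf, points, landmarks=points[::10])
    Y = np.zeros((200, 2))
    Y[0, 0] = Y[1, 1] = 1.0                      # two labelled seed items
    print(label_propagation(L, Y).argmax(axis=1)[:10])
```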
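
Item 1 above uses the Ising model as its running example. The sketch below shows the standard loopy belief propagation updates for a ferromagnetic Ising model, which is the baseline that the loop-corrected method of that paper improves on; the coupling, field, temperature, and example graph are illustrative choices.

```python
# A minimal sketch, assuming standard loopy belief propagation for the
# ferromagnetic Ising model with coupling J, external field h, and inverse
# temperature beta.  Messages live on directed edges and are distributions
# over the two states of the receiving spin.

import math
import networkx as nx


def ising_bp(G, beta=0.5, J=1.0, h=0.0, n_iter=500, tol=1e-8):
    """Return {node: estimated magnetization <s_i>} from loopy BP."""
    spins = (-1, +1)
    # m[(i, j)][s] = message from node i to node j, evaluated at s_j = s
    m = {(i, j): {s: 0.5 for s in spins} for i in G for j in G[i]}
    for _ in range(n_iter):
        delta = 0.0
        for i, j in m:
            new = {}
            for sj in spins:
                total = 0.0
                for si in spins:
                    w = math.exp(beta * (J * si * sj + h * si))
                    for k in G[i]:
                        if k != j:
                            w *= m[(k, i)][si]
                    total += w
                new[sj] = total
            z = new[-1] + new[+1]
            new = {s: v / z for s, v in new.items()}
            delta = max(delta, abs(new[+1] - m[(i, j)][+1]))
            m[(i, j)] = new
        if delta < tol:
            break
    mag = {}
    for i in G:
        b = {s: math.exp(beta * h * s) for s in spins}
        for k in G[i]:
            for s in spins:
                b[s] *= m[(k, i)][s]
        z = b[-1] + b[+1]
        mag[i] = (b[+1] - b[-1]) / z
    return mag


if __name__ == "__main__":
    G = nx.random_regular_graph(3, 100, seed=2)
    mag = ising_bp(G, beta=0.8, h=0.1)
    print("mean magnetization:", sum(mag.values()) / len(mag))
```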