Title: How Fast Will You Get a Response? Predicting Interval Time for Reciprocal Link Creation
In recent years, reciprocal link prediction has received attention from data mining and social network analysis researchers, who have treated it as a binary classification task. However, it is also important to predict the interval time until a reciprocal link is created. This is a challenging problem for two reasons. First, there is a lack of effective features: well-known link prediction features are designed for undirected networks and for the binary classification task, so they do not work well for interval time prediction. Second, the presence of censored data instances makes traditional supervised regression methods unsuitable for this problem. In this paper, we propose a solution for the reciprocal link interval time prediction task. We map the problem into the survival analysis framework and show, through extensive experiments on real-world datasets, that survival analysis methods perform better than traditional regression, a neural network based model, and support vector regression (SVR).
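The abstract does not spell out the model details, but the censoring setup it describes maps naturally onto off-the-shelf survival tooling. Below is an illustrative sketch (not the paper's code) of framing reciprocal-link interval time as a survival problem with the lifelines library; the feature names and toy values are hypothetical.

```python
# Illustrative sketch, not the paper's implementation: reciprocal-link
# interval time prediction framed as survival analysis with lifelines.
import pandas as pd
from lifelines import CoxPHFitter

# Each row is a directed edge u->v. 'duration' is days until the
# reciprocal edge v->u appeared; 'reciprocated' is 0 for censored
# instances (no reciprocal link observed before the study ended).
# Feature columns are hypothetical stand-ins.
df = pd.DataFrame({
    "duration":     [3, 42, 7, 90, 15, 90, 21, 60],
    "reciprocated": [1,  1, 1,  0,  1,  0,  1,  0],   # 0 = right-censored
    "common_nbrs":  [5,  1, 3,  0,  4,  1,  2,  2],
    "sender_deg":   [120, 8, 40, 2, 77, 10, 25, 30],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="reciprocated")
cph.print_summary()

# Predicted median time until reciprocation for the same edges.
print(cph.predict_median(df.drop(columns=["duration", "reciprocated"])))
```

The key point is the event column: pairs whose reciprocal edge never appeared within the observation window enter the fit as censored rows rather than being discarded, which is exactly what plain regression cannot accommodate.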
Award ID(s):
1707498 1646881 1619028
PAR ID:
10048632
Author(s) / Creator(s):
Date Published:
Journal Name:
Proceedings of the Eleventh International Conference on Web and Social Media (ICWSM)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Predicting the occurrence of a particular event of interest at future time points is the primary goal of survival analysis. The presence of incomplete observations due to time limitations or loss of data traces is known as censoring, which brings unique challenges to this domain and differentiates survival analysis from other standard regression methods. Popular survival analysis methods such as the Cox proportional hazards model and parametric survival regression suffer from strict assumptions and hypotheses that are unrealistic in most real-world applications. To overcome the weaknesses of these two types of methods, in this paper we reformulate survival analysis as a multi-task learning problem and propose a new multi-task learning based formulation that predicts the survival time by estimating the survival status at each time interval during the study duration. We propose an indicator matrix that enables the multi-task learning algorithm to handle censored instances, and we incorporate important characteristics of survival problems, such as the non-negative non-increasing list structure, into our model through max-heap projection. We employ the L2,1-norm penalty, which enables the model to learn a shared representation across related tasks and hence select important features and alleviate over-fitting in high-dimensional feature spaces, thus reducing the prediction error of each task. To efficiently handle the two non-smooth constraints, we propose an optimization method that employs the Alternating Direction Method of Multipliers (ADMM) algorithm to solve the proposed multi-task learning problem. We demonstrate the performance of the proposed method on real-world high-dimensional microarray gene expression benchmark datasets and show that it outperforms state-of-the-art methods.
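A central ingredient here is the L2,1-norm penalty, whose proximal operator has a simple closed form (row-wise soft-thresholding). The sketch below illustrates that one step in isolation, under the assumption that rows of W index features and columns index time intervals; it is not the authors' ADMM solver.

```python
# Minimal sketch of the proximal operator of the L2,1-norm penalty.
# Rows of W with small norm are zeroed out jointly across all time
# intervals, which is what yields feature selection shared across tasks.
import numpy as np

def prox_l21(W: np.ndarray, lam: float) -> np.ndarray:
    """Row-wise shrinkage: argmin_Z 0.5*||Z - W||_F^2 + lam*||Z||_{2,1}."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return W * scale

# Inside an ADMM loop this step would follow the data-fit update.
W = np.random.randn(200, 30)          # 200 features x 30 time intervals
W_sparse = prox_l21(W, lam=5.0)
print((np.linalg.norm(W_sparse, axis=1) == 0).sum(), "features dropped")
```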
  2. Objectives: This paper introduces a novel method for the detection and classification of aortic stenosis (AS) using time-frequency features of chest cardio-mechanical signals collected from wearable sensors, namely seismo-cardiogram (SCG) and gyro-cardiogram (GCG) signals. Such a method could potentially monitor high-risk patients outside the clinic. Methods: Experimental measurements were collected from twenty patients with AS and twenty healthy subjects. First, a digital signal processing framework is proposed to extract time-frequency features. The features are then selected via the analysis of variance test. Different combinations of features are evaluated using decision tree, random forest, and artificial neural network methods. Two classification tasks are conducted. The first task is binary classification between normal subjects and AS patients. The second task is multi-class classification of AS patients with co-existing valvular heart diseases. Results: In the binary classification task, the average accuracies achieved are 96.25% from the decision tree, 97.43% from the random forest, and 95.56% from the neural network. The best performance comes from combined SCG and GCG features with the random forest classifier. In the multi-class classification, the best performance is 92.99%, using the random forest classifier and SCG features. Conclusion: The results suggest that the solution is a feasible method for classifying aortic stenosis, in both the binary and multi-class tasks. They also indicate that most of the important time-frequency features are below 11 Hz. Significance: The proposed method shows great potential for continuous monitoring of valvular heart diseases to protect patients from sudden critical cardiac events.
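The classification pipeline described (ANOVA-based feature selection followed by tree-ensemble classifiers) is straightforward to prototype. Below is a hedged scikit-learn sketch with synthetic stand-in data; the real SCG/GCG time-frequency features are of course not reproduced here, and the sample sizes and k are illustrative.

```python
# Hedged sketch of the described pipeline: ANOVA-test feature selection
# followed by a random forest, evaluated with cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 64))         # 40 subjects x 64 time-freq features
y = np.repeat([0, 1], 20)             # 0 = healthy, 1 = aortic stenosis
X[y == 1, :8] += 1.0                  # make a few features informative

clf = make_pipeline(
    SelectKBest(f_classif, k=16),     # analysis-of-variance selection
    RandomForestClassifier(n_estimators=200, random_state=0),
)
print(cross_val_score(clf, X, y, cv=5).mean())
```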
  3. Machine learning frameworks such as graph neural networks typically rely on a given, fixed graph to exploit relational inductive biases and thus effectively learn from network data. However, when said graphs are (partially) unobserved, noisy, or dynamic, the problem of inferring graph structure from data becomes relevant. In this paper, we postulate a graph convolutional relationship between the observed and latent graphs, and formulate the graph structure learning task as a network inverse (deconvolution) problem. In lieu of eigendecomposition-based spectral methods or iterative optimization solutions, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN). GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive as well as node permutation equivariant. We corroborate GDN's superior graph learning performance and its generalization to larger graphs using synthetic data in supervised settings. Moreover, we demonstrate the robustness and representation power of GDNs on real-world neuroimaging and social network datasets.
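The unrolling idea can be illustrated with plain, un-learned proximal gradient steps. The sketch below is speculative: it assumes a first-order linear graph filter Y ≈ h0*I + h1*A and fixed step sizes h0, h1, tau, lam, whereas the paper's GDN layers carry learned per-layer parameters.

```python
# Speculative sketch of the core idea: truncated proximal gradient
# (ISTA) iterations for recovering a sparse latent adjacency A from an
# observed graph Y, where each iteration plays the role of one layer.
import numpy as np

def soft_threshold(X, t):
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def gdn_forward(Y, n_layers=10, h0=0.1, h1=1.0, tau=0.3, lam=0.05):
    """Unrolled ISTA for Y ~ h0*I + h1*A with a sparsity prior on A.

    Learning would replace the fixed (h0, h1, tau, lam) with trainable
    per-layer parameters; here they are hypothetical constants.
    """
    n = Y.shape[0]
    A = np.zeros_like(Y)
    for _ in range(n_layers):
        grad = h1 * (h0 * np.eye(n) + h1 * A - Y)   # data-fit gradient
        A = soft_threshold(A - tau * grad, tau * lam)
        A = np.maximum(A, 0.0)                      # nonneg. edge weights
        np.fill_diagonal(A, 0.0)                    # no self-loops
    return A

# Toy check: recover a sparse adjacency observed through the filter.
A_true = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0.]])
Y = 0.1 * np.eye(3) + 1.0 * A_true
print(np.round(gdn_forward(Y), 2))
```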
  4. We introduce a novel check-in time prediction problem. The goal is to predict the time at which a user will check in to a given location. We formulate check-in prediction as a survival analysis problem and propose a Recurrent-Censored Regression (RCR) model. We address the key challenge of check-in data scarcity, which is due to the uneven distribution of check-ins among users and locations. Our idea is to enrich the check-in data with potential visitors, i.e., users who have not visited the location before but are likely to do so. RCR uses a recurrent neural network to learn latent representations from the historical check-ins of both actual and potential visitors, which are then incorporated into censored regression to make predictions. Experiments show that RCR outperforms state-of-the-art event time prediction techniques on real-world datasets.
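A minimal way to see how censored regression differs from ordinary regression is in the loss: a censored check-in only tells us the event happens after the censoring time. The PyTorch sketch below illustrates that idea; it is an assumption for exposition, not the RCR paper's exact objective, and it omits the recurrent encoder entirely.

```python
# Illustrative censored-regression loss (an assumption, not RCR's exact
# formulation): observed check-ins incur a squared error, while censored
# instances are penalized only if the prediction falls before the
# censoring time.
import torch

def censored_mse(pred, t, observed):
    """pred, t: (N,) tensors; observed: (N,) bool, False = censored."""
    obs_loss = (pred[observed] - t[observed]).pow(2)
    # For censored rows we only know the event happens after time t.
    cen_loss = torch.relu(t[~observed] - pred[~observed]).pow(2)
    return torch.cat([obs_loss, cen_loss]).mean()

pred = torch.tensor([2.0, 5.0, 1.0])
t = torch.tensor([3.0, 4.0, 6.0])       # last row: censoring time
observed = torch.tensor([True, True, False])
print(censored_mse(pred, t, observed))  # penalizes pred=1.0 < 6.0
```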
  5. We introduce the Hyperdimensional Graph Learner (HDGL), a novel method for node classification and link prediction in graphs. HDGL maps node features into a very high-dimensional space (hyperdimensional or HD space for short) using the injectivity property of node representations in a family of Graph Neural Networks (GNNs), and then uses HD operators such as bundling and binding to aggregate information from the local neighborhood of each node, yielding latent node representations that can support both node classification and link prediction tasks. Unlike GNNs, which rely on computationally expensive iterative optimization and hyperparameter tuning, HDGL requires only a single pass through the data set. We report results of experiments on widely used benchmark datasets which demonstrate that, on the node classification task, HDGL achieves accuracy competitive with state-of-the-art GNN methods at substantially reduced computational cost, and that, on the link prediction task, HDGL matches the performance of DeepWalk and related methods, although it falls short of computationally demanding state-of-the-art GNNs.
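The bundling and binding operators mentioned are standard hyperdimensional computing primitives. Below is a minimal sketch assuming bipolar {-1, +1} hypervectors, with binding as the elementwise product and bundling as the elementwise majority; HDGL's actual GNN-inspired neighborhood encoding is more involved than this toy.

```python
# Minimal sketch of standard HD computing operators (an illustration,
# not HDGL's encoding): bipolar hypervectors, bind = elementwise
# product, bundle = elementwise majority of a set of hypervectors.
import numpy as np

D = 10_000                                    # hyperdimensional width
rng = np.random.default_rng(0)

def random_hv():
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    return a * b                              # associates two concepts

def bundle(*hvs):
    return np.sign(np.sum(hvs, axis=0))       # superposes a set

# Encode a node as its own feature HV bundled with bound
# (role, neighbor-feature) pairs from its 1-hop neighborhood.
role = random_hv()
node_feat, nbr1, nbr2 = random_hv(), random_hv(), random_hv()
node_repr = bundle(node_feat, bind(role, nbr1), bind(role, nbr2))

# Similarity between representations via normalized dot product.
print(node_repr @ node_feat / D)              # high: feat is a component
print(node_repr @ random_hv() / D)            # near 0: unrelated HV
```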