Title: Covariance function versus covariance matrix estimation in efficient semi-parametric regression for longitudinal data analysis
Award ID(s): 2013486, 1712418
PAR ID: 10454052
Author(s) / Creator(s): ; ;
Date Published:
Journal Name: Journal of Multivariate Analysis
Volume: 187
Issue: C
ISSN: 0047-259X
Page Range / eLocation ID: 104900
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Oh, Alice; Agarwal, Alekh; Belgrave, Danielle; Cho, Kyunghyun (Ed.)
    Graph neural networks (GNNs) are an effective framework that exploits inter-relationships within graph-structured data for learning. Principal component analysis (PCA) projects data onto the eigenspace of the covariance matrix, which parallels the graph convolutional filters used in GNNs. Motivated by this observation, we study a GNN architecture, called the coVariance neural network (VNN), that operates on sample covariance matrices as graphs. We theoretically establish the stability of VNNs to perturbations in the covariance matrix, implying an advantage over standard PCA-based data analysis approaches, which are prone to instability when principal components are associated with close eigenvalues. Our experiments on real-world datasets validate our theoretical results and show that VNN performance is indeed more stable than that of PCA-based statistical approaches. Moreover, our experiments on multi-resolution datasets demonstrate that VNN performance transfers across covariance matrices of different dimensions, a feature that is infeasible for PCA-based approaches.
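    As a rough illustration of the covariance-as-graph-filter idea sketched in the abstract above, the snippet below contrasts a standard PCA projection with a simple polynomial "covariance filter", where the sample covariance matrix plays the role of a graph shift operator. This is a minimal sketch under assumed choices (random data, hypothetical filter taps `w`), not the authors' implementation of VNNs.

    ```python
    import numpy as np

    # Hypothetical data: 200 samples with 10 features, centered.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 10))
    X = X - X.mean(axis=0)

    # Sample covariance matrix.
    C = (X.T @ X) / (X.shape[0] - 1)

    # PCA: project the data onto the leading eigenvectors of C.
    eigvals, eigvecs = np.linalg.eigh(C)   # eigenvalues in ascending order
    top = eigvecs[:, -3:]                  # three principal components
    X_pca = X @ top                        # PCA scores

    # Covariance filter: a polynomial in C applied to each sample,
    # h(C) x = sum_k w_k C^k x, with illustrative (hypothetical) weights w_k.
    w = [0.5, 0.3, 0.2]
    H = sum(wk * np.linalg.matrix_power(C, k) for k, wk in enumerate(w))
    X_filtered = X @ H.T                   # filtered representation of all samples

    print(X_pca.shape, X_filtered.shape)   # (200, 3) (200, 10)
    ```

    The contrast this sketch is meant to convey: PCA keeps only a hard selection of eigenvectors, so small eigenvalue gaps can flip which components are retained, whereas the polynomial filter weights all eigendirections smoothly through the same matrix C.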