

Title: Estimation and Inference for Functional Linear Regression Models with Partially Varying Regression Coefficients
Award ID(s):
1736470
NSF-PAR ID:
10158609
Author(s) / Creator(s):
Date Published:
Journal Name:
Stat
ISSN:
2049-1573
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Quantile regression for right‐ or left‐censored outcomes has attracted attention due to its ability to accommodate heterogeneity in regression analysis of survival times. Rank‐based inferential methods have desirable properties for quantile regression analysis, but censored data poses challenges to the general concept of ranking. In this article, we propose a notion of censored quantile regression rank scores, which enables us to construct rank‐based tests for quantile regression coefficients at a single quantile or over a quantile region. A model‐based bootstrap algorithm is proposed to implement the tests. We also illustrate the advantage of focusing on a quantile region instead of a single quantile level when testing the effect of certain covariates in a quantile regression framework.
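To make the model-based bootstrap concrete, here is a minimal sketch in Python (statsmodels). It is a deliberately simplified illustration under assumed conditions: no censoring, a simulated dataset, and an ordinary coefficient test rather than the rank-score statistic proposed in the article. All names and settings below (tau, B, the covariates) are illustrative.

        import numpy as np
        from statsmodels.regression.quantile_regression import QuantReg

        rng = np.random.default_rng(0)

        # Simulated, uncensored data: x2 has no effect, so H0 holds.
        n = 200
        x1, x2 = rng.normal(size=n), rng.normal(size=n)
        y = 1.0 + 0.5 * x1 + rng.standard_t(df=3, size=n)

        tau = 0.5
        X_full = np.column_stack([np.ones(n), x1, x2])
        X_null = np.column_stack([np.ones(n), x1])

        # Observed statistic: the x2 coefficient in the full model.
        beta_hat = QuantReg(y, X_full).fit(q=tau).params[2]

        # Model-based bootstrap: regenerate responses from the null fit
        # plus resampled null residuals, then refit the full model.
        null_fit = QuantReg(y, X_null).fit(q=tau)
        resid = y - X_null @ null_fit.params

        B = 200
        boot = np.empty(B)
        for b in range(B):
            y_star = X_null @ null_fit.params + rng.choice(resid, size=n)
            boot[b] = QuantReg(y_star, X_full).fit(q=tau).params[2]

        p_value = np.mean(np.abs(boot) >= np.abs(beta_hat))
        print(f"bootstrap p-value for H0: beta_x2(tau={tau}) = 0: {p_value:.3f}")

    Testing over a quantile region would repeat this with a statistic that aggregates across a grid of tau values (e.g., a supremum of per-quantile statistics), which is where the region-based tests described in the abstract gain power.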

     
  2. Abstract
    Background: Ridge regression is a regularization technique that penalizes the L2-norm of the coefficients in linear regression. One of the challenges of using ridge regression is the need to set a hyperparameter (α) that controls the amount of regularization. Cross-validation is typically used to select the best α from a set of candidates, but efficient and appropriate selection of α can be challenging, and it becomes prohibitive when large amounts of data are analyzed. Because the selected α depends on the scale of the data and on correlations across predictors, it is also not straightforwardly interpretable.

    Results: The present work addresses these challenges through a novel approach to ridge regression. We propose to reparameterize ridge regression in terms of the ratio γ between the L2-norms of the regularized and unregularized coefficients. We provide an algorithm that efficiently implements this approach, called fractional ridge regression, as well as open-source software implementations in Python and MATLAB (https://github.com/nrdg/fracridge). We show that the proposed method is fast and scalable for large-scale data problems. In brain imaging data, we demonstrate that this approach delivers results that are straightforward to interpret and to compare across models and datasets.

    Conclusion: Fractional ridge regression has several benefits: the solutions obtained for different γ are guaranteed to differ, guarding against wasted computation, and they automatically span the relevant range of regularization, avoiding arduous manual exploration. These properties make fractional ridge regression particularly suitable for the analysis of large, complex datasets.
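    Because the norm of the ridge solution shrinks monotonically as α grows, the α that achieves a target fraction γ can be found by root-finding on the SVD of the design matrix. The sketch below is a minimal, assumed implementation of that idea in Python (NumPy/SciPy); the function name frac_ridge and the use of brentq are illustrative and are not the API of the fracridge package linked above.

        import numpy as np
        from scipy.optimize import brentq

        def frac_ridge(X, y, gamma):
            """Ridge solution whose L2-norm is gamma (in (0, 1)) times
            the norm of the unregularized least-squares solution."""
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            d = U.T @ y
            ols_norm = np.linalg.norm(d / s)  # ||beta_OLS|| via the SVD

            # ||beta_ridge(alpha)|| decreases monotonically in alpha,
            # so the matching alpha is a root of this ratio equation.
            def norm_gap(alpha):
                return np.linalg.norm(s * d / (s**2 + alpha)) / ols_norm - gamma

            alpha = brentq(norm_gap, 0.0, 1e10)
            beta = Vt.T @ (s * d / (s**2 + alpha))
            return beta, alpha

        # Example: request a solution half as large as the OLS solution.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 10))
        y = X @ np.ones(10) + rng.normal(size=100)
        beta, alpha = frac_ridge(X, y, gamma=0.5)
        print(f"alpha that yields gamma = 0.5: {alpha:.3g}")

    Root-finding once per γ is the simplest scheme; solving for many γ values at once would favor a grid-plus-interpolation strategy instead.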