Structured penalties for functional linear models-partially empirical eigenvectors for regression

Timothy W. Randolph, Jaroslaw Harezlak, Ziding Feng

Research output: Contribution to journal › Article

15 Citations (Scopus)

Abstract

One of the challenges with functional data is incorporating geometric structure, or local correlation, into the analysis. This structure is inherent in the output from an increasing number of biomedical technologies, and a functional linear model is often used to estimate the relationship between the predictor functions and scalar responses. Common approaches to the problem of estimating a coefficient function typically involve two stages: regularization and estimation. Regularization is usually done via dimension reduction, projecting onto a predefined span of basis functions or a reduced set of eigenvectors (principal components). In contrast, we present a unified approach that directly incorporates geometric structure into the estimation process by exploiting the joint eigenproperties of the predictors and a linear penalty operator. In this sense, the components in the regression are 'partially empirical' and the framework is provided by the generalized singular value decomposition (GSVD). The form of the penalized estimation is not new, but the GSVD clarifies the process and informs the choice of penalty by making explicit the joint influence of the penalty and predictors on the bias, variance and performance of the estimated coefficient function. Laboratory spectroscopy data and simulations are used to illustrate the concepts.
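The penalized estimation the abstract refers to has the familiar generalized-ridge (Tikhonov) form: for predictors X, responses y and a linear penalty operator L, the coefficient estimate minimizes ||y − Xb||² + λ||Lb||², giving β̂ = (XᵀX + λLᵀL)⁻¹Xᵀy. The sketch below is an illustration of that general form, not the authors' code; the second-difference penalty operator is one common choice of L for encouraging smooth coefficient functions.

```python
import numpy as np

def second_difference_penalty(p):
    """(p-2) x p second-order difference operator L, so (L b)_i = b_i - 2 b_{i+1} + b_{i+2}."""
    L = np.zeros((p - 2, p))
    for i in range(p - 2):
        L[i, i:i + 3] = [1.0, -2.0, 1.0]
    return L

def penalized_fit(X, y, L, lam):
    """Minimize ||y - X b||^2 + lam * ||L b||^2 via the normal equations."""
    A = X.T @ X + lam * (L.T @ L)
    return np.linalg.solve(A, X.T @ y)

# Toy example: smooth true coefficient function sampled on a grid.
rng = np.random.default_rng(0)
n, p = 40, 25
X = rng.standard_normal((n, p))
beta = np.sin(np.linspace(0, np.pi, p))
y = X @ beta + 0.1 * rng.standard_normal(n)
L = second_difference_penalty(p)
beta_hat = penalized_fit(X, y, L, 1.0)
```

Increasing λ shrinks the estimate toward the null space of L (here, linear functions), trading variance for bias; the paper's GSVD framework makes that joint dependence on X and L explicit.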

Original language: English
Pages (from-to): 323-353
Number of pages: 31
Journal: Electronic Journal of Statistics
Volume: 6
DOI: 10.1214/12-EJS676
State: Published - 2012

Keywords

  • Functional data
  • Generalized singular value decomposition
  • Penalized regression
  • Regularization

ASJC Scopus subject areas

  • Statistics and Probability

Cite this

Structured penalties for functional linear models-partially empirical eigenvectors for regression. / Randolph, Timothy W.; Harezlak, Jaroslaw; Feng, Ziding.

In: Electronic Journal of Statistics, Vol. 6, 2012, p. 323-353.

@article{cff764c2fe7045bc8beda210a6738dce,
title = "Structured penalties for functional linear models-partially empirical eigenvectors for regression",
abstract = "One of the challenges with functional data is incorporating geometric structure, or local correlation, into the analysis. This structure is inherent in the output from an increasing number of biomedical technologies, and a functional linear model is often used to estimate the relationship between the predictor functions and scalar responses. Common approaches to the problem of estimating a coefficient function typically involve two stages: regularization and estimation. Regularization is usually done via dimension reduction, projecting onto a predefined span of basis functions or a reduced set of eigenvectors (principal components). In contrast, we present a unified approach that directly incorporates geometric structure into the estimation process by exploiting the joint eigenproperties of the predictors and a linear penalty operator. In this sense, the components in the regression are 'partially empirical' and the framework is provided by the generalized singular value decomposition (GSVD). The form of the penalized estimation is not new, but the GSVD clarifies the process and informs the choice of penalty by making explicit the joint influence of the penalty and predictors on the bias, variance and performance of the estimated coefficient function. Laboratory spectroscopy data and simulations are used to illustrate the concepts.",
keywords = "Functional data, Generalized singular value decomposition, Penalized regression, Regularization",
author = "Randolph, {Timothy W.} and Jaroslaw Harezlak and Ziding Feng",
year = "2012",
doi = "10.1214/12-EJS676",
language = "English",
volume = "6",
pages = "323--353",
journal = "Electronic Journal of Statistics",
issn = "1935-7524",
publisher = "Institute of Mathematical Statistics",

}
