Seminar: A RKHS Approach for Variable Selection in High Dimensional Functional Linear Models

Jul 13, 2021 - 2:10 PM

Abstract: There has been a recent surge in applications of high-dimensional functional data analysis. We study functional linear regression in the large-scale setting where a scalar response is associated with a potentially ultra-large number of functional predictors, working within the reproducing kernel Hilbert space (RKHS) framework.

Unlike classical high-dimensional Euclidean data in \mathbb{R}^p, for high-dimensional functional data in the p-fold product L_2-space the eigenvalues of the matrix-valued covariance operator shrink to zero due to the infinite dimensionality. Therefore, an \ell_2 penalty, in addition to the \ell_1 penalty, is added to regularize the LASSO; we call the resulting method the functional elastic-net. We introduce the Karush-Kuhn-Tucker (KKT) conditions in function spaces and show that the functional elastic-net admits a unique solution. We provide sufficient conditions establishing variable selection consistency; in particular, our results show that variable selection consistency of functional linear regression can be achieved for p \gg n. An efficient computational algorithm has been developed to solve the functional elastic-net problem, and simulation results show that it outperforms existing methods in various high-dimensional settings.
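The abstract does not specify the estimator's exact form. As a rough illustration only, a functional elastic-net can be mimicked on discretized curves by a group-type penalty: an \ell_2-norm threshold on each predictor's whole coefficient function (selecting or dropping entire curves) combined with ridge shrinkage. The sketch below, solved by proximal gradient descent, is an assumption-laden toy version, not the authors' method; all names, dimensions, and penalty values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: n samples, p functional predictors, each
# observed on a grid of m points. Only predictors 0 and 1 are active.
n, p, m = 100, 20, 15
X = rng.standard_normal((n, p, m)) / np.sqrt(m)
beta_true = np.zeros((p, m))
beta_true[0] = 1.0
beta_true[1] = -1.0
y = np.einsum("npm,pm->n", X, beta_true) + 0.1 * rng.standard_normal(n)

def group_prox(b, t1, t2):
    """Prox of t1*||b||_2 + (t2/2)*||b||_2^2 for one coefficient curve:
    block soft-thresholding followed by ridge shrinkage."""
    norm = np.linalg.norm(b)
    if norm <= t1:
        return np.zeros_like(b)          # whole curve set to zero
    return (1.0 - t1 / norm) * b / (1.0 + t2)

def functional_elastic_net(X, y, lam1=0.05, lam2=0.01, iters=500):
    n, p, m = X.shape
    beta = np.zeros((p, m))
    Xf = X.reshape(n, p * m)
    L = np.linalg.norm(Xf, 2) ** 2 / n   # Lipschitz constant of the gradient
    step = 1.0 / L
    for _ in range(iters):
        resid = np.einsum("npm,pm->n", X, beta) - y
        grad = np.einsum("npm,n->pm", X, resid) / n
        beta = beta - step * grad
        for j in range(p):               # groupwise prox selects whole curves
            beta[j] = group_prox(beta[j], step * lam1, step * lam2)
    return beta

beta_hat = functional_elastic_net(X, y)
selected = [j for j in range(p) if np.linalg.norm(beta_hat[j]) > 0]
print(selected)
```

Here the \ell_1-type term acts on each curve's L_2 norm, so selection happens at the level of whole functional predictors, while the ridge term stabilizes the estimate despite the decaying covariance eigenvalues the abstract describes.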