Penalized principal logistic regression for sparse sufficient dimension reduction
- Authors
- Shin, Seung Jun; Artemiou, Andreas
- Issue Date
- Jul-2017
- Publisher
- ELSEVIER SCIENCE BV
- Keywords
- Max-SCAD penalty; Principal logistic regression; Sparse sufficient dimension reduction; Sufficient dimension reduction
- Citation
- COMPUTATIONAL STATISTICS & DATA ANALYSIS, v.111, pp.48 - 58
- Indexed
- SCIE; SCOPUS
- Journal Title
- COMPUTATIONAL STATISTICS & DATA ANALYSIS
- Volume
- 111
- Start Page
- 48
- End Page
- 58
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/82938
- DOI
- 10.1016/j.csda.2016.12.003
- ISSN
- 0167-9473
- Abstract
- Sufficient dimension reduction (SDR) is a successful tool for reducing the dimensionality of predictors by finding the central subspace, a minimal subspace of the predictors that preserves all the regression information. When the predictor dimension is large, it is often assumed that only a small number of predictors are informative. In this regard, sparse SDR is desired to achieve variable selection and dimension reduction simultaneously. We propose principal logistic regression (PLR) as a new SDR tool and further develop its penalized version for sparse SDR. Asymptotic analysis shows that the penalized PLR enjoys the oracle property. Numerical investigation supports the advantageous performance of the proposed methods. (C) 2016 Elsevier B.V. All rights reserved.
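The slice-based idea summarized in the abstract can be illustrated with a minimal sketch: dichotomize the response at several quantiles, fit a logistic regression per slice, and take leading eigenvectors of the aggregated coefficient matrix as candidate directions for the central subspace. This is only a hedged illustration of the general PLR-style approach, not the authors' estimator: the exact PLR objective and the Max-SCAD penalty for sparsity are not implemented, a ridge-regularized logistic fit is used as a stand-in, and the function and parameter names (plr_directions, n_slices, n_directions) are hypothetical.

```python
# Hedged sketch of slice-based sufficient dimension reduction in the spirit of
# principal logistic regression (PLR). NOT the paper's method: the Max-SCAD
# penalized step is omitted and an ordinary regularized logistic fit per slice
# is used instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

def plr_directions(X, y, n_slices=5, n_directions=2):
    """Dichotomize y at several quantiles, fit a logistic regression per
    slice, and return the leading eigenvectors of the aggregated
    coefficient matrix as estimated SDR directions."""
    n, p = X.shape
    # Standardize predictors (SDR methods are typically applied on the z-scale).
    Z = (X - X.mean(axis=0)) / X.std(axis=0)

    cuts = np.quantile(y, np.linspace(0.1, 0.9, n_slices))
    M = np.zeros((p, p))
    for c in cuts:
        yb = (y > c).astype(int)          # dichotomized response for this slice
        if yb.min() == yb.max():          # skip degenerate slices
            continue
        clf = LogisticRegression(C=1e4, max_iter=1000).fit(Z, yb)
        b = clf.coef_.ravel()             # slice-wise coefficient vector
        M += np.outer(b, b)               # accumulate the candidate matrix

    # Leading eigenvectors span the estimated central subspace.
    eigval, eigvec = np.linalg.eigh(M)
    return eigvec[:, ::-1][:, :n_directions]

# Toy usage: y depends on X only through the single direction X[:, 0] - X[:, 1].
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
y = (X[:, 0] - X[:, 1]) + 0.3 * rng.normal(size=500)
print(plr_directions(X, y, n_slices=5, n_directions=1))
```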
- Appears in Collections
- College of Political Science & Economics > Department of Statistics > 1. Journal Articles