Wasserstein Stationary Subspace Analysis
- Authors
- Kaltenstadler, Stephan; Nakajima, Shinichi; Mueller, Klaus-Robert; Samek, Wojciech
- Issue Date
- Dec-2018
- Publisher
- IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
- Keywords
- Subspace learning; stationary subspace analysis; divergence methods; optimal transport; covariance metrics
- Citation
- IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, v.12, no.6, pp.1213 - 1223
- Indexed
- SCIE; SCOPUS
- Journal Title
- IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING
- Volume
- 12
- Number
- 6
- Start Page
- 1213
- End Page
- 1223
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/71379
- DOI
- 10.1109/JSTSP.2018.2873987
- ISSN
- 1932-4553
- Abstract
- Learning under nonstationarity can be achieved by decomposing the data into a stationary subspace and a nonstationary one [stationary subspace analysis (SSA)]. While SSA has been used in various applications, its robustness and computational efficiency are limited by the difficulty of optimizing its Kullback-Leibler divergence based objective. In this paper, we contribute by extending SSA twofold: we propose SSA with 1) higher numerical efficiency by defining analytical SSA variants and 2) higher robustness by utilizing the Wasserstein-2 distance (Wasserstein SSA). We show the usefulness of our novel algorithms for toy data, demonstrating their mathematical properties, and for real-world data, 1) allowing better segmentation of time series and 2) brain-computer interfacing, where the Wasserstein-based measure of nonstationarity is used for spatial filter regularization and gives rise to higher decoding performance.
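- The covariance metric referred to in the abstract is the Wasserstein-2 distance between Gaussian distributions. The sketch below is a minimal, illustrative computation of that distance (not the authors' implementation); the function name and the epoch example are hypothetical, and it only assumes the standard closed form for Gaussians, W2^2 = ||m1 - m2||^2 + tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2}).

```python
# Minimal sketch: squared Wasserstein-2 distance between two Gaussians
# N(m1, S1) and N(m2, S2), the covariance metric underlying Wasserstein SSA.
# Names (gaussian_w2_squared, the epoch example) are illustrative only.
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(m1, S1, m2, S2):
    """Squared W2 distance between N(m1, S1) and N(m2, S2)."""
    sqrt_S2 = sqrtm(S2)
    # Bures term: tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})
    cross = sqrtm(sqrt_S2 @ S1 @ sqrt_S2)
    bures = np.trace(S1 + S2 - 2.0 * np.real(cross))
    return float(np.linalg.norm(m1 - m2) ** 2 + bures)

# Example: nonstationarity between two epochs of a multivariate time series,
# each summarized by its empirical mean and covariance.
rng = np.random.default_rng(0)
x1 = rng.standard_normal((500, 4))
x2 = 1.5 * rng.standard_normal((500, 4)) + 0.2
d2 = gaussian_w2_squared(x1.mean(0), np.cov(x1.T), x2.mean(0), np.cov(x2.T))
print(f"squared W2 distance between epochs: {d2:.3f}")
```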
- Appears in Collections
- Graduate School > Department of Artificial Intelligence > 1. Journal Articles