Detailed Information


INCREMENTAL SPARSE PSEUDO-INPUT GAUSSIAN PROCESS REGRESSION

Full metadata record
DC Field: Value
dc.contributor.author: Suk, Heung-Il
dc.contributor.author: Wang, Yuzhuo
dc.contributor.author: Lee, Seong-Whan
dc.date.accessioned: 2021-09-06T12:20:09Z
dc.date.available: 2021-09-06T12:20:09Z
dc.date.created: 2021-06-14
dc.date.issued: 2012-12
dc.identifier.issn: 0218-0014
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/106727
dc.description.abstract: In this paper, we devise a novel method that incrementally learns pseudo-data, which represent the whole training data set for Gaussian Process (GP) regression. The method involves a sparse approximation of the GP, extending the work of Snelson and Ghahramani. We call the proposed method Incremental Sparse Pseudo-input Gaussian Process (ISPGP) regression. Unlike Snelson and Ghahramani's work, the proposed ISPGP algorithm allows training either from a huge training data set, by scanning through it only once, or from an online incremental stream of data. We also design a likelihood weighting scheme to incrementally determine pseudo-data while maintaining representational power. Due to the nature of the incremental learning algorithm, the proposed ISPGP algorithm can theoretically work with infinite data, to which the conventional GP or Sparse Pseudo-input Gaussian Process (SPGP) algorithm is not applicable. From our experimental results on the KIN40K data set, we can see that the proposed ISPGP algorithm is comparable to the conventional GP algorithm trained on the same number of data points. It also significantly reduces the computational cost and memory requirements of regression and is scalable to a large training data set without significant performance degradation. Although the proposed ISPGP algorithm performs slightly worse than Snelson and Ghahramani's SPGP algorithm, the level of performance degradation is acceptable.
dc.language: English
dc.language.iso: en
dc.publisher: WORLD SCIENTIFIC PUBL CO PTE LTD
dc.subject: GREEDY
dc.title: INCREMENTAL SPARSE PSEUDO-INPUT GAUSSIAN PROCESS REGRESSION
dc.type: Article
dc.contributor.affiliatedAuthor: Suk, Heung-Il
dc.contributor.affiliatedAuthor: Lee, Seong-Whan
dc.identifier.doi: 10.1142/S021800141250019X
dc.identifier.scopusid: 2-s2.0-84874398771
dc.identifier.wosid: 000315523100001
dc.identifier.bibliographicCitation: INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE, v.26, no.8
dc.relation.isPartOf: INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE
dc.citation.title: INTERNATIONAL JOURNAL OF PATTERN RECOGNITION AND ARTIFICIAL INTELLIGENCE
dc.citation.volume: 26
dc.citation.number: 8
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.subject.keywordPlus: GREEDY
dc.subject.keywordAuthor: Gaussian process regression
dc.subject.keywordAuthor: incremental learning
dc.subject.keywordAuthor: pseudo-data
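The abstract builds on Snelson and Ghahramani's sparse pseudo-input GP (SPGP, also known as FITC) approximation, in which m pseudo-inputs summarize n >> m training points. Below is a minimal illustrative sketch of SPGP-style prediction under that idea — not the paper's ISPGP algorithm or its likelihood-weighting scheme; the RBF kernel, hyperparameters, function names, and toy data are all assumptions made for the example.

```python
# Hedged sketch of sparse pseudo-input GP (SPGP/FITC-style) prediction.
# Names (rbf, spgp_predict) and all hyperparameters are illustrative
# assumptions, not the paper's actual implementation.
import numpy as np

def rbf(A, B, ell=1.0, sf2=1.0):
    """Squared-exponential kernel matrix between row sets A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return sf2 * np.exp(-0.5 * d2 / ell**2)

def spgp_predict(X, y, Xu, Xs, ell=1.0, sf2=1.0, noise=1e-2):
    """FITC predictive mean at test inputs Xs using m pseudo-inputs Xu."""
    m = Xu.shape[0]
    Kuu = rbf(Xu, Xu, ell, sf2) + 1e-8 * np.eye(m)  # jitter for stability
    Kun = rbf(Xu, X, ell, sf2)
    # FITC diagonal correction: Lambda = diag(Knn - Qnn) + noise * I,
    # where diag(Knn) = sf2 for the RBF kernel.
    Qnn_diag = np.sum(Kun * np.linalg.solve(Kuu, Kun), axis=0)
    lam = sf2 - Qnn_diag + noise
    # Predictive mean: K_*u (Kuu + Kun Lam^{-1} Knu)^{-1} Kun Lam^{-1} y
    A = Kuu + (Kun / lam) @ Kun.T                   # only an m x m solve
    w = np.linalg.solve(A, Kun @ (y / lam))
    return rbf(Xs, Xu, ell, sf2) @ w

# Toy 1-D regression: 15 pseudo-inputs stand in for 200 training points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
Xu = np.linspace(-3, 3, 15)[:, None]                # pseudo-inputs
Xs = np.linspace(-3, 3, 50)[:, None]                # test inputs
mu = spgp_predict(X, y, Xu, Xs, ell=1.0, sf2=1.0, noise=0.05**2)
rmse = np.sqrt(np.mean((mu - np.sin(Xs[:, 0]))**2))
```

The point of the approximation is that the only linear system solved is m x m, so cost scales with the number of pseudo-inputs rather than the full data size — the property that makes the paper's incremental, single-scan extension to large or streaming data sets possible.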
Files in This Item
There are no files associated with this item.
Appears in Collections: Graduate School > Department of Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher
Lee, Seong-Whan (Department of Artificial Intelligence)
