Task-specific feature extraction and classification of fMRI volumes using a deep neural network initialized with a deep belief network: Evaluation using sensorimotor tasks
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Jang, Hojin | - |
dc.contributor.author | Plis, Sergey M. | - |
dc.contributor.author | Calhoun, Vince D. | - |
dc.contributor.author | Lee, Jong-Hwan | - |
dc.date.accessioned | 2021-09-03T10:54:34Z | - |
dc.date.available | 2021-09-03T10:54:34Z | - |
dc.date.created | 2021-06-16 | - |
dc.date.issued | 2017-01-15 | - |
dc.identifier.issn | 1053-8119 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/84911 | - |
dc.description.abstract | Feedforward deep neural networks (DNNs), artificial neural networks with multiple hidden layers, have recently demonstrated record-breaking performance in multiple application areas, including computer vision and speech processing. Following these successes, DNNs have been applied to neuroimaging modalities including functional/structural magnetic resonance imaging (MRI) and positron-emission tomography data. However, no study has explicitly applied DNNs to 3D whole-brain fMRI volumes and thereby extracted hidden volumetric representations of fMRI that are discriminative for a task performed as the fMRI volume was acquired. Our study applied a fully connected feedforward DNN to fMRI volumes collected during four sensorimotor tasks (i.e., left-hand clenching, right-hand clenching, auditory attention, and visual stimulus) undertaken by 12 healthy participants. Using a leave-one-subject-out cross-validation scheme, a restricted Boltzmann machine-based deep belief network was pretrained and used to initialize the weights of the DNN. The pretrained DNN was fine-tuned while systematically controlling weight-sparsity levels across the hidden layers. Optimal weight-sparsity levels were determined from the minimum validation error rate of fMRI volume classification. A minimum error rate (mean ± standard deviation; %) of 6.9 ± 3.8 was obtained from the three-layer DNN with the sparsest condition of weights across the three hidden layers. This error rate was lower than the error rates from the single-layer network (9.4 ± 4.6) and the two-layer network (7.4 ± 4.1). The estimated DNN weights showed spatial patterns that were remarkably task-specific, particularly in the higher layers. The output values of the third hidden layer represented distinct patterns/codes of the 3D whole-brain fMRI volume and encoded the information of the tasks, as evaluated by representational similarity analysis. Our findings show the ability of the DNN to classify a single fMRI volume based on the extraction of hidden representations of fMRI volumes associated with the tasks across multiple hidden layers. Our study may be beneficial to the automatic classification/diagnosis of neuropsychiatric and neurological diseases and the prediction of disease severity and recovery in (pre-)clinical settings using fMRI volumes, without requiring an estimation of activation patterns or ad hoc statistical evaluation. © 2016 Elsevier Inc. All rights reserved. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | ACADEMIC PRESS INC ELSEVIER SCIENCE | - |
dc.subject | INDEPENDENT COMPONENT ANALYSIS | - |
dc.subject | HUMAN BRAIN | - |
dc.subject | FUNCTIONAL CONNECTIVITY | - |
dc.subject | NATURAL IMAGES | - |
dc.subject | RECONSTRUCTION | - |
dc.subject | REPRESENTATIONS | - |
dc.subject | PERFORMANCE | - |
dc.subject | MACHINES | - |
dc.subject | PATTERNS | - |
dc.subject | SUBJECT | - |
dc.title | Task-specific feature extraction and classification of fMRI volumes using a deep neural network initialized with a deep belief network: Evaluation using sensorimotor tasks | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Lee, Jong-Hwan | - |
dc.identifier.doi | 10.1016/j.neuroimage.2016.04.003 | - |
dc.identifier.scopusid | 2-s2.0-85006868220 | - |
dc.identifier.wosid | 000390976200016 | - |
dc.identifier.bibliographicCitation | NEUROIMAGE, v.145, pp.314 - 328 | - |
dc.relation.isPartOf | NEUROIMAGE | - |
dc.citation.title | NEUROIMAGE | - |
dc.citation.volume | 145 | - |
dc.citation.startPage | 314 | - |
dc.citation.endPage | 328 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Neurosciences & Neurology | - |
dc.relation.journalResearchArea | Radiology, Nuclear Medicine & Medical Imaging | - |
dc.relation.journalWebOfScienceCategory | Neurosciences | - |
dc.relation.journalWebOfScienceCategory | Neuroimaging | - |
dc.relation.journalWebOfScienceCategory | Radiology, Nuclear Medicine & Medical Imaging | - |
dc.subject.keywordPlus | INDEPENDENT COMPONENT ANALYSIS | - |
dc.subject.keywordPlus | HUMAN BRAIN | - |
dc.subject.keywordPlus | FUNCTIONAL CONNECTIVITY | - |
dc.subject.keywordPlus | NATURAL IMAGES | - |
dc.subject.keywordPlus | RECONSTRUCTION | - |
dc.subject.keywordPlus | REPRESENTATIONS | - |
dc.subject.keywordPlus | PERFORMANCE | - |
dc.subject.keywordPlus | MACHINES | - |
dc.subject.keywordPlus | PATTERNS | - |
dc.subject.keywordPlus | SUBJECT | - |
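The training pipeline described in the abstract — greedy layer-wise pretraining of RBMs to form a deep belief network, using the pretrained weights to initialize a DNN, then fine-tuning with a sparsity constraint — can be sketched in miniature. The toy binary data, layer sizes (40→16→8), CD-1 learning rates, and the L1 penalty standing in for the paper's weight-sparsity control are all illustrative assumptions, not the authors' actual architecture or fMRI data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Bernoulli RBM trained with one-step contrastive divergence (CD-1)."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0.0, 0.01, (n_visible, n_hidden))
        self.b_h = np.zeros(n_hidden)
        self.b_v = np.zeros(n_visible)

    def train(self, data, lr=0.05, epochs=20):
        for _ in range(epochs):
            # Positive phase: hidden-unit probabilities given the data
            h_prob = sigmoid(data @ self.W + self.b_h)
            h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
            # Negative phase: one Gibbs step back to visible, then hidden
            v_recon = sigmoid(h_sample @ self.W.T + self.b_v)
            h_recon = sigmoid(v_recon @ self.W + self.b_h)
            # CD-1 updates: data statistics minus reconstruction statistics
            self.W += lr * (data.T @ h_prob - v_recon.T @ h_recon) / len(data)
            self.b_h += lr * (h_prob - h_recon).mean(axis=0)
            self.b_v += lr * (data - v_recon).mean(axis=0)

    def transform(self, data):
        return sigmoid(data @ self.W + self.b_h)

# Toy stand-in for vectorized fMRI volumes: 40-dim binary patterns, 2 classes
X = (rng.random((100, 40)) < 0.3).astype(float)
y = (X[:, :20].sum(axis=1) > X[:, 20:].sum(axis=1)).astype(int)

# Greedy layer-wise DBN pretraining: each RBM models the previous layer's output
rbm1, rbm2 = RBM(40, 16), RBM(16, 8)
rbm1.train(X)
H1 = rbm1.transform(X)
rbm2.train(H1)
H2 = rbm2.transform(H1)

# Fine-tune a softmax output layer on top of the pretrained stack; the L1
# term is a simple proxy for the paper's explicit weight-sparsity control
W_out = np.zeros((8, 2))
for _ in range(200):
    logits = H2 @ W_out
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = H2.T @ (p - np.eye(2)[y]) / len(y) + 0.001 * np.sign(W_out)
    W_out -= 0.5 * grad

acc = ((H2 @ W_out).argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

In the paper the entire pretrained stack is fine-tuned end to end with layer-wise sparsity levels selected by validation error under leave-one-subject-out cross-validation; here only the output layer is trained, to keep the sketch short.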
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.