FCN Based Label Correction for Multi-Atlas Guided Organ Segmentation
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Zhu, Hancan | - |
dc.contributor.author | Adeli, Ehsan | - |
dc.contributor.author | Shi, Feng | - |
dc.contributor.author | Shen, Dinggang | - |
dc.date.accessioned | 2021-08-31T04:55:04Z | - |
dc.date.available | 2021-08-31T04:55:04Z | - |
dc.date.created | 2021-06-18 | - |
dc.date.issued | 2020-04 | - |
dc.identifier.issn | 1539-2791 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/56828 | - |
dc.description.abstract | Segmentation of medical images using multiple atlases has recently gained immense attention due to its robustness against variability across different subjects. These atlas-based methods typically comprise three steps: atlas selection, image registration, and finally label fusion. Image registration is one of the core steps in this process, and its accuracy directly affects the final labeling performance. However, due to inter-subject anatomical variations, registration errors are inevitable. The aim of this paper is to develop a deep learning-based confidence estimation method to alleviate the potential effects of registration errors. We first propose a fully convolutional network (FCN) with residual connections to learn the relationship between an image patch pair (i.e., patches from the target subject and the atlas) and the related label confidence patch. With the obtained label confidence patch, we can identify the potential errors in the warped atlas labels and correct them. Then, we use two label fusion methods to fuse the corrected atlas labels. The proposed methods are validated on a publicly available dataset for hippocampus segmentation. Experimental results demonstrate that our proposed methods outperform the state-of-the-art segmentation methods. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | HUMANA PRESS INC | - |
dc.subject | SPATIALLY VARYING PERFORMANCE | - |
dc.subject | IMAGE SEGMENTATION | - |
dc.subject | HIPPOCAMPAL SEGMENTATION | - |
dc.subject | FUSION | - |
dc.subject | REGISTRATION | - |
dc.subject | STRATEGIES | - |
dc.subject | PARAMETERS | - |
dc.subject | SELECTION | - |
dc.subject | MODEL | - |
dc.subject | TRUTH | - |
dc.title | FCN Based Label Correction for Multi-Atlas Guided Organ Segmentation | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Shen, Dinggang | - |
dc.identifier.doi | 10.1007/s12021-019-09448-5 | - |
dc.identifier.wosid | 000505327800001 | - |
dc.identifier.bibliographicCitation | NEUROINFORMATICS, v.18, no.2, pp.319 - 331 | - |
dc.relation.isPartOf | NEUROINFORMATICS | - |
dc.citation.title | NEUROINFORMATICS | - |
dc.citation.volume | 18 | - |
dc.citation.number | 2 | - |
dc.citation.startPage | 319 | - |
dc.citation.endPage | 331 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Computer Science | - |
dc.relation.journalResearchArea | Neurosciences & Neurology | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Interdisciplinary Applications | - |
dc.relation.journalWebOfScienceCategory | Neurosciences | - |
dc.subject.keywordPlus | SPATIALLY VARYING PERFORMANCE | - |
dc.subject.keywordPlus | IMAGE SEGMENTATION | - |
dc.subject.keywordPlus | HIPPOCAMPAL SEGMENTATION | - |
dc.subject.keywordPlus | FUSION | - |
dc.subject.keywordPlus | REGISTRATION | - |
dc.subject.keywordPlus | STRATEGIES | - |
dc.subject.keywordPlus | PARAMETERS | - |
dc.subject.keywordPlus | SELECTION | - |
dc.subject.keywordPlus | MODEL | - |
dc.subject.keywordPlus | TRUTH | - |
dc.subject.keywordAuthor | Multi-atlas image segmentation | - |
dc.subject.keywordAuthor | Label fusion | - |
dc.subject.keywordAuthor | Fully convolutional network | - |
dc.subject.keywordAuthor | Deep learning | - |
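The abstract's pipeline (per-atlas confidence estimation → label correction → label fusion) can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's actual FCN-based method: the function name `confidence_weighted_fusion`, the 0.5 flip/decision thresholds, and the normalized-weight voting scheme are all illustrative assumptions, and the confidence maps are taken as given rather than predicted by a network.

```python
import numpy as np

def confidence_weighted_fusion(warped_labels, confidences, eps=1e-8):
    """Toy stand-in for FCN-guided label correction and fusion.

    warped_labels: (n_atlases, H, W) binary labels warped from each atlas.
    confidences:   (n_atlases, H, W) values in [0, 1]; in the paper these
                   would come from the trained FCN, here they are inputs.
    Returns a fused (H, W) binary segmentation.
    """
    labels = np.asarray(warped_labels, dtype=float)
    conf = np.asarray(confidences, dtype=float)

    # "Correction" step (simplified): flip a voxel's atlas label where the
    # confidence in that label is below 0.5.
    corrected = np.where(conf >= 0.5, labels, 1.0 - labels)

    # Confidence-weighted voting across atlases, normalized per voxel.
    weights = conf / (conf.sum(axis=0, keepdims=True) + eps)
    prob = (weights * corrected).sum(axis=0)
    return (prob >= 0.5).astype(np.uint8)
```

With uniform confidences this reduces to plain majority voting; low-confidence voxels are first flipped and then down-weighted in the vote.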