Detailed Information

Cited 0 times in Web of Science; cited 0 times in Scopus

Hierarchical multi-atlas label fusion with multi-scale feature representation and label-specific patch partition

Full metadata record
dc.contributor.author: Wu, Guorong
dc.contributor.author: Kim, Minjeong
dc.contributor.author: Sanroma, Gerard
dc.contributor.author: Wang, Qian
dc.contributor.author: Munsell, Brent C.
dc.contributor.author: Shen, Dinggang
dc.date.accessioned: 2021-09-04T19:27:46Z
dc.date.available: 2021-09-04T19:27:46Z
dc.date.created: 2021-06-15
dc.date.issued: 2015-02-01
dc.identifier.issn: 1053-8119
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/94442
dc.description.abstract: Multi-atlas patch-based label fusion methods have been successfully used to improve segmentation accuracy in many important medical image analysis applications. In general, to achieve label fusion, a single target image is first registered to several atlas images. After registration, a label is assigned to each target point in the target image by determining the similarity between the underlying target image patch (centered at the target point) and the aligned image patch in each atlas image. To achieve the highest level of accuracy during the label fusion process, it is critical for the chosen patch similarity measurement to accurately capture the tissue/shape appearance of the anatomical structure. One major limitation of existing state-of-the-art label fusion methods is that they often apply a fixed-size image patch throughout the entire label fusion procedure. Doing so may severely affect the fidelity of the patch similarity measurement, which in turn may not adequately capture the complex tissue appearance patterns expressed by the anatomical structure. To address this limitation, we advance the state of the art with three new label fusion contributions. First, each image patch is now characterized by a multi-scale feature representation that encodes both local and semi-local image information, which increases the accuracy of the patch-based similarity measurement. Second, to limit the possibility of the patch-based similarity measurement being wrongly guided by the presence of multiple anatomical structures in the same image patch, each atlas image patch is further partitioned into a set of label-specific partial image patches according to the existing labels. Since the image information has now been semantically divided into different patterns, these new label-specific atlas patches make the label fusion process more specific and flexible. Lastly, in order to correct target points that are mislabeled during label fusion, a hierarchical approach is used to improve the label fusion results; in particular, a coarse-to-fine iterative label fusion approach is used that gradually reduces the patch size. To evaluate the accuracy of our label fusion approach, the proposed method was used to segment the hippocampus in the ADNI dataset and in 7.0 T MR images, sub-cortical regions in the LONI LPBA40 dataset, mid-brain regions in the SATA dataset from the MICCAI 2013 segmentation challenge, and a set of key internal gray matter structures in the IXI dataset. In all experiments, the segmentation results of the proposed hierarchical label fusion method with multi-scale feature representations and label-specific atlas patches are more accurate than those of several well-known state-of-the-art label fusion methods. (C) 2014 Elsevier Inc. All rights reserved.
dc.language: English
dc.language.iso: en
dc.publisher: ACADEMIC PRESS INC ELSEVIER SCIENCE
dc.subject: PROBABILISTIC ATLAS
dc.subject: SEGMENTATION
dc.subject: BRAIN
dc.subject: HIPPOCAMPUS
dc.subject: CLASSIFICATION
dc.subject: PERFORMANCE
dc.subject: SELECTION
dc.subject: IMAGES
dc.subject: MODEL
dc.subject: TRUTH
dc.title: Hierarchical multi-atlas label fusion with multi-scale feature representation and label-specific patch partition
dc.type: Article
dc.contributor.affiliatedAuthor: Shen, Dinggang
dc.identifier.doi: 10.1016/j.neuroimage.2014.11.025
dc.identifier.scopusid: 2-s2.0-84912106417
dc.identifier.wosid: 000347101900004
dc.identifier.bibliographicCitation: NEUROIMAGE, v.106, pp.34 - 46
dc.relation.isPartOf: NEUROIMAGE
dc.citation.title: NEUROIMAGE
dc.citation.volume: 106
dc.citation.startPage: 34
dc.citation.endPage: 46
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Neurosciences & Neurology
dc.relation.journalResearchArea: Radiology, Nuclear Medicine & Medical Imaging
dc.relation.journalWebOfScienceCategory: Neurosciences
dc.relation.journalWebOfScienceCategory: Neuroimaging
dc.relation.journalWebOfScienceCategory: Radiology, Nuclear Medicine & Medical Imaging
dc.subject.keywordPlus: PROBABILISTIC ATLAS
dc.subject.keywordPlus: SEGMENTATION
dc.subject.keywordPlus: BRAIN
dc.subject.keywordPlus: HIPPOCAMPUS
dc.subject.keywordPlus: CLASSIFICATION
dc.subject.keywordPlus: PERFORMANCE
dc.subject.keywordPlus: SELECTION
dc.subject.keywordPlus: IMAGES
dc.subject.keywordPlus: MODEL
dc.subject.keywordPlus: TRUTH
dc.subject.keywordAuthor: Patch-based labeling
dc.subject.keywordAuthor: Multi-atlas based segmentation
dc.subject.keywordAuthor: Multi-scale feature representation
dc.subject.keywordAuthor: Label-specific patch partition
dc.subject.keywordAuthor: Sparse representation
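As a rough illustration of the patch-based label fusion idea described in the abstract, the sketch below implements plain similarity-weighted voting in Python. It is a simplified baseline only: the function name `fuse_labels`, the Gaussian weighting, and the `sigma` parameter are illustrative assumptions, not the paper's hierarchical multi-scale, label-specific method.

```python
import numpy as np

def fuse_labels(target_patch, atlas_patches, atlas_labels, sigma=1.0):
    """Assign a label to the target point by similarity-weighted voting:
    each atlas patch votes for its label with a Gaussian weight derived
    from its squared intensity distance to the target patch."""
    votes = {}
    for patch, label in zip(atlas_patches, atlas_labels):
        dist2 = float(np.sum((target_patch - patch) ** 2))
        weight = np.exp(-dist2 / (2.0 * sigma ** 2))
        votes[label] = votes.get(label, 0.0) + weight
    # The fused label is the one with the largest accumulated weight.
    return max(votes, key=votes.get)

# Toy example: two atlas patches resemble the target and carry label 1,
# one dissimilar patch carries label 0, so the fused label is 1.
target = np.ones((3, 3))
atlas_patches = [np.ones((3, 3)), 0.9 * np.ones((3, 3)), np.zeros((3, 3))]
atlas_labels = [1, 1, 0]
print(fuse_labels(target, atlas_patches, atlas_labels))  # prints 1
```

The paper's contributions (multi-scale feature representations, label-specific patch partition, and coarse-to-fine reduction of the patch size) can all be read as refinements of this basic weighted-voting scheme.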
Files in This Item
There are no files associated with this item.
Appears in Collections
Graduate School > Department of Artificial Intelligence > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
