Multiatlas-Based Segmentation Editing With Interaction-Guided Patch Selection and Label Fusion
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Park, Sang Hyun | - |
dc.contributor.author | Gao, Yaozong | - |
dc.contributor.author | Shen, Dinggang | - |
dc.date.accessioned | 2021-09-03T23:20:06Z | - |
dc.date.available | 2021-09-03T23:20:06Z | - |
dc.date.created | 2021-06-18 | - |
dc.date.issued | 2016-06 | - |
dc.identifier.issn | 0018-9294 | - |
dc.identifier.uri | https://scholar.korea.ac.kr/handle/2021.sw.korea/88478 | - |
dc.description.abstract | We propose a novel multiatlas-based segmentation method to address the segmentation editing scenario, where an incomplete segmentation is given along with a set of existing reference label images (used as atlases). Unlike previous multiatlas-based methods, which depend solely on appearance features, we incorporate interaction-guided constraints to find appropriate atlas label patches in the reference label set and derive their weights for label fusion. Specifically, user interactions provided on the erroneous parts are first divided into multiple local combinations. For each combination, the atlas label patches well-matched with both the interactions and the previous segmentation are identified. The segmentation is then updated through voxelwise label fusion of the selected atlas label patches, with their weights derived from the distances of each underlying voxel to the interactions. Since atlas label patches matched with different local combinations are used in the fusion step, our method can account for various local shape variations during the segmentation update, even with only limited atlas label images and user interactions. Moreover, since our method depends on neither image appearance nor sophisticated learning steps, it can be easily applied to general editing problems. To demonstrate its generality, we apply it to editing segmentations of the CT prostate, CT brainstem, and MR hippocampus. Experimental results show that our method outperforms existing editing methods on all three datasets. | - |
dc.language | English | - |
dc.language.iso | en | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.subject | IMAGE SEGMENTATION | - |
dc.subject | MR-IMAGES | - |
dc.subject | PROSTATE SEGMENTATION | - |
dc.subject | REGISTRATION | - |
dc.subject | HIPPOCAMPUS | - |
dc.subject | EFFICIENT | - |
dc.title | Multiatlas-Based Segmentation Editing With Interaction-Guided Patch Selection and Label Fusion | - |
dc.type | Article | - |
dc.contributor.affiliatedAuthor | Shen, Dinggang | - |
dc.identifier.doi | 10.1109/TBME.2015.2491612 | - |
dc.identifier.scopusid | 2-s2.0-84976420232 | - |
dc.identifier.wosid | 000377045500013 | - |
dc.identifier.bibliographicCitation | IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING, v.63, no.6, pp.1208 - 1219 | - |
dc.relation.isPartOf | IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING | - |
dc.citation.title | IEEE TRANSACTIONS ON BIOMEDICAL ENGINEERING | - |
dc.citation.volume | 63 | - |
dc.citation.number | 6 | - |
dc.citation.startPage | 1208 | - |
dc.citation.endPage | 1219 | - |
dc.type.rims | ART | - |
dc.type.docType | Article | - |
dc.description.journalClass | 1 | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.relation.journalResearchArea | Engineering | - |
dc.relation.journalWebOfScienceCategory | Engineering, Biomedical | - |
dc.subject.keywordPlus | IMAGE SEGMENTATION | - |
dc.subject.keywordPlus | MR-IMAGES | - |
dc.subject.keywordPlus | PROSTATE SEGMENTATION | - |
dc.subject.keywordPlus | REGISTRATION | - |
dc.subject.keywordPlus | HIPPOCAMPUS | - |
dc.subject.keywordPlus | EFFICIENT | - |
dc.subject.keywordAuthor | Distance-based voting | - |
dc.subject.keywordAuthor | interaction-guided editing | - |
dc.subject.keywordAuthor | label fusion | - |
dc.subject.keywordAuthor | segmentation editing | - |
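The abstract's core mechanism, voxelwise label fusion with weights derived from each voxel's distance to the user interactions, can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the Gaussian distance weighting, and representing each selected atlas label patch by the coordinate of the interaction it was matched to are all simplifying assumptions made here for illustration.

```python
import numpy as np

def fuse_label_patches(patches, centers, shape, sigma=3.0):
    """Distance-weighted voxelwise label fusion (illustrative sketch).

    patches : (N, D, H, W) binary atlas label patches, already aligned
              to the target volume (an assumption of this sketch)
    centers : (N, 3) voxel coordinates of the interaction each patch
              was matched to (hypothetical simplified representation)
    Each patch votes with weight exp(-||v - c||^2 / (2*sigma^2)), so a
    patch dominates the fused label near "its" interaction and fades
    with distance, mimicking the distance-based voting in the abstract.
    """
    patches = np.asarray(patches, dtype=float)
    # coordinate grid of the target volume: (D, H, W, 3)
    grid = np.stack(np.meshgrid(*[np.arange(s) for s in shape],
                                indexing="ij"), axis=-1)
    foreground = np.zeros(shape)
    total = np.zeros(shape)
    for patch, c in zip(patches, centers):
        d2 = ((grid - np.asarray(c)) ** 2).sum(-1)   # squared distance map
        w = np.exp(-d2 / (2 * sigma ** 2))           # distance-based weight
        foreground += w * patch
        total += w
    # weighted majority vote per voxel
    return (foreground / np.maximum(total, 1e-12) >= 0.5).astype(np.uint8)
```

With two conflicting patches anchored at opposite corners, the fused result follows the all-foreground patch near its interaction and the all-background patch near the other, which is the intended locality of the distance-based voting.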
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
Certain data included herein are derived from the © Web of Science of Clarivate Analytics. All rights reserved.
You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.