
Attentional feature pyramid network for small object detection

Full metadata record
dc.contributor.author: Min, Kyungseo
dc.contributor.author: Lee, Gun-Hee
dc.contributor.author: Lee, Seong-Whan
dc.date.accessioned: 2022-12-08T20:41:49Z
dc.date.available: 2022-12-08T20:41:49Z
dc.date.created: 2022-12-08
dc.date.issued: 2022-11
dc.identifier.issn: 0893-6080
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/146516
dc.description.abstract: Recent state-of-the-art detectors generally exploit Feature Pyramid Networks (FPN) for their advantage in detecting objects at different scales. Despite significant advances in object detection owing to the design of feature pyramids, it is still challenging to detect small objects with low resolution and dense distribution in complex scenes. To address these problems, we propose the Attentional Feature Pyramid Network (AFPN), a new feature pyramid architecture consisting of three components that enhance small object detection: Dynamic Texture Attention, Foreground-Aware Co-Attention, and Detail Context Attention. First, Dynamic Texture Attention augments texture features dynamically by filtering out redundant semantics to highlight small objects in lower layers and amplifying credible details to emphasize large objects in higher layers. Then, Foreground-Aware Co-Attention detects densely arranged small objects by enhancing object features via foreground-correlated contexts and suppressing background noise. Finally, to better capture the features of small objects, Detail Context Attention adaptively aggregates detail cues of RoI features at different scales for a more accurate feature representation. By substituting AFPN for FPN in Faster R-CNN, our method performs on par with the state of the art on Tsinghua-Tencent 100K. Furthermore, we achieve highly competitive results on the small-object categories of both PASCAL VOC and MS COCO. (c) 2022 Elsevier Ltd. All rights reserved.
dc.language: English
dc.language.iso: en
dc.publisher: PERGAMON-ELSEVIER SCIENCE LTD
dc.subject: RECOGNITION
dc.title: Attentional feature pyramid network for small object detection
dc.type: Article
dc.contributor.affiliatedAuthor: Lee, Seong-Whan
dc.identifier.doi: 10.1016/j.neunet.2022.08.029
dc.identifier.scopusid: 2-s2.0-85138203026
dc.identifier.wosid: 000865423600013
dc.identifier.bibliographicCitation: NEURAL NETWORKS, v.155, pp.439-450
dc.relation.isPartOf: NEURAL NETWORKS
dc.citation.title: NEURAL NETWORKS
dc.citation.volume: 155
dc.citation.startPage: 439
dc.citation.endPage: 450
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Neurosciences & Neurology
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Neurosciences
dc.subject.keywordPlus: RECOGNITION
dc.subject.keywordAuthor: Object detection
dc.subject.keywordAuthor: Small object detection
dc.subject.keywordAuthor: Feature pyramid network
dc.subject.keywordAuthor: Attention mechanism
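
The abstract describes attention modules that re-weight pyramid features to emphasize informative channels and suppress background noise. As a rough illustration of that general idea only (not the paper's actual AFPN modules, whose details are not given here), a squeeze-and-excitation-style channel gate can be sketched in NumPy; the weight matrix `w` is a hypothetical stand-in for a learned 1x1 gating layer:

```python
import numpy as np

def channel_attention(features, w):
    """Re-weight feature channels by a gated global descriptor.

    features: (C, H, W) feature map from one pyramid level.
    w:        (C, C) weights of a hypothetical 1x1 gating layer.
    """
    # Squeeze: global average pool over the spatial dims -> (C,)
    descriptor = features.mean(axis=(1, 2))
    # Excite: linear projection followed by a sigmoid gate in (0, 1)
    gate = 1.0 / (1.0 + np.exp(-(w @ descriptor)))
    # Scale each channel of the feature map by its gate value
    return features * gate[:, None, None]

rng = np.random.default_rng(0)
feats = rng.standard_normal((8, 16, 16))   # toy 8-channel feature map
w = rng.standard_normal((8, 8)) * 0.1
out = channel_attention(feats, w)
print(out.shape)  # (8, 16, 16)
```

Because the gate lies strictly in (0, 1), the module can only attenuate channels, never amplify them; real detection attention blocks typically combine such gates with residual connections so that useful signal is preserved.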
Files in This Item
There are no files associated with this item.
Appears in Collections
Graduate School > Department of Artificial Intelligence > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Seong-Whan
Department of Artificial Intelligence
