Detailed Information

Cited 0 times in Web of Science · Cited 0 times in Scopus

B-HMAX: A fast binary biologically inspired model for object recognition

Full metadata record
dc.contributor.author: Zhang, Hua-Zhen
dc.contributor.author: Lu, Yan-Feng
dc.contributor.author: Kang, Tae-Koo
dc.contributor.author: Lim, Myo-Taeg
dc.date.accessioned: 2021-09-03T15:45:13Z
dc.date.available: 2021-09-03T15:45:13Z
dc.date.created: 2021-06-16
dc.date.issued: 2016-12-19
dc.identifier.issn: 0925-2312
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/86522
dc.description.abstract: The biologically inspired model, Hierarchical Model and X (HMAX), has excellent performance in object categorization. It consists of four layers of computational units based on the mechanisms of the visual cortex. However, the random patch selection in HMAX often leads to mismatches because it extracts redundant information, and recognition is computationally expensive because of the Euclidean distance calculations used for similarity in the third layer, S2. To address these limitations, we propose a fast binary-based HMAX model (B-HMAX). In the proposed method, we detect corner-based interest points after the second layer, C1, to extract a small number of distinctive features; we describe the image patches extracted around the detected corners with binary strings; and we match patches in the third layer, S2, using the Hamming distance, which is much faster than Euclidean distance calculations. The experimental results demonstrate that the proposed B-HMAX model reduces the total processing time per image by almost 80% while keeping accuracy competitive with the standard HMAX. © 2016 Elsevier B.V. All rights reserved.
dc.language: English
dc.language.iso: en
dc.publisher: ELSEVIER
dc.subject: LOCAL FEATURES
dc.subject: CLASSIFICATION
dc.subject: TEXTURE
dc.title: B-HMAX: A fast binary biologically inspired model for object recognition
dc.type: Article
dc.contributor.affiliatedAuthor: Lim, Myo-Taeg
dc.identifier.doi: 10.1016/j.neucom.2016.08.051
dc.identifier.scopusid: 2-s2.0-84994130278
dc.identifier.wosid: 000388053700026
dc.identifier.bibliographicCitation: NEUROCOMPUTING, v.218, pp.242 - 250
dc.relation.isPartOf: NEUROCOMPUTING
dc.citation.title: NEUROCOMPUTING
dc.citation.volume: 218
dc.citation.startPage: 242
dc.citation.endPage: 250
dc.type.rims: ART
dc.type.docType: Article
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.subject.keywordPlus: LOCAL FEATURES
dc.subject.keywordPlus: CLASSIFICATION
dc.subject.keywordPlus: TEXTURE
dc.subject.keywordAuthor: Object recognition
dc.subject.keywordAuthor: Classification
dc.subject.keywordAuthor: HMAX
dc.subject.keywordAuthor: Binary descriptor
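The speed-up described in the abstract comes from replacing Euclidean distance with Hamming distance on binary patch descriptors. A minimal illustrative sketch of that matching step (not the paper's implementation; the descriptor length and function names here are assumptions for illustration):

```python
import numpy as np

def hamming_distance(a: np.ndarray, b: np.ndarray) -> int:
    """Hamming distance between two binary descriptors (0/1 arrays):
    the number of positions where the bits differ."""
    return int(np.count_nonzero(a != b))

def match_patch(query: np.ndarray, candidates: list) -> int:
    """Return the index of the candidate descriptor nearest to the
    query under Hamming distance (nearest-neighbour matching)."""
    dists = [hamming_distance(query, c) for c in candidates]
    return int(np.argmin(dists))

# Toy 8-bit descriptors; real binary descriptors are typically
# 128-512 bits long.
q = np.array([1, 0, 1, 1, 0, 0, 1, 0])
cands = [
    np.array([1, 0, 1, 0, 0, 0, 1, 0]),  # differs from q in 1 bit
    np.array([0, 1, 0, 1, 1, 1, 0, 1]),  # differs from q in 7 bits
]
print(match_patch(q, cands))  # → 0
```

In practice binary descriptors are packed into machine words, so the distance reduces to an XOR followed by a popcount, which is why it is far cheaper than a floating-point Euclidean distance over the same patch.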
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Engineering > School of Electrical Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lim, Myo-Taeg
College of Engineering (School of Electrical Engineering)
