Detailed Information

Cited 0 times in Web of Science; cited 0 times in Scopus

Sparse Feature Convolutional Neural Network with Cluster Max Extraction for Fast Object Classification

Full metadata record
DC Field: Value
dc.contributor.author: Kim, Sung Hee
dc.contributor.author: Pae, Dong Sung
dc.contributor.author: Kang, Tae-Koo
dc.contributor.author: Kim, Dong W.
dc.contributor.author: Lim, Myo Taeg
dc.date.accessioned: 2021-09-02T04:55:48Z
dc.date.available: 2021-09-02T04:55:48Z
dc.date.created: 2021-06-18
dc.date.issued: 2018-11
dc.identifier.issn: 1975-0102
dc.identifier.uri: https://scholar.korea.ac.kr/handle/2021.sw.korea/72394
dc.description.abstract: We propose the Sparse Feature Convolutional Neural Network (SFCNN) to reduce the volume of convolutional neural networks (CNNs). Despite the superior classification performance of CNNs, their enormous network volume requires high computational cost and long processing time, making real-time applications such as online training difficult. We propose an advanced network that reduces the volume of conventional CNNs by producing a region-based sparse feature map. To produce the sparse feature map, two complementary region-based value extraction methods, cluster max extraction and local value extraction, are proposed. Cluster max is selected as the main function based on experimental results. To evaluate SFCNN, we conduct an experiment with two conventional CNNs. The network trains 59 times faster and tests 81 times faster than the VGG network, with a 1.2% loss of accuracy in multi-class classification using the Caltech101 dataset. In vehicle classification using the GTI Vehicle Image Database, the network trains 88 times faster and tests 94 times faster than the conventional CNNs, with a 0.1% loss of accuracy.
dc.language: English
dc.language.iso: en
dc.publisher: SPRINGER SINGAPORE PTE LTD
dc.subject: RECOGNITION
dc.title: Sparse Feature Convolutional Neural Network with Cluster Max Extraction for Fast Object Classification
dc.type: Article
dc.contributor.affiliatedAuthor: Lim, Myo Taeg
dc.identifier.doi: 10.5370/JEET.2018.13.6.2468
dc.identifier.scopusid: 2-s2.0-85055627228
dc.identifier.wosid: 000447673000037
dc.identifier.bibliographicCitation: JOURNAL OF ELECTRICAL ENGINEERING & TECHNOLOGY, v.13, no.6, pp.2468 - 2478
dc.relation.isPartOf: JOURNAL OF ELECTRICAL ENGINEERING & TECHNOLOGY
dc.citation.title: JOURNAL OF ELECTRICAL ENGINEERING & TECHNOLOGY
dc.citation.volume: 13
dc.citation.number: 6
dc.citation.startPage: 2468
dc.citation.endPage: 2478
dc.type.rims: ART
dc.type.docType: Article
dc.identifier.kciid: ART002402287
dc.description.journalClass: 1
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.description.journalRegisteredClass: kci
dc.relation.journalResearchArea: Engineering
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.subject.keywordPlus: RECOGNITION
dc.subject.keywordAuthor: Deep learning
dc.subject.keywordAuthor: Online-training control
dc.subject.keywordAuthor: Object recognition
dc.subject.keywordAuthor: Classification
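The cluster max extraction described in the abstract can be sketched as follows. This is a minimal illustration only, assuming a fixed grid partition of the feature map; the paper's actual clustering method and parameters are not given here, and the function and `region_size` names are assumptions, not the authors' implementation.

```python
import numpy as np

def cluster_max_extraction(feature_map, region_size=4):
    """Keep only the maximum activation in each region, zeroing the rest.

    Produces a region-based sparse feature map. The grid partition used
    here is an illustrative assumption, not the paper's clustering.
    """
    h, w = feature_map.shape
    sparse = np.zeros_like(feature_map)
    for i in range(0, h, region_size):
        for j in range(0, w, region_size):
            region = feature_map[i:i + region_size, j:j + region_size]
            # locate the region's maximum activation and keep only it
            r, c = np.unravel_index(np.argmax(region), region.shape)
            sparse[i + r, j + c] = region[r, c]
    return sparse

# Example: an 8x8 map with 4x4 regions keeps one value per region.
fmap = np.arange(64, dtype=float).reshape(8, 8)
sparse = cluster_max_extraction(fmap, region_size=4)
print(np.count_nonzero(sparse))  # 4 regions -> 4 nonzero entries
```

Keeping a single value per region is what makes the resulting feature map sparse, which is the stated source of the training and testing speedups.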
Files in This Item
There are no files associated with this item.
Appears in Collections: College of Engineering > School of Electrical Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lim, Myo Taeg
College of Engineering (School of Electrical and Electronic Engineering)
