Sparse Feature Convolutional Neural Network with Cluster Max Extraction for Fast Object Classification
- Authors
- Kim, Sung Hee; Pae, Dong Sung; Kang, Tae-Koo; Kim, Dong W.; Lim, Myo Taeg
- Issue Date
- November 2018
- Publisher
- SPRINGER SINGAPORE PTE LTD
- Keywords
- Deep learning; Online-training control; Object recognition; Classification
- Citation
- JOURNAL OF ELECTRICAL ENGINEERING & TECHNOLOGY, v.13, no.6, pp.2468 - 2478
- Indexed
- SCIE; SCOPUS; KCI
- Journal Title
- JOURNAL OF ELECTRICAL ENGINEERING & TECHNOLOGY
- Volume
- 13
- Number
- 6
- Start Page
- 2468
- End Page
- 2478
- URI
- https://scholar.korea.ac.kr/handle/2021.sw.korea/72394
- DOI
- 10.5370/JEET.2018.13.6.2468
- ISSN
- 1975-0102
- Abstract
- We propose the Sparse Feature Convolutional Neural Network (SFCNN) to reduce the volume of convolutional neural networks (CNNs). Despite the superior classification performance of CNNs, their enormous network volume requires high computational cost and long processing time, making real-time applications such as online training difficult. We propose an advanced network that reduces the volume of conventional CNNs by producing a region-based sparse feature map. To produce the sparse feature map, two complementary region-based value extraction methods, cluster max extraction and local value extraction, are proposed. Cluster max extraction is selected as the main method based on experimental results. To evaluate SFCNN, we conduct an experiment with two conventional CNNs. The network trains 59 times faster and tests 81 times faster than the VGG network, with a 1.2% loss of accuracy in multi-class classification using the Caltech101 dataset. In vehicle classification using the GTI Vehicle Image Database, the network trains 88 times faster and tests 94 times faster than the conventional CNNs, with a 0.1% loss of accuracy.
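The abstract does not reproduce the full algorithm, so the following is only a minimal sketch of what a cluster-max style sparsification of a 2D feature map could look like: threshold the activations, group the surviving values into connected clusters, and keep only the maximum of each cluster. The function name `cluster_max_extraction` and the `threshold` parameter are illustrative assumptions, not definitions from the paper; `scipy.ndimage.label` is used here simply as a convenient connected-component routine.

```python
import numpy as np
from scipy import ndimage

def cluster_max_extraction(feature_map, threshold=0.0):
    """Sketch of a cluster-max style sparsification of a 2D feature map.

    Activations at or below `threshold` are dropped; each connected cluster
    of the remaining activations keeps only its maximum value, yielding a
    region-based sparse feature map. (`threshold` is an illustrative
    parameter, not taken from the paper.)
    """
    mask = feature_map > threshold
    labels, n_clusters = ndimage.label(mask)  # connected clusters of active cells
    sparse = np.zeros_like(feature_map)
    for k in range(1, n_clusters + 1):
        region = labels == k
        # locate the cluster's maximum activation and keep only that value
        idx = np.unravel_index(
            np.argmax(np.where(region, feature_map, -np.inf)), feature_map.shape
        )
        sparse[idx] = feature_map[idx]
    return sparse

# Toy example: a 4x4 activation map with two separated clusters;
# only the per-cluster maxima (0.9 and 0.8) survive in the output.
fmap = np.array([[0.1, 0.9, 0.0, 0.0],
                 [0.2, 0.7, 0.0, 0.0],
                 [0.0, 0.0, 0.0, 0.5],
                 [0.0, 0.0, 0.4, 0.8]])
print(cluster_max_extraction(fmap))
```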
- Appears in Collections
- College of Engineering > School of Electrical Engineering > 1. Journal Articles