Detailed Information


A new approach to training more interpretable model with additional segmentation

Authors
Shin, Sunguk; Kim, Youngjoon; Yoon, Ji Won
Issue Date
Dec-2021
Publisher
ELSEVIER
Keywords
Classification model; Convolutional neural networks; Interpretable machine learning
Citation
PATTERN RECOGNITION LETTERS, v.152, pp.188 - 194
Indexed
SCIE
SCOPUS
Journal Title
PATTERN RECOGNITION LETTERS
Volume
152
Start Page
188
End Page
194
URI
https://scholar.korea.ac.kr/handle/2021.sw.korea/135586
DOI
10.1016/j.patrec.2021.10.003
ISSN
0167-8655
Abstract
It is not straightforward to understand how complicated deep learning models work because they are essentially black boxes. To address this problem, various approaches have been developed to provide interpretability and have been applied to black-box deep learning models. However, traditional interpretable machine learning only helps us understand models that have already been trained; if a model is not properly trained, interpretable machine learning will not work well. We propose a simple but effective method that trains models for improved interpretability in image classification. We also evaluate how well the models focus on the appropriate objects, rather than relying on classification accuracy alone. We use Class Activation Mapping (CAM) both to train the models and to evaluate their interpretability. On the PASCAL VOC 2012 dataset, the ResNet50 model trained with the proposed approach achieves a 0.5 IoU of 29.61%, while a model trained only on images and labels achieves 13.00%. The classification accuracy is 75.03% for the proposed approach, 68.38% for the existing method, and 60.69% for FCN. These evaluations show that the proposed approach is effective. (c) 2021 Elsevier B.V. All rights reserved.
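As a rough illustration of the Class Activation Mapping mechanism referenced in the abstract, the sketch below computes a CAM from a torchvision ResNet50. This is not the authors' released code; the hook point (layer4), the weighting by the final fully connected layer, and the [0, 1] normalization are assumptions based on the standard CAM formulation.

    # Minimal CAM sketch with a torchvision ResNet50 (illustrative only).
    import torch
    import torch.nn.functional as F
    from torchvision import models

    model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()

    features = {}
    def hook(_, __, output):
        # Activations of the last convolutional block, shape (N, 2048, h, w)
        features["conv"] = output

    model.layer4.register_forward_hook(hook)

    def class_activation_map(image, target_class):
        """Return a CAM upsampled to the input resolution for one image (1, 3, H, W)."""
        with torch.no_grad():
            model(image)                              # forward pass fills features["conv"]
        fmap = features["conv"]                       # (1, 2048, h, w)
        weights = model.fc.weight[target_class]       # (2048,) FC weights for the class
        cam = torch.einsum("k,nkhw->nhw", weights, fmap)  # weighted sum over channels
        cam = F.relu(cam)
        cam = F.interpolate(cam.unsqueeze(1), size=image.shape[-2:],
                            mode="bilinear", align_corners=False).squeeze(1)
        cam -= cam.amin()
        cam /= cam.amax().clamp(min=1e-8)             # scale to [0, 1] for thresholding / IoU
        return cam

Thresholding such a map and comparing the result against a segmentation mask is one plausible way to obtain the 0.5 IoU figures reported above, though the exact evaluation protocol is described in the paper itself.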
Files in This Item
There are no files associated with this item.
Appears in
Collections
School of Cyber Security > Department of Information Security > 1. Journal Articles



Related Researcher

Yoon, Ji Won
Department of Information Security
