Detailed Information


Shallow Fully Connected Neural Network Training by Forcing Linearization into Valid Region and Balancing Training Rates (Open Access)

Authors
Heo, Jea Pil; Im, Chang Gyu; Ryu, Kyung Hwan; Sung, Su Whan; Yoo, Changkyoo; Yang, Dae Ryook
Issue Date
June 2022
Publisher
MDPI
Keywords
neural network; training rule; local linearization; optimal solution; pH system modeling
Citation
PROCESSES, v.10, no.6
Indexed
SCIE
SCOPUS
Journal Title
PROCESSES
Volume
10
Number
6
URI
https://scholar.korea.ac.kr/handle/2021.sw.korea/142803
DOI
10.3390/pr10061157
ISSN
2227-9717
Abstract
A new supervisory training rule for a shallow fully connected neural network (SFCNN) is proposed in this study. The proposed training rule is developed based on local linearization and an analytical optimal solution for the linearized SFCNN. The cause of nonlinearity in neural network training is analyzed and removed by local linearization. The optimal solution for the linearized SFCNN, which minimizes the training cost function, is derived analytically. Additionally, the training efficiency and model accuracy of the trained SFCNN are improved by keeping estimates within the valid range of the linearization. The superiority of the proposed approach is demonstrated by applying the proposed training rule to the modeling of a typical nonlinear pH process, the Boston housing prices dataset, and the automobile miles-per-gallon dataset. Compared with several previous approaches from the literature, the proposed training rule shows the smallest modeling error and requires the fewest iterations to converge across the case studies.
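The idea described in the abstract can be illustrated with a minimal sketch: linearize a small network around its current parameters, solve the resulting least-squares problem analytically, and limit the update so the parameters stay within the region where the linearization is valid. This is a generic Gauss-Newton-style illustration under assumed settings (a one-neuron tanh model, a simple step-norm cap), not the paper's exact algorithm or balancing scheme.

```python
import numpy as np

def model(theta, x):
    # Tiny shallow network: y = v * tanh(w*x + b), theta = [w, b, v]
    w, b, v = theta
    return v * np.tanh(w * x + b)

def jacobian(theta, x):
    # Partial derivatives of the model output w.r.t. [w, b, v]
    w, b, v = theta
    z = np.tanh(w * x + b)
    s = 1.0 - z**2  # derivative of tanh
    return np.column_stack([v * s * x, v * s, z])

def train(x, y, theta, max_step=0.5, iters=50):
    for _ in range(iters):
        r = y - model(theta, x)                      # residuals
        J = jacobian(theta, x)                       # local linearization
        # Analytical least-squares solution of the linearized problem
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        # Cap the step so the estimate stays where linearization is valid
        n = np.linalg.norm(step)
        if n > max_step:
            step *= max_step / n
        theta = theta + step
    return theta

x = np.linspace(-2.0, 2.0, 40)
y = 0.8 * np.tanh(1.5 * x - 0.3)                     # synthetic target
theta = train(x, y, np.array([0.5, 0.0, 0.5]))
err = np.max(np.abs(y - model(theta, x)))
print(err)
```

Without the step cap, a large linearized step can leave the region where the Taylor expansion holds and the iteration may diverge; capping the step norm is one simple stand-in for the validity constraint the abstract describes.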
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Engineering > Department of Chemical and Biological Engineering > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
