Regularization of hidden layer unit response for neural networks
Description
In this paper, we investigate two issues in pattern recognition with neural networks trained by back propagation (BP): inefficient learning and insufficient generalization. We observe that these phenomena are partly caused by the way the hidden layer units respond to the inputs. To address the issues, we introduce regularization of the hidden layer unit response, which amounts to suppressing the correlation among the responses of the hidden layer units, and we prune redundant units with a unit fusion method. We compared the proposed technique against a conventional technique on pattern recognition problems. The experiments show that regularizing the hidden layer unit response increases the rate of correct recognition and reduces the required number of training epochs.
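The abstract does not give the exact form of the regularizer, so the following is a minimal sketch of one plausible reading: a PyTorch MLP whose loss adds a penalty on the squared off-diagonal entries of the correlation matrix of the hidden layer responses over a batch, plus a simple unit-fusion step that merges a redundant unit into its near-duplicate. The network shape, sigmoid activation, `lambda_reg` weight, and the `fuse_units` helper are all assumptions for illustration, not the authors' exact formulation.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, d_in, d_hidden, d_out):
        super().__init__()
        self.hidden = nn.Linear(d_in, d_hidden)
        self.out = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        h = torch.sigmoid(self.hidden(x))  # hidden layer unit response
        return self.out(h), h

def correlation_penalty(h, eps=1e-8):
    # Suppress correlation among hidden unit responses: sum of squared
    # off-diagonal entries of the batch correlation matrix (assumed form).
    h = h - h.mean(dim=0, keepdim=True)
    cov = (h.T @ h) / (h.shape[0] - 1)
    std = torch.sqrt(torch.diag(cov) + eps)
    corr = cov / (std[:, None] * std[None, :])
    off_diag = corr - torch.diag(torch.diag(corr))
    return (off_diag ** 2).sum()

def fuse_units(model, i, j):
    # Unit fusion sketch (one plausible reading): when units i and j respond
    # near-identically, fold unit j's outgoing weights into unit i's and
    # zero unit j out, effectively pruning it.
    with torch.no_grad():
        model.out.weight[:, i] += model.out.weight[:, j]
        model.out.weight[:, j] = 0.0
        model.hidden.weight[j] = 0.0
        model.hidden.bias[j] = 0.0

# One BP training step with the correlation penalty (dummy data).
model = MLP(d_in=64, d_hidden=32, d_out=10)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
ce = nn.CrossEntropyLoss()
lambda_reg = 0.01  # assumed regularization weight

x = torch.randn(128, 64)
y = torch.randint(0, 10, (128,))
logits, h = model(x)
loss = ce(logits, y) + lambda_reg * correlation_penalty(h)
opt.zero_grad()
loss.backward()
opt.step()
```

In this reading, the penalty pushes hidden units toward decorrelated responses during training, and any pair that nonetheless ends up strongly positively correlated is a candidate for fusion, which is how the pruning would reduce network size without retraining from scratch.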
Journal
2003 IEEE Pacific Rim Conference on Communications Computers and Signal Processing (PACRIM 2003) (Cat. No.03CH37490), Vol. 1, 348-351, 2004-06-03
IEEE