Spatially heterogeneous learning by a deep student machine
Description
Deep neural networks (DNNs) with a huge number of adjustable parameters remain largely black boxes. To shed light on the hidden layers of DNNs, we study supervised learning by a DNN of width $N$ and depth $L$, consisting of $NL$ perceptrons with $c$ inputs each, using a statistical mechanics approach called the teacher-student setting. We consider an ensemble of student machines that exactly reproduce $M$ sets of $N$-dimensional input/output relations provided by a teacher machine. We show that the problem becomes exactly solvable in what we call the 'dense limit': $N \gg c \gg 1$ and $M \gg 1$ with fixed $\alpha = M/c$, using the replica method developed in (H. Yoshino, 2020). We also study the model numerically by performing simple greedy MC simulations. The simulations reveal that learning by the DNN is quite heterogeneous in the network space: the configurations of the teacher and the student machines are more correlated within the layers closer to the input/output boundaries, while the central region remains much less correlated due to the over-parametrization, in qualitative agreement with the theoretical prediction. We evaluate the generalization error of the DNN for various depths $L$ both theoretically and numerically. Remarkably, both the theory and the simulations suggest that the generalization ability of the student machines, which are only weakly correlated with the teacher in the center, does not vanish even in the deep limit $L \gg 1$, where the system becomes heavily over-parametrized. We also consider the impact of the effective dimension $D (\leq N)$ of the data by incorporating the hidden manifold model (S. Goldt et al., 2020) into our model. The theory implies that the loop corrections to the dense limit become enhanced by either decreasing the width $N$ or decreasing the effective dimension $D$ of the data. Simulations suggest that both lead to significant improvements in generalization ability.
41 pages, 19 figures (revised version)
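To make the setting above concrete, below is a minimal, illustrative sketch (not the authors' code) of a teacher-student pair of deep perceptron networks trained by a simple greedy MC search on $M$ teacher-generated examples. The parameter values, the binary $\pm 1$ weights, and the single-weight-flip move are assumptions chosen for brevity; the paper's model and its actual Monte Carlo update may differ (e.g., it may use continuous weights).

```python
# Illustrative sketch of the teacher-student setting: a deep network of
# sign-activation perceptrons of width N and depth L, each unit reading c
# randomly chosen units of the previous layer, with a greedy (zero-temperature)
# MC that accepts a weight flip only if the training error does not increase.
# NOTE: binary +/-1 weights and these small parameter values are assumptions
# made here for simplicity, not the paper's exact model.
import numpy as np

rng = np.random.default_rng(0)

N, L, c, M = 20, 4, 5, 40      # width, depth, inputs per perceptron, number of examples
alpha = M / c                  # load parameter alpha = M/c used in the paper

# Fixed random wiring: unit i of layer l reads c units of layer l-1.
wiring = [rng.choice(N, size=(N, c)) for _ in range(L)]

def forward(weights, x):
    """Propagate +/-1 inputs x (shape M x N) through the L layers."""
    for l in range(L):
        pre = np.einsum('mic,ic->mi', x[:, wiring[l]], weights[l])
        x = np.sign(pre + 1e-12)   # sign activation, ties broken toward +1
    return x

def train_error(weights, x_in, y_teacher):
    return np.mean(forward(weights, x_in) != y_teacher)

# Teacher: random +/-1 weights; training data: random +/-1 inputs and teacher outputs.
teacher = [np.sign(rng.standard_normal((N, c))) for _ in range(L)]
x_in = rng.choice([-1.0, 1.0], size=(M, N))
y_teacher = forward(teacher, x_in)

# Student: independently initialized, updated by greedy single-weight flips.
student = [np.sign(rng.standard_normal((N, c))) for _ in range(L)]
err = train_error(student, x_in, y_teacher)
for step in range(20000):
    l, i, j = rng.integers(L), rng.integers(N), rng.integers(c)
    student[l][i, j] *= -1.0       # propose flipping one binary weight
    new_err = train_error(student, x_in, y_teacher)
    if new_err <= err:             # greedy: keep the move only if error does not grow
        err = new_err
    else:
        student[l][i, j] *= -1.0   # reject: undo the flip
    if err == 0.0:
        break

# Layer-wise teacher-student overlap: the paper reports stronger correlation
# near the input/output boundaries than in the central layers.
overlaps = [float(np.mean(t == s)) for t, s in zip(teacher, student)]
print(f"training error = {err:.3f}, layer overlaps = {overlaps}")
```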
Journal
- Physical Review Research 5 (3), 2023-07-31
- American Physical Society (APS)
Keywords
- FOS: Computer and information sciences
- Computer Science - Machine Learning
- Statistical Mechanics (cond-mat.stat-mech)
- Physics
- QC1-999
- FOS: Physical sciences
- Machine Learning (stat.ML)
- Disordered Systems and Neural Networks (cond-mat.dis-nn)
- Condensed Matter - Disordered Systems and Neural Networks
- Machine Learning (cs.LG)
- Statistics - Machine Learning
- Condensed Matter - Statistical Mechanics
Details
- CRID: 1360021390572357376
- ISSN: 2643-1564
- Article Type: journal article
- Data Source: Crossref, KAKEN, OpenAIRE