On the Nature of Functional Differentiation: The Role of Self-Organization with Constraints

  • Ichiro Tsuda
    Chubu University Academy of Emerging Sciences, Chubu University, Aichi, Kasugai 487-8501, Japan
  • Hiroshi Watanabe
    Center for Mathematical Science and Artificial Intelligence, Chubu University, Aichi, Kasugai 487-8501, Japan
  • Hiromichi Tsukada
    Center for Mathematical Science and Artificial Intelligence, Chubu University, Aichi, Kasugai 487-8501, Japan
  • Yutaka Yamaguti
    Faculty of Information Engineering, Fukuoka Institute of Technology, Fukuoka 811-0295, Japan

Description

The focus of this article is the self-organization of neural systems under constraints. In 2016, we proposed a theory for self-organization with constraints to clarify the neural mechanism of functional differentiation. As a typical application of the theory, we developed evolutionary reservoir computers that exhibit functional differentiation of neurons. Regarding the self-organized structure of neural systems, Warren McCulloch described the neural networks of the brain as being “heterarchical”, rather than hierarchical, in structure. Unlike the fixed boundary conditions in conventional self-organization theory, where stationary phenomena are the target for study, the neural networks of the brain change their functional structure via synaptic learning and neural differentiation to exhibit specific functions, thereby adapting to nonstationary environmental changes. Thus, the neural network structure is altered dynamically among possible network structures. We refer to such changes as a dynamic heterarchy. Through the dynamic changes of the network structure under constraints, such as physical, chemical, and informational factors, which act on the whole system, neural systems realize functional differentiation or functional parcellation. Based on the computation results of our model for functional differentiation, we propose hypotheses on the neuronal mechanism of functional differentiation. Finally, using the Kolmogorov–Arnold–Sprecher superposition theorem, which can be realized by a layered deep neural network, we propose a possible scenario of functional (including cell) differentiation.
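The abstract mentions evolutionary reservoir computers as an application of the theory. The sketch below is not the authors' model; it is a generic echo state network (the standard form of reservoir computing) with illustrative dimensions and a toy sine-prediction task, shown only to make the basic reservoir-plus-readout architecture concrete. All names and parameters here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not taken from the article.
n_inputs, n_reservoir = 1, 100

# Random input and recurrent weights; the recurrent matrix is rescaled
# so its spectral radius is below 1 (the usual echo-state condition).
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D input sequence; return all states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Only the linear readout is trained (here by ridge regression),
# to predict the next input sample from the current reservoir state.
u = np.sin(0.2 * np.arange(400))            # toy input signal
X = run_reservoir(u[:-1])                    # states driven by u[0..398]
y = u[1:]                                    # one-step-ahead targets
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y)

pred = X @ W_out                             # readout predictions
```

In the article's setting, the reservoir weights themselves are additionally shaped by an evolutionary process under informational constraints, which is what drives the differentiation of neuron function; that outer loop is omitted here.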

Journal

  • Entropy

Entropy 24 (2), 240, 2022-02-04

    MDPI AG

References (44)

