Theoretical Learning Speed Evaluation of Parallel Back Propagation Algorithms

  • YAMAMORI Kunihito
    Graduate School of Information Science, Japan Advanced Institute of Science and Technology
  • HORIGUCHI Susumu
    Graduate School of Information Science, Japan Advanced Institute of Science and Technology

Bibliographic Information

Other Title
  • 並列誤差逆伝搬学習法の解析的な学習時間評価

Description

Multilayer neural networks with back-propagation learning require enormous computation time for large-scale problems. To reduce this computation time, several parallel learning algorithms have been proposed. However, most parallel algorithms are tailored to particular parallel computers, and the performance of parallel back-propagation algorithms has not been analyzed sufficiently. This paper addresses the theoretical performance of parallel back-propagation algorithms. We classify parallel back-propagation algorithms into three models: the unit parallel model, the learning-set parallel model, and the pass parallel model. Their parallel performance is then analyzed theoretically. To confirm the theoretical performance estimates, these parallel models are implemented on the nCUBE/2 parallel computer. Both the theoretical analysis and the experimental results show that the learning-set parallel model is the most suitable for parallel computers.
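
As a rough illustration only (not taken from the paper), the learning-set parallel model corresponds to data parallelism: each processor keeps a full copy of the weights, computes the batch gradient over its own share of the training patterns, and the partial gradients are summed before a single weight update. The following minimal NumPy sketch assumes a one-hidden-layer network with tanh units and squared-error loss; the network shape, learning rate, and worker count are illustrative choices, not values from the paper.

    import numpy as np

    # Hypothetical sketch of the "learning-set parallel" idea: the training set
    # is split across P workers, each computes the batch gradient on its own
    # patterns, and the partial gradients are summed (an all-reduce in a real
    # message-passing implementation) before one weight update.

    def forward(X, W1, W2):
        H = np.tanh(X @ W1)            # hidden-layer activations
        Y = np.tanh(H @ W2)            # output-layer activations
        return H, Y

    def local_gradient(X, T, W1, W2):
        """Squared-error gradient over one worker's share of the patterns."""
        H, Y = forward(X, W1, W2)
        d_out = (Y - T) * (1.0 - Y ** 2)           # output deltas
        d_hid = (d_out @ W2.T) * (1.0 - H ** 2)    # back-propagated hidden deltas
        return X.T @ d_hid, H.T @ d_out            # dE/dW1, dE/dW2

    def parallel_epoch(X, T, W1, W2, workers=4, lr=0.05):
        """One batch-learning epoch of learning-set parallel back-propagation."""
        gW1, gW2 = np.zeros_like(W1), np.zeros_like(W2)
        for Xp, Tp in zip(np.array_split(X, workers), np.array_split(T, workers)):
            g1, g2 = local_gradient(Xp, Tp, W1, W2)   # run concurrently per worker
            gW1 += g1                                  # global gradient sum
            gW2 += g2
        return W1 - lr * gW1, W2 - lr * gW2

    rng = np.random.default_rng(0)
    X = rng.standard_normal((64, 8))                   # 64 toy training patterns
    T = np.tanh(X @ rng.standard_normal((8, 2)))       # toy targets
    W1 = 0.1 * rng.standard_normal((8, 16))
    W2 = 0.1 * rng.standard_normal((16, 2))
    for _ in range(100):
        W1, W2 = parallel_epoch(X, T, W1, W2)

In such a scheme the per-epoch arithmetic is divided among the processors at the cost of one gradient summation per epoch, which is consistent with the description's conclusion that the learning-set parallel model suits distributed-memory machines such as the nCUBE/2.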

Journal

  • IPSJ SIG Notes

    IPSJ SIG Notes 68, 57-62, 1997-10-17

    Information Processing Society of Japan (IPSJ)

Details

  • CRID
    1571980077129133312
  • NII Article ID
    110002932058
  • NII Book ID
    AN10463942
  • Text Lang
    ja
  • Data Source
    • CiNii Articles
