Experimental Investigation of Neural Network with Deep Residual Block

Bibliographic Information

Other Title
  • 深いResidual blockをもったニューラルネットワークの実験的検討

Description

Recently, convolutional neural networks have been widely used for classification tasks. Many conventional methods employ the residual network architecture, which repeatedly stacks a module called a residual block. In this paper, we propose a new module architecture that enhances the representational power of modules. The module uses a technique from DenseNet to make the architecture deeper: each layer in the module is connected to every other layer in a feed-forward fashion. Our experiments on CIFAR-10 and CIFAR-100 show that our method outperforms conventional methods in terms of parameter efficiency with respect to error rate.
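The dense connectivity described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the layer count, growth rate, and dimensions are hypothetical, and simple matrix multiplications stand in for convolutions. Each layer receives the concatenation of the block input and all previous layers' outputs (DenseNet-style), and the block adds a projection of the concatenated features back to its input (residual skip).

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

def dense_residual_block(x, weights, proj):
    """Sketch of a residual block with DenseNet-style internal connectivity.

    Each layer sees the concatenation of the block input and all previous
    layers' outputs; the block output adds a projection of all features
    back onto the input (residual connection).
    """
    features = [x]
    for W in weights:
        inp = np.concatenate(features, axis=-1)  # dense connectivity
        features.append(relu(inp @ W))           # one layer's new features
    out = np.concatenate(features, axis=-1) @ proj  # project back to input dim
    return x + out  # residual skip over the whole module

# Hypothetical sizes: input dim 8, growth rate 4, 3 layers in the module.
rng = np.random.default_rng(0)
d, g, L = 8, 4, 3
weights = [rng.standard_normal((d + i * g, g)) * 0.1 for i in range(L)]
proj = rng.standard_normal((d + L * g, d)) * 0.1
y = dense_residual_block(rng.standard_normal((2, d)), weights, proj)
print(y.shape)  # (2, 8): output keeps the input shape, as a residual block must
```

As in DenseNet, the input width of each layer grows by the growth rate `g` per preceding layer, which is what allows the module to be made deeper while reusing earlier features.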

