Model Size Constrained Optimization of DARTS in Neural Architecture Search

Bibliographic Information

Other Title
  • Neural Architecture SearchにおけるDARTSのモデルサイズ制約付き最適化

Description

Deep learning, a machine learning method, has been applied in a variety of fields such as natural language processing and image recognition because of its high performance. AutoML, which automates parts of the machine learning workflow, has been widely studied, and Neural Architecture Search (NAS), which automatically optimizes neural network models for a given dataset and objective, plays a very important role in it. NAS can find highly accurate models using DARTS, a gradient-based search method. However, DARTS generally optimizes only for accuracy, which improves recognition accuracy but also increases the memory required by the resulting model, whereas mobile devices and embedded systems place strict limits on the memory available for deep learning. In this paper, we propose a method that searches for a network model considering both accuracy and model size by adding a model size constraint to DARTS. As a result, the proposed method enables the search for network models with high accuracy under the given constraint conditions.
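To illustrate the general idea of size-aware DARTS, the following is a minimal sketch, assuming a soft penalty formulation in which the expected parameter count of a mixed edge (weighted by the softmax over architecture parameters) is added to the validation loss; the paper's exact constraint formulation, operation set, and parameter counts may differ, and the values and names here (op_param_counts, lam) are illustrative only.

```python
import torch
import torch.nn.functional as F

# Illustrative per-operation parameter counts for one DARTS mixed edge
# (e.g., skip-connect, 3x3 sep-conv, 5x5 sep-conv, max-pool); not the paper's values.
op_param_counts = torch.tensor([0.0, 504.0, 720.0, 0.0])

def expected_model_size(alpha):
    """Expected parameter count of a mixed edge under the softmax of its
    architecture parameters alpha (shape: [num_ops])."""
    weights = F.softmax(alpha, dim=-1)
    return (weights * op_param_counts).sum()

def architecture_loss(val_loss, alpha, lam=1e-3):
    """Validation loss plus a penalty that grows with the expected model size,
    steering the architecture search toward smaller models."""
    return val_loss + lam * expected_model_size(alpha)

# Usage sketch: alpha is updated by gradient descent on architecture_loss,
# while network weights are updated on the training loss, as in standard DARTS.
alpha = torch.zeros(4, requires_grad=True)
val_loss = torch.tensor(1.25)  # placeholder for an actual validation loss
loss = architecture_loss(val_loss, alpha)
loss.backward()
```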
