Bibliographic Information
- Alternative title
  - <i>L</i><sub>2</sub> Induced Norm Analysis for Nonnegative Input Signals and Its Application to Stability Analysis of Recurrent Neural Networks
Description
<p>A recurrent neural network (RNN) is a class of deep neural networks that can imitate the behavior of dynamical systems thanks to its feedback mechanism. However, the feedback mechanism may cause network instability, and hence the stability analysis of RNNs has been an important issue. From a control-theoretic viewpoint, we can readily apply the small gain theorem to the stability analysis of an RNN by representing it as a feedback connection of a linear time-invariant (LTI) system and a static nonlinear activation function, typically a rectified linear unit (ReLU). Nonetheless, the standard small gain theorem leads to conservative results, since it does not exploit the important property that the ReLU returns only nonnegative signals. This motivates us to analyze the L2 induced norm of LTI systems for nonnegative input signals, which is referred to as the L2+ induced norm in this paper. We characterize an upper bound of the L2+ induced norm by copositive programming, and then derive a numerically tractable semidefinite programming problem for (loosened) upper bound computation. We finally derive an L2+-induced-norm-based small gain theorem for the stability analysis of RNNs and illustrate its effectiveness by numerical examples.</p>
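The source of conservatism named in the abstract — the ordinary L2 induced norm ignores that ReLU outputs are nonnegative — can already be seen for a static gain: restricting inputs to the nonnegative orthant can strictly shrink the induced norm. A minimal pure-Python sketch of this effect, using a hypothetical 1×2 map u ↦ u1 − u2 chosen only for illustration (it is not an example from the paper):

```python
import math

# Toy static map M u = u1 - u2; a stand-in for the LTI operator in the
# feedback loop. The map and the grid search are illustrative assumptions.
def gain(theta):
    u1, u2 = math.cos(theta), math.sin(theta)  # unit-norm input on the circle
    return abs(u1 - u2)                        # |M u| for ||u|| = 1

n = 100000
# Ordinary L2 induced norm: supremum over ALL unit inputs (theta in [0, 2*pi))
l2 = max(gain(2 * math.pi * k / n) for k in range(n))
# L2+ variant: supremum over NONNEGATIVE unit inputs only (theta in [0, pi/2])
l2p = max(gain((math.pi / 2) * k / n) for k in range(n + 1))

print(round(l2, 3))   # → 1.414  (sqrt(2), attained at u = (-1, 1)/sqrt(2))
print(round(l2p, 3))  # → 1.0    (restriction to u >= 0 shrinks the gain)
```

A small-gain certificate built from `l2p` instead of `l2` is therefore less conservative whenever the signal entering the LTI block is known to be nonnegative, which is exactly the situation created by a ReLU activation.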
Journal
- システム制御情報学会論文誌 35 (2), 29-37, 2022-02-15
- 一般社団法人 システム制御情報学会
Details
- CRID
  - 1390855035389831424
- ISSN
  - 2185-811X
  - 1342-5668
- Language of text
  - ja
- Data source types
  - JaLC
  - Crossref
  - OpenAIRE
- Abstract license flag
  - Not available