Bibliographic Information
- Alternative title: CNN-based End-to-End Learning for Reaching Movement of Robotic Arm
Description
For factory automation, robots are required to substitute for human workers. To this end, researchers have paid attention to humanoid and dual-arm robots, for which the autonomous generation of motions toward target objects remains a challenge. In this paper, we focus on the reaching movement and propose an end-to-end motion planner based on a convolutional neural network (CNN). In contrast to other related motion planners, we use only the left and right images captured by the robot's head camera. The network is trained by learning from demonstration, which allows the robot to map the relationship between the images (input) and the joint angles (output). Through experiments, we show that the robot can generate reaching movements toward objects located not only at the demonstrated positions but also at other, previously unseen positions.
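The abstract describes an end-to-end planner that maps the left and right head-camera images directly to arm joint angles and is trained by learning from demonstration, i.e., supervised regression on recorded (image pair, joint angle) samples. Below is a minimal sketch of such a stereo-image-to-joint-angle regressor in PyTorch; the layer sizes, the 128x128 input, the 7-joint output, and the MSE loss are assumptions for illustration and are not the architecture reported in the paper.

```python
import torch
import torch.nn as nn

class StereoReachingCNN(nn.Module):
    """Illustrative end-to-end planner: stereo head-camera images -> joint angles.

    Layer sizes and the 7-DoF output are assumptions for this sketch,
    not the network described in the paper.
    """

    def __init__(self, num_joints: int = 7):
        super().__init__()
        # Shared convolutional encoder applied to each camera image (3x128x128 assumed).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
        )
        # Regression head over the concatenated left/right features.
        self.head = nn.Sequential(
            nn.Linear(2 * 128 * 4 * 4, 256), nn.ReLU(),
            nn.Linear(256, num_joints),
        )

    def forward(self, left_img: torch.Tensor, right_img: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([self.encoder(left_img), self.encoder(right_img)], dim=1)
        return self.head(feats)

# Learning from demonstration reduces to supervised regression on
# (image pair, joint angle) samples recorded during the demonstrations.
model = StereoReachingCNN()
left = torch.randn(8, 3, 128, 128)            # batch of left-camera images
right = torch.randn(8, 3, 128, 128)           # batch of right-camera images
target_joints = torch.randn(8, 7)             # demonstrated joint angles (dummy data)
loss = nn.MSELoss()(model(left, right), target_joints)
loss.backward()
```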
Published in
- ロボティクス・メカトロニクス講演会講演概要集 2019 (0), 1P2-C01-, 2019
- 一般社団法人 日本機械学会 (The Japan Society of Mechanical Engineers)
Details
- CRID: 1390565134811104000
- NII Article ID: 130007774362
- ISSN: 2424-3124
- Text language code: ja
- Data sources
  - JaLC
  - Crossref
  - CiNii Articles
  - OpenAIRE
- Abstract license flag: Not available