- MEMIDA Shoko (Tokyo Institute of Technology)
- MIURA Satoshi (Tokyo Institute of Technology)
Bibliographic Information
- Other Title
- Identification of Surgical Forceps Using YOLACT++ (YOLACT++を用いた手術用鉗子の識別)
Abstract
<p>Forceps tracking in laparoscopic surgery contributes to improved surgical outcomes. We identified forceps using deep learning. Because real-time identification is important, we selected YOLACT++ for fast, accurate segmentation and verified whether its detection speed could be maintained on video. We annotated a total of 2537 images drawn from multiple datasets covering various surgical environments and divided them into training, validation, and test data at a ratio of approximately 8:1:1. Training was conducted with a batch size of 32, 100106 iterations, and 1588 epochs; the resulting forceps identification speed was 25.79 fps with an accuracy of 84.31%. Testing the model trained with these hyperparameters gave an identification speed of 28.01 fps and an accuracy of 71.42% on still images, and an identification speed of 17.70 fps on video.</p>
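The approximately 8:1:1 train/validation/test split described above can be sketched as follows. This is a minimal illustration, not the authors' code; the file names and random seed are placeholders, and the exact partition sizes (2029/253/255) follow from truncating 80% and 10% of 2537.

```python
import random

# Hypothetical sketch of an ~8:1:1 split of 2537 annotated images,
# as described in the abstract. File names are placeholders.
images = [f"img_{i:04d}.png" for i in range(2537)]

random.seed(0)          # fixed seed so the split is reproducible
random.shuffle(images)  # shuffle before splitting to avoid ordering bias

n_train = int(len(images) * 0.8)   # 2029 images
n_val = int(len(images) * 0.1)     # 253 images

train = images[:n_train]
val = images[n_train:n_train + n_val]
test = images[n_train + n_val:]    # remaining 255 images

print(len(train), len(val), len(test))  # → 2029 253 255
```

In practice each subset would be written out as an annotation list (e.g. COCO-style JSON) for the YOLACT++ training configuration.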
Journal
- The Proceedings of JSME annual Conference on Robotics and Mechatronics (Robomec) 2023 (0), 2P1-B23-, 2023
- The Japan Society of Mechanical Engineers
Details
- CRID: 1390017444754838528
- ISSN: 2424-3124
- Text Lang: ja
- Data Source: JaLC, Crossref
- Abstract License Flag: Disallowed