Article

Hand Motion-Aware Surgical Tool Localization and Classification from an Egocentric Camera

1 Faculty of Science and Technology, Keio University, Yokohama, Kanagawa 223-8852, Japan
2 Keio University School of Medicine, Shinjuku-ku, Tokyo 160-8582, Japan
* Author to whom correspondence should be addressed.
Academic Editor: Yudong Zhang
Received: 23 December 2020 / Revised: 15 January 2021 / Accepted: 18 January 2021 / Published: 25 January 2021
(This article belongs to the Special Issue Deep Learning in Medical Image Analysis)
Detecting surgical tools is an essential task for the analysis and evaluation of surgical videos. However, in open surgery such as plastic surgery, detection is difficult because some surgical tools, such as scissors and needle holders, have similar shapes. Unlike in endoscopic surgery, the tips of the tools are often hidden in the operating field and are not captured clearly due to low camera resolution, whereas the movements of the tools and hands can be captured. Because each tool is used with different hand movements, hand motion data can be used to distinguish the two types of tools. We combined three modules, for localization, selection, and classification, to detect the two tools. In the localization module, we employed Faster R-CNN to detect surgical tools and the target hands, and in the classification module, we extracted hand movement information by combining ResNet-18 and an LSTM to classify the two tools. We created a dataset in which seven different types of open surgery were recorded, and we provided annotations for surgical tool detection. Our experiments show that our approach successfully detected the two different tools and outperformed two baseline methods.
Keywords: object detection; surgical tools; open surgery; egocentric camera
MDPI and ACS Style

Shimizu, T.; Hachiuma, R.; Kajita, H.; Takatsume, Y.; Saito, H. Hand Motion-Aware Surgical Tool Localization and Classification from an Egocentric Camera. J. Imaging 2021, 7, 15. https://0-doi-org.brum.beds.ac.uk/10.3390/jimaging7020015

