
Intra-Operative Imaging, Sensing and Augmented Reality in Image-Guided Surgery

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensing and Imaging".

Deadline for manuscript submissions: closed (15 October 2022) | Viewed by 12094

Special Issue Editors


Prof. Dr. Guoyan Zheng
Guest Editor
Institute of Medical Robotics, School of Biomedical Engineering, Shanghai Jiao Tong University, Shanghai, China
Interests: computer-aided surgery; multimode image segmentation; registration and reconstruction; computer-aided diagnosis; machine learning; deep learning

Prof. Dr. Wenyong Liu
Guest Editor
Beijing Advanced Innovation Center for Biomedical Engineering, School of Biological Science and Medical Engineering, Beihang University, Beijing, China
Interests: medical apparatus and instruments; medical robots and computer-assisted surgery; rehabilitation engineering

Prof. Dr. Xiaojun Chen
Guest Editor
Institute of Biomedical Manufacturing, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
Interests: computer-assisted surgery; medical image analysis; surgical navigation; VR/AR/MR technology in medicine; surgical robotics

Special Issue Information

Dear Colleagues,

Image-guided surgery (IGS) has gained widespread acceptance in various clinical practices. Although the core concepts of IGS have not changed, advances in image guidance technology, including the incorporation of intra-operative imaging, sensing, and augmented reality, have the potential to enhance the safety of surgery, improve the accuracy of interventions, and enable more complete surgery with improved outcomes. In the past few years, significant progress has been achieved in developing novel intra-operative imaging modalities, incorporating smart sensing technology, and introducing new augmented reality displays.

This Special Issue is open to research and review papers in this field, including, but not limited to:

• Intra-operative imaging
• Robotic imaging
• Surgical imaging and vision
• Intra-operative sensing, perception and recognition
• Augmented reality in IGS
• Multi-modal image fusion and visualization

Prof. Dr. Guoyan Zheng
Prof. Dr. Wenyong Liu
Prof. Dr. Xiaojun Chen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (5 papers)


Research

16 pages, 4599 KiB  
Article
An Image Information-Based Objective Assessment Method of Technical Manipulation Skills for Intravascular Interventions
by Jin Guo, Maoxun Li, Yue Wang and Shuxiang Guo
Sensors 2023, 23(8), 4031; https://0-doi-org.brum.beds.ac.uk/10.3390/s23084031 - 16 Apr 2023
Cited by 2 | Viewed by 1276
Abstract
The clinical success of vascular interventional surgery relies heavily on a surgeon's catheter/guidewire manipulation skills and strategies. An objective and accurate assessment method plays a critical role in evaluating the surgeon's technical manipulation skill level. Most existing evaluation methods incorporate information technology to build more objective assessment models based on various metrics. However, in these models, sensors are often attached to the surgeon's hands or to interventional devices for data collection, which constrains the surgeon's operational movements or influences the motion trajectory of interventional devices. In this paper, an image information-based assessment method is proposed for evaluating the surgeon's manipulation skills without attaching sensors to the surgeon or to catheters/guidewires. Surgeons are allowed to use their natural bedside manipulation skills during the data collection process. Their manipulation features during different catheterization tasks are derived from motion analysis of the catheter/guidewire in video sequences. Notably, data relating to the number of speed peaks, slope variations, and the number of collisions are included in the assessment. Furthermore, the contact forces resulting from interactions between the catheter/guidewire and the vascular model are sensed by a 6-DoF F/T sensor. A support vector machine (SVM) classification framework is developed to discriminate the surgeon's catheterization skill levels. The experimental results demonstrate that the proposed SVM-based assessment method achieves an accuracy of 97.02% in distinguishing between expert and novice manipulations, higher than that reported in existing studies. The proposed method has great potential to facilitate skill assessment and training of novice surgeons in vascular interventional surgery.
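
The pipeline described in the abstract (extracting motion and force features per trial, then classifying them with an SVM) can be illustrated with a short sketch. The snippet below is a hypothetical illustration using scikit-learn; the feature set and all values are placeholders, not the study's data or code.

```python
# Hypothetical sketch: training an SVM to separate expert vs. novice
# catheterization trials from motion/force features, in the spirit of the
# pipeline described in the abstract. Values are placeholders, not study data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Each row: [num_speed_peaks, slope_variation, num_collisions, mean_force_N]
X = np.array([
    [12, 0.8, 1, 0.15],   # expert-like trial
    [11, 0.7, 0, 0.12],
    [25, 2.1, 4, 0.40],   # novice-like trial
    [28, 2.4, 5, 0.45],
])
y = np.array([1, 1, 0, 0])  # 1 = expert, 0 = novice

# Standardize features, then fit an RBF-kernel SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.predict([[13, 0.9, 1, 0.16]]))  # expected: [1] (expert-like)
```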

17 pages, 16400 KiB  
Article
Ultrasound Probe and Hand-Eye Calibrations for Robot-Assisted Needle Biopsy
by Jihao Liu, Wenyuan Sun, Yuyun Zhao and Guoyan Zheng
Sensors 2022, 22(23), 9465; https://0-doi-org.brum.beds.ac.uk/10.3390/s22239465 - 03 Dec 2022
Cited by 2 | Viewed by 1956
Abstract
In robot-assisted ultrasound-guided needle biopsy, it is essential to calibrate the ultrasound probe and to perform hand-eye calibration of the robot in order to link intra-operatively acquired ultrasound images with robot-assisted needle insertion. Based on a high-precision optical tracking system, novel methods for ultrasound probe and robot hand-eye calibration are proposed. Specifically, we first fix optically trackable markers to the ultrasound probe and to the robot. We then design a five-wire phantom to calibrate the ultrasound probe. Finally, an effective method for hand-eye calibration is proposed that takes advantage of steady movement of the robot, without an additional calibration frame or the need to solve the AX=XB equation. After calibration, our system allows for in situ definition of target lesions and aiming trajectories from intra-operatively acquired ultrasound images in order to align the robot for precise needle biopsy. Comprehensive experiments were conducted to evaluate the accuracy of the different components of our system as well as the overall system accuracy. The experimental results demonstrated the efficacy of the proposed methods.
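
For context on what this paper sidesteps: the conventional hand-eye formulation solves AX=XB from paired robot and camera motions, and a standard solver ships with OpenCV. The sketch below shows that conventional route with synthetic stand-in poses; it is not the paper's method, which explicitly avoids the AX=XB equation.

```python
# Conventional AX=XB hand-eye calibration via OpenCV, shown for contrast
# with the paper's AX=XB-free method. Poses are random stand-ins, not data
# from the paper; a real run needs consistent tracked pose pairs.
import cv2
import numpy as np

rng = np.random.default_rng(0)

def random_rotation():
    # Rodrigues of a random axis-angle vector -> a valid rotation matrix.
    R, _ = cv2.Rodrigues(rng.normal(size=(3, 1)) * 0.3)
    return R

# Gripper->base and target->camera pose pairs from several robot stations.
R_g2b, t_g2b, R_t2c, t_t2c = [], [], [], []
for _ in range(10):
    R_g2b.append(random_rotation()); t_g2b.append(rng.normal(size=(3, 1)))
    R_t2c.append(random_rotation()); t_t2c.append(rng.normal(size=(3, 1)))

R_cam2gripper, t_cam2gripper = cv2.calibrateHandEye(
    R_g2b, t_g2b, R_t2c, t_t2c, method=cv2.CALIB_HAND_EYE_TSAI)
print(t_cam2gripper.ravel())
```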

16 pages, 2873 KiB  
Article
A Novel Point Set Registration-Based Hand–Eye Calibration Method for Robot-Assisted Surgery
by Wenyuan Sun, Jihao Liu, Yuyun Zhao and Guoyan Zheng
Sensors 2022, 22(21), 8446; https://0-doi-org.brum.beds.ac.uk/10.3390/s22218446 - 03 Nov 2022
Cited by 3 | Viewed by 2068
Abstract
Pedicle screw insertion with robot assistance dramatically improves surgical accuracy and safety compared with manual implantation. In developing such a system, hand–eye calibration is an essential component that aims to determine the transformation between the position tracking system and the robot-arm system. In this paper, we propose an effective hand–eye calibration method, namely registration-based hand–eye calibration (RHC), which estimates the calibration transformation via point set registration without the need to solve the AX=XB equation. Our hand–eye calibration method consists of tool-tip pivot calibrations in the two coordinate systems, followed by paired-point matching, where the point pairs are generated via steady movement of the robot arm in space. After calibration, our system allows for robot-assisted, image-guided pedicle screw insertion. Comprehensive experiments are conducted to verify the efficacy of the proposed hand–eye calibration method. A mean distance deviation of 0.70 mm and a mean angular deviation of 0.68° are achieved by our system when the proposed hand–eye calibration method is used. Further experiments on drilling trajectories are conducted on plastic vertebrae as well as pig vertebrae. A mean distance deviation of 1.01 mm and a mean angular deviation of 1.11° are observed when the drilled trajectories are compared with the planned trajectories on the pig vertebrae.
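
Paired-point rigid registration of the kind RHC builds on has a well-known closed-form solution (the Arun/Umeyama SVD method). The sketch below is a generic illustration of that step, not the authors' implementation; the point sets and the recovered transform are synthetic.

```python
# Generic closed-form paired-point rigid registration (SVD method), the kind
# of step a registration-based hand-eye calibration relies on.
import numpy as np

def rigid_register(P, Q):
    """Closed-form R, t minimizing ||R @ p_i + t - q_i|| over paired points."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)              # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Recover a known rigid transform from noiseless paired 3D points.
rng = np.random.default_rng(1)
R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R_true *= np.sign(np.linalg.det(R_true))   # force det(R) = +1
t_true = np.array([10.0, -5.0, 2.0])
P = rng.normal(size=(20, 3))
Q = P @ R_true.T + t_true
R_est, t_est = rigid_register(P, Q)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```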

13 pages, 2185 KiB  
Communication
Simultaneous Calibration of the Hand-Eye, Flange-Tool and Robot-Robot Relationship in Dual-Robot Collaboration Systems
by Yanding Qin, Pengxiu Geng, Bowen Lv, Yiyang Meng, Zhichao Song and Jianda Han
Sensors 2022, 22(5), 1861; https://0-doi-org.brum.beds.ac.uk/10.3390/s22051861 - 26 Feb 2022
Cited by 13 | Viewed by 2193
Abstract
A multi-robot collaboration system can complete more complex tasks than a single-robot system. Ensuring the calibration accuracy between the robots in the system is a prerequisite for effective inter-robot cooperation. This paper presents a dual-robot system for orthopedic surgeries, in which the hand-eye, flange-tool, and robot-robot relationships need to be calibrated. This calibration problem can be reduced to solving the matrix equation AXB=YCZ. A combined solution is proposed to solve for the unknown parameters in AXB=YCZ, consisting of a dual-quaternion closed-form method and an iterative method based on the Levenberg–Marquardt (LM) algorithm. The closed-form method is used to quickly obtain an initial value for the iterative method, so as to increase its convergence speed and calibration accuracy. Simulation and experimental analyses are carried out to verify the accuracy and effectiveness of the proposed method.
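
The iterative half of this closed-form-then-refine scheme can be sketched: adjust the three unknown transforms X, Y, Z until A_i X B_i = Y C_i Z holds over all measurements. The snippet below is a generic Levenberg–Marquardt illustration using SciPy with synthetic, self-consistent poses; it is not the paper's solver, and the dual-quaternion closed-form initializer is replaced here by a perturbed ground truth.

```python
# Hypothetical sketch of the iterative stage: refine X, Y, Z in A X B = Y C Z
# by Levenberg-Marquardt over rotation-vector parameters. Generic illustration
# with synthetic poses; not the paper's solver or data.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as Rot

def to_mat(p):
    """6-vector (rotation vector + translation) -> 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = Rot.from_rotvec(p[:3]).as_matrix()
    T[:3, 3] = p[3:]
    return T

def residual(params, A_list, B_list, C_list):
    X, Y, Z = (to_mat(params[i:i + 6]) for i in (0, 6, 12))
    res = [(A @ X @ B - Y @ C @ Z)[:3, :].ravel()
           for A, B, C in zip(A_list, B_list, C_list)]
    return np.concatenate(res)

# Synthetic ground truth and consistent measurements A_i X B_i = Y C_i Z.
rng = np.random.default_rng(2)
x_t, y_t, z_t = (rng.normal(size=6) * 0.5 for _ in range(3))
X_t, Y_t, Z_t = to_mat(x_t), to_mat(y_t), to_mat(z_t)
A_list, B_list, C_list = [], [], []
for _ in range(15):
    A = to_mat(rng.normal(size=6)); C = to_mat(rng.normal(size=6))
    B = np.linalg.inv(X_t) @ np.linalg.inv(A) @ Y_t @ C @ Z_t  # enforce AXB=YCZ
    A_list.append(A); B_list.append(B); C_list.append(C)

# A closed-form method would supply the initial value; here we perturb truth.
p0 = np.concatenate([x_t, y_t, z_t]) + rng.normal(size=18) * 0.05
sol = least_squares(residual, p0, method="lm", args=(A_list, B_list, C_list))
print(sol.cost)  # ~0 when converged to the consistent solution
```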

17 pages, 10206 KiB  
Article
A Projector-Based Augmented Reality Navigation System for Computer-Assisted Surgery
by Yuan Gao, Yuyun Zhao, Le Xie and Guoyan Zheng
Sensors 2021, 21(9), 2931; https://0-doi-org.brum.beds.ac.uk/10.3390/s21092931 - 22 Apr 2021
Cited by 10 | Viewed by 3627
Abstract
In the medical field, guidance to follow the surgical plan is crucial. Image overlay projection is a solution that links the surgical plan with the patient. It realizes augmented reality (AR) by projecting a computer-generated image onto the surface of the target through a projector, which adds visual information to the scene. By overlaying anatomical information or surgical plans on the surgical area, projection helps to enhance the surgeon's understanding of the anatomical structure, intuitively visualizes the surgical target and the key structures of the operation, and avoids diverting the surgeon's sight between the monitor and the patient. However, it remains a challenge to project the surgical navigation information onto the target precisely and efficiently. In this study, we propose a projector-based surgical navigation system. Through a gray code-based calibration method, the projector is calibrated with a camera and then integrated with an optical spatial locator, so that the navigation information of the operation can be accurately projected onto the target area. We validated the projection accuracy of the system through back projection, with an average projection error of 3.37 pixels in the x direction and 1.51 pixels in the y direction, and through model projection, with an average position error of 1.03 ± 0.43 mm. We also carried out puncture experiments using the system, achieving a correct rate of 99%, and qualitatively analyzed the system's performance through a questionnaire. The results demonstrate the efficacy of our proposed AR system.
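
Gray code-based projector-camera calibration works by projecting a sequence of binary stripe patterns so that each projector column (and row) is uniquely encoded across the captured images. The snippet below is a minimal sketch of generating the column-coded bit-planes; the 1024x768 resolution is an arbitrary example, and a full pipeline would also project inverse patterns and row codes before decoding.

```python
# Minimal sketch: generate the column Gray code bit-planes a projector would
# display for gray code-based projector-camera correspondence, as used in
# such calibration pipelines. Resolution is an arbitrary example value.
import numpy as np

def gray_code_column_patterns(width, height):
    """Return a stack of binary images; plane n encodes Gray code bit n."""
    n_bits = int(np.ceil(np.log2(width)))
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                  # binary-reflected Gray code
    patterns = np.empty((n_bits, height, width), dtype=np.uint8)
    for bit in range(n_bits):
        stripe = ((gray >> (n_bits - 1 - bit)) & 1).astype(np.uint8) * 255
        patterns[bit] = np.tile(stripe, (height, 1))
    return patterns

pats = gray_code_column_patterns(1024, 768)
print(pats.shape)  # (10, 768, 1024): 10 bit-planes for 1024 projector columns
# Decoding the camera images of these stripes (plus their inverses) yields a
# projector-column index per camera pixel, from which calibration proceeds.
```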
