Perspective

Integration of Square Fiducial Markers in Patient-Specific Instrumentation and Their Applicability in Knee Surgery

by
Vicente J. León-Muñoz
1,2,*,
Joaquín Moya-Angeler
1,2,
Mirian López-López
3,
Alonso J. Lisón-Almagro
1,
Francisco Martínez-Martínez
4 and
Fernando Santonja-Medina
4,5
1
Department of Orthopaedic Surgery and Traumatology, Hospital General Universitario Reina Sofía, 30003 Murcia, Spain
2
Instituto de Cirugía Avanzada de la Rodilla (ICAR), 30005 Murcia, Spain
3
Subdirección General de Tecnologías de la Información, Servicio Murciano de Salud, 30100 Murcia, Spain
4
Department of Orthopaedic Surgery and Traumatology, Hospital Clínico Universitario Virgen de la Arrixaca, 30120 Murcia, Spain
5
Department of Surgery, Pediatrics and Obstetrics & Gynecology, Faculty of Medicine, University of Murcia, 30120 Murcia, Spain
*
Author to whom correspondence should be addressed.
Submission received: 16 March 2023 / Revised: 23 April 2023 / Accepted: 23 April 2023 / Published: 25 April 2023
(This article belongs to the Special Issue Latest Advances in Musculoskeletal (Orthopedic) Surgery)

Abstract

Computer technologies play a crucial role in orthopaedic surgery and are essential in personalising many treatments. Recent advances allow the use of augmented reality (AR) in many orthopaedic procedures, including different types of knee surgery. AR enables interaction between virtual environments and the physical world, allowing the two to intermingle (AR superimposes information on real objects in real time) through an optical device, and makes it possible to personalise different processes for each patient. This article aims to describe the integration of fiducial markers in the planning of knee surgeries and to provide a narrative review of the latest publications on AR applications in knee surgery. Augmented reality-assisted knee surgery is an emerging set of techniques that can increase accuracy, efficiency, and safety and, in some procedures such as osteotomies, decrease radiation exposure compared with conventional methods. Initial clinical experience with AR projection based on ArUco-type artificial marker sensors has shown promising results and received positive operator feedback. Now that initial clinical safety and efficacy have been demonstrated, continued experience should be studied to validate this technology and generate further innovation in this rapidly evolving field.

1. Introduction

Nowadays, computer technologies play a crucial role in orthopaedic surgery and are essential in personalising many treatments. In the last two decades, assistive technologies have been progressively developed to increase knee arthroplasties’ accuracy and reproducibility [1,2]. In the late 1990s, computer-assisted surgery (CAS) started to gain interest [2,3]. CAS replaces the surgeon’s visual references with real-time monitoring of the execution of the technique by computerized devices, with the primary objective of increasing geometric precision. With CAS, a mean error statistically less than 1 mm for determining single points or distances and less than 1° for determining angles (p < 0.001) has been estimated [4]. More recently, accelerometer-based navigation systems, which use sterile single-use devices within the operative field, have been employed in knee arthroplasty surgery [5,6]. The rapid and continuous development of different medical imaging modalities (mainly computed tomography and magnetic resonance imaging) and advanced technologies for the processing of these images has allowed the combination of radiological techniques with various CAD (Computer Aided Design) tools and CAD/CAM (Computer Aided Manufacturing) processes [7]. This synergy allows the planning of knee replacement surgery on three-dimensional (3D) virtual models and the design and additive manufacturing of patient-specific instrumentation (PSI). The latest assistive technology gaining huge interest is robotically assisted surgery in knee arthroplasty [2]. In addition, changes in alignment paradigms are furthering the rationale for the need for robotics and artificial intelligence-based tools in prosthetic knee surgery [8].
Along with robotics, several authors are implementing augmented reality (AR) in different aspects of total knee arthroplasty (TKA) surgery [9,10,11,12,13,14,15]. New platforms now use AR as a critical differentiator to enhance the surgeon’s experience during surgery. The integration of virtual reality (VR) and AR allows live and virtual images to be obtained in the robot-assisted user interface, facilitating the surgeon’s ability to position and manipulate robotic instruments [16].
Nevertheless, knee surgery is not limited to prosthetic surgery, although innovations have been incorporated into non-replacement surgeries more slowly. Recent advances allow the use of AR in many orthopaedic procedures [17,18,19,20,21,22]. To the best of our knowledge, the first to use the term AR was Thomas P. Caudell, a Boeing researcher who coined it in 1990 [23]. In 1992, Caudell and Mizell developed a prototype that allowed a computer-produced diagram to be superimposed and stabilized on a specific position on a real-world object [24]. In 1994, Milgram and Kishino described the overlap between the physical and digital worlds and placed AR on this reality–virtuality continuum [25]. Extended reality (XR) is the combination of the so-called immersive technologies: VR (virtual reality), AR (augmented reality), and MR (mixed reality). AR enables interaction between virtual environments and the physical world, allowing the two to intermingle (AR superimposes information on real objects in real time) through an optical device (usually smart glasses). An alternative for AR surgery is optical surgical navigation with ArUco-type artificial marker sensors. ArUco is a minimal library for AR applications based exclusively on OpenCV that relies on black-and-white coded markers detected by calling a single function [26]. OpenCV (Open Source Computer Vision Library) is an open-source (Apache 2-licensed) computer vision and machine learning software library written natively in C++. OpenCV was built to provide a common infrastructure for computer vision applications and to accelerate the use of machine perception [27]. Some potential benefits of integrating this technology into knee surgery are (1) improved accuracy compared with traditional surgical techniques, (2) keeping the surgeon's attention on the surgical field, (3) reduced exposure to harmful radiation, (4) significant reduction of procedure time and surgery-related costs, (5) improved operating room efficiency, (6) educational usefulness, and (7) the creation of positive synergies between engineering and medicine that favour the development of bioengineering and the pursuit of surgical excellence [17,19,28,29,30].
This article aims to provide an overview of the use of projection-based AR (digital content superimposed on a live video feed of the real world displayed on a monitor) with ArUco-type artificial marker sensors in knee surgical techniques other than knee replacement surgery, highlight the benefits and challenges associated with their usage, and provide a narrative review of the existing literature on AR applications in knee surgery.

2. Description of the Process, Design, and Implementation of the Devices

The process starts with the digitized acquisition of computed tomography (CT) images of the knee (or of the hip, knee, and ankle in cases requiring surgical axis correction). For the acquisition, the patient must be supine in the centre of the gantry. The leg of interest for the study should be in full extension. The digitized images are stored in the Digital Imaging and Communications in Medicine (DICOM) standard. The scans should be in the same coordinate system (reference frame). Each acquisition should be accurately centred and magnified to ensure that the field of view (FOV) maximizes the region of interest. The FOV should be as small as possible while still fully imaging the joint. Ideally, it should be less than 200 mm, leading to a pixel dimension (defined as the ratio of the FOV to the acquisition matrix) of less than 0.39 mm for an acquisition matrix of 512 × 512 pixels. The recommendation is a slice thickness of less than 1 mm in the knee region (in cases that also require the hips and ankles, the recommended slice thickness in these areas is between 2 and 4 mm). The space between slices should be, at most, the slice thickness. We use a tube voltage of 120 kV or higher and a current of 50 to 75 mA.
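As a minimal illustration of these acquisition checks (not part of the authors' workflow), the following Python sketch reads a CT slice with the pydicom library and compares the pixel size, derived FOV, and slice thickness against the thresholds quoted above; the file path and the 512 × 512 matrix are assumptions.

```python
import pydicom

def check_knee_ct_slice(path, matrix=512):
    """Compare one CT slice's geometry with the acquisition recommendations above."""
    ds = pydicom.dcmread(path)
    row_mm, col_mm = (float(v) for v in ds.PixelSpacing)   # pixel size in mm
    fov_mm = col_mm * int(ds.Columns)                       # FOV = pixel size x acquisition matrix
    thickness_mm = float(ds.SliceThickness)
    print(f"Pixel size: {col_mm:.3f} mm (target < 0.39 mm for a {matrix} x {matrix} matrix)")
    print(f"Field of view: {fov_mm:.0f} mm (target < 200 mm at the knee)")
    print(f"Slice thickness: {thickness_mm:.2f} mm (target < 1 mm at the knee)")

# Hypothetical usage:
# check_knee_ct_slice("knee_ct/slice_0001.dcm")
```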
Using segmentation, the bioengineers obtain the 3D model of each specific knee to be operated on, with its anatomical characteristics and particularities (in the case of CT studies, the bone morphology of the knee). According to some authors, segmentation requires a trained operator, as no fully automated segmentation algorithms exist [7,31]. Other authors, on the other hand, have reported remarkable accuracy of automatic segmentation by implementing neural network architectures based on deep learning [32,33,34]. In our usual practice, segmentation is carried out by the bioengineers using Mimics® software v.23 (Materialise, Leuven, Belgium). Mimics® is an interactive tool for the visualization and segmentation of CT and magnetic resonance imaging (MRI) images and for 3D rendering of objects. The operator imports files in DICOM format into Mimics® and selects from the series the images in which the bony structures are visualized with the fewest unnatural holes, generating a mask for each series. The engineer selects the bone threshold and manually separates the different bone structures using the mask-splitting tool. The operator selects each bone structure as a mask to be segmented, renames it, and assigns it a specific colour (to differentiate the structures and to be able to work with some of them independently of the others). The mask is then edited to reduce noise (image artifacts). The selected bone structures are converted to 3D models (the software outputs a virtual 3D reconstruction by algorithmic analysis) and, keeping the originals, are optimized using different tools that smooth the model and reduce the number of triangles without altering the original bone geometry. Once the process is finished, all the information related to the 3D model is archived as an ".stl" file, the format commonly used for 3D printing and CAD.
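For readers without access to Mimics®, the following sketch outlines the same idea (global bone threshold, mask to surface, export as .stl) using open-source Python tools (scikit-image and numpy-stl); the Hounsfield threshold and voxel spacing are illustrative assumptions, and the sketch does not reproduce the interactive mask-splitting and smoothing steps described above.

```python
import numpy as np
from skimage import measure          # marching cubes for surface extraction
from stl import mesh                 # numpy-stl for .stl export

def segment_bone_to_stl(volume_hu, voxel_spacing_mm, out_path, threshold_hu=250):
    """Threshold a CT volume (in Hounsfield units) and export the bone surface as an STL file."""
    bone_mask = (volume_hu >= threshold_hu).astype(np.uint8)
    verts, faces, _, _ = measure.marching_cubes(bone_mask, level=0.5,
                                                spacing=voxel_spacing_mm)
    surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
    for i, face in enumerate(faces):
        surface.vectors[i] = verts[face]  # the three vertices of each triangle
    surface.save(out_path)

# Hypothetical usage with slices along the first axis and 0.35 x 0.35 x 0.6 mm voxels:
# segment_bone_to_stl(ct_volume, (0.6, 0.35, 0.35), "tibia_femur.stl")
```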
For the planning process, the bioengineers of PQx Planificación Quirúrgica (Murcia, Spain), a MedTech start-up with which we plan our interventions, use the TopSolid'Design software (TopSolid, Évry, France). TopSolid is an integrated CAD/CAM software package for designing and creating fully functional 3D parts. The bioengineer plans the surgery with the different TopSolid tools on the 3D model according to the surgeon's specifications. For knee surgery, especially for angular correction surgery of the limb, it is necessary to know the limb's anatomy and mechanical particularities beforehand. The bioengineer draws the axes to determine the limb's preoperative and planned angular values. These axes are obtained by defining the specific start and end points of the straight-line segments that define them, together with different reference points or coordinates of anatomical structures. As in the method we use to obtain virtual models for planning replacement surgery [35,36], the bioengineer defines the centre of the hip as the centre of the sphere that delimits the femoral head. The centre of the distal femur is at the middle of the intercondylar notch, at the most distal point of the trochlear rim [35,36]. The straight-line segment between the point at the hip's centre and the point at the middle of the intercondylar notch defines the femoral mechanical axis. The centre of the proximal tibia is the midpoint of a line drawn between the intercondylar eminences (the centre of the notch between the tibial spines), with an anterior displacement of 2 mm in the axial plane. The centre of the ankle or tibiotalar joint is a point at the middle of the line joining the most prominent part of the medial malleolus and the most prominent distal part of the lateral malleolus (the distal end or tip of the fibula). The straight-line segment between the point at the proximal tibial centre and the point at the centre of the ankle defines the tibial mechanical axis. The angle formed in the coronal plane by the intersection of the femoral mechanical axis and the tibial mechanical axis describes the mechanical femorotibial angle or hip–knee–ankle angle (HKA). In the virtual model, the operator identifies the most distal bony points of the medial and lateral condyles. The line joining these points (the tangent to the most distal ends of both condyles) defines the femoral articular axis. The angle formed in the coronal plane on the lateral aspect by the intersection of the femoral mechanical axis and the femoral joint axis defines the so-called lateral distal femoral angle or femoral mechanical angle. The deepest points of the centre of the medial and lateral tibial plateaus are defined. The line joining these points defines the tibial joint axis or joint line. The angle formed in the coronal plane on the medial aspect by the intersection of the tibial mechanical axis and the tibial joint axis defines the proximal or mechanical tibial angle. At the medial tibial plateau level, two reference points (one anterior and one posterior, excluding possible osteophytes) are determined to draw a line defining the tibial slope with respect to the tibial mechanical axis in the sagittal plane. After determining the native angular characteristics of the limb, the engineer plans the intervention to be performed with the values proposed by the surgeon.
The PQx engineers use Python 3.8.3 (Python Software Foundation, Wilmington, DE, USA) as the programming language to develop the marker detection software and OpenCV 4.0.1 as the computer vision library. The engineers use Unity 2019.2.17f1 (Unity Software Inc., San Francisco, CA, USA) as the graphics engine. The method used to calculate distances is the one proposed by Vector3.Distance: the system receives two points, a and b, in three dimensions and computes the magnitude of (a − b), i.e., √((a_x − b_x)² + (a_y − b_y)² + (a_z − b_z)²).
The angular calculation method is the one proposed by Vector3.Angle: the system receives two three-dimensional vectors representing the two directions whose angular difference is to be found. These vectors are normalized and their scalar (dot) product is computed; the result is clamped between −1 and 1, its arccosine is obtained, and the value is multiplied by 57.29578 (180/π) to convert radians to degrees, returning the result in degrees.
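As a minimal sketch of these two operations (not the authors' production code, which runs in Unity), the following Python/NumPy functions reproduce the behaviour of Vector3.Distance and Vector3.Angle and apply them to hypothetical hip, knee, and ankle centre coordinates; the landmark values are invented purely for illustration.

```python
import numpy as np

def vector3_distance(a, b):
    """Euclidean distance between two 3D points (equivalent to Unity's Vector3.Distance)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.linalg.norm(a - b))

def vector3_angle(u, v):
    """Unsigned angle in degrees between two 3D directions (equivalent to Unity's Vector3.Angle)."""
    u = np.asarray(u, dtype=float) / np.linalg.norm(u)
    v = np.asarray(v, dtype=float) / np.linalg.norm(v)
    dot = np.clip(np.dot(u, v), -1.0, 1.0)          # clamp to [-1, 1] before arccos
    return float(np.degrees(np.arccos(dot)))

# Hypothetical landmark coordinates in millimetres (illustrative only).
hip_centre   = np.array([  4.0,  420.0, 12.0])
knee_centre  = np.array([  0.0,    0.0,  0.0])
ankle_centre = np.array([ -6.0, -380.0,  8.0])

femoral_axis = knee_centre - hip_centre      # hip centre -> knee centre
tibial_axis  = ankle_centre - knee_centre    # knee centre -> ankle centre

# The angle between the two mechanical-axis directions is the deviation from a
# perfectly neutral HKA of 180 degrees (varus/valgus direction is ignored here).
deviation = vector3_angle(femoral_axis, tibial_axis)
print(f"Femoral mechanical axis length: {vector3_distance(hip_centre, knee_centre):.1f} mm")
print(f"HKA = {180.0 - deviation:.1f} degrees")
```

The clamping step mirrors the restriction of the dot product to [−1, 1] described above, which protects the arccosine from floating-point round-off.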
Once the surgeon validates the planning, the ArUco markers, anatomical models, and cutting or tunnelling templates are produced using a 3D printing system [37]. The bioengineers at PQx currently use the Ultimaker S3 printers (Ultimaker BV, Utrecht, The Netherlands) with double extrusion and a print volume of 230 mm × 190 mm × 200 mm. The printing is done on Ultimaker polylactic acid (PLA) filament (Ultimaker BV, Geldermalsen, The Netherlands).
The planning and navigation system we have used combines virtual reality and AR. Planning and calculations are performed on the virtual model. The markers are needed as references to project the virtual planning into AR. AR in isolation does not provide a reference frame in the three axes of space for precise measurements from a point A (x, y, z) to a point B (x, y, z).
During surgery, the optical camera detects the markers and captures images of the real world. The physical guide points are the reference used to show the virtual world through the screen of our device. The processor is the element in charge of combining the real images and the information from the virtual world to form the AR view. Specific software controls the processor; it is the computing or logical element that manages all the preceding processes of the camera and the processor. The output, in the form of an image, is a superimposition of the virtual over the real [38].
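To make this pipeline concrete, the sketch below shows marker detection and pose estimation with OpenCV's contrib aruco module, roughly as available in the OpenCV 4.x releases cited above. It is an illustrative minimal loop, not the authors' software: the dictionary choice, marker size, and camera intrinsics are assumptions, and real use requires a proper camera calibration.

```python
import cv2
import numpy as np

# Intrinsics must come from a prior camera calibration; these values are placeholders.
camera_matrix = np.array([[1400.0,    0.0, 960.0],
                          [   0.0, 1400.0, 540.0],
                          [   0.0,    0.0,   1.0]])
dist_coeffs = np.zeros(5)
MARKER_LENGTH_M = 0.02  # printed marker side length (assumed: 20 mm)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
parameters = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)  # e.g., the external webcam described above
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary, parameters=parameters)
    if ids is not None:
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_LENGTH_M, camera_matrix, dist_coeffs)
        for rvec, tvec in zip(rvecs, tvecs):
            # Each marker pose is the transform used to project the planned
            # geometry (cutting planes, tunnel trajectories) onto the live image.
            cv2.aruco.drawAxis(frame, camera_matrix, dist_coeffs, rvec, tvec,
                               MARKER_LENGTH_M * 0.5)  # cv2.drawFrameAxes in newer OpenCV
    cv2.imshow("AR overlay", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

In a workflow such as the one described above, the estimated poses would then be handed to the graphics engine to render the planned cutting planes or tunnel trajectories over the video feed.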
For our earliest surgeries, we employed the Logitech C920 Full HD (1080p at 30 fps) webcam (Logitech, Lausanne, Switzerland), with a diagonal field of view (dFoV) of 78° and automatic HD illumination correction. We have recently switched to OAK-D (A00110-INTL) cameras (Luxonis Holding Corporation, Denver, CO, USA) for their improved features. The OAK-D baseboard has three on-board cameras that implement stereo and RGB vision, piped directly into the OAK System on Module for depth and artificial intelligence processing, with a dFoV of 81° and a resolution of 12 MP (4032 × 3040).
To date, we have used this type of technology to perform a variety of knee surgeries: femoral and tibial osteotomies (for the correction of varus, valgus, or rotational deformities), primary and revision surgery of the anterior and posterior cruciate ligaments, medial patellofemoral ligament reconstruction, meniscal transplants, and different combinations of these procedures. Figure 1, Figure 2 and Figure 3 show the planning of a tibial opening-wedge osteotomy and targeted tunnelling for the posterior and anterior anchorage of a medial meniscal transplant for incipient medial femorotibial osteoarthritis in a young patient with a previous meniscectomy. Figure 4 and Figure 5 show the planning of a de-rotational osteotomy to correct excessive femoral anteversion after a previous hip replacement revision surgery in which femoral stem revision was not feasible.

3. A Narrative Review on AR Applications in Knee Surgery

We can divide the publications on the application of AR in knee surgery into those related to replacement surgery, learning based on immersive technologies, arthroscopic surgery assistance (such as ligament surgery), remote surgical assistance, techniques related to the rehabilitation process, and gait analysis studies.
Most articles addressing AR’s usefulness in knee surgery refer to replacement surgery. For example, in 2016, Daniel and Ramos presented the design of a TKA surgical assistance system, using MRI to build 3D models of the tibia and femur and AR to visualize the bone osteotomy to be performed and compare it with the one planned on the model, highlighting the cost–benefit ratio of the implemented system [39]. Pokhrel et al. [40] proposed a new matching approach during the iteration of pair-point matching to minimize the error metric in the iterative closest point (ICP) algorithm (an algorithm employed to reduce the difference between two point clouds). This reduces the cutting error by about 1 mm and improves the image processing time. Therefore, the accuracy parameters published with AR are similar to those published with CAS [4]. Tsukada et al. [15] presented a preclinical pilot study to evaluate the accuracy of coronal, sagittal, and rotational alignment in tibial bone resection during TKA assisted by an AR system. The authors reported results indicating that the evaluated AR system had accuracy comparable to conventional navigation systems in varus/valgus, posterior slope, and internal/external rotation angles [15]. Subsequently, Tsukada et al. [10] published the accuracy of distal femoral resection during TKA, with a pilot study on femoral SawBone specimens and a clinical study comparing the AR system with the conventional intramedullary guide. The authors reported a mean error in distal femoral resection of less than 0.1° in the coronal and sagittal planes with the AR system in the experimental setting and significantly higher accuracy during distal resection in TKA compared to the conventional intramedullary guide in the clinical setting [10]. Another exciting aspect offered by AR is surgical visualization. For example, Wang et al. [41] published the usability of a Microsoft HoloLens-based AR navigation system for minimally invasive TKA with real-time intuitive surgical visualization, accurately overlaying the virtual model in the field of view via holographic space calibration and image registration. Other authors [42] also suggested that AR technology will undoubtedly play an essential role in assisting joint replacement surgery, improving the precision of implantation with better intraoperative ergonomics and workflow without adding a high extra cost to the procedure. Another aspect of great interest in replacement surgery is the ability to obtain real-time information on the stable or unstable behaviour of the native knee and the prosthesis during surgery. In this respect, Fucentese et al. [14] described an innovative AR-based surgical guidance system that measures intraoperatively the effect of prosthesis alignment and positioning on soft tissue balance (NextAR TKA, Medacta International SA, Castel San Pietro, Switzerland). It is the first AR-based guidance system officially cleared for use in TKA. Thus, by knowing the ligamentous insertions in x–y–z coordinates, obtained, for example, from DICOM images of CT scans and tracked with compact infrared sensors, the surgeon can control distances and obtain information on the elongation–tension relationship in the AR projection. Iacono et al. [11] published a systematic review of the literature and a pilot clinical study of the use of AR for limb and component alignment in TKA. The authors found that only two studies [15,43] concerned TKA; unfortunately, both were preclinical. In our opinion, the study by Fallavollita et al. [43] is broader.
It deals with the applicability of an AR C-arm for intraoperative assessment of the mechanical axis in all knee surgeries in which this may be of interest (e.g., axis correction osteotomies). Iacono et al. [11] presented preliminary results using the Knee+ AR navigation system for intraoperative assistance in implant positioning with the help of AR glasses (Pixee Medical Company, Besançon, France). The Knee+ system consists of smart glasses worn by the surgeon, a laptop, and specific markers connected to the tibial and femoral resection guides [11]. Iterative closest point (ICP) algorithms were introduced in the early 1990s to register 3D range data to CAD models of objects. Maharjan et al. [13] proposed an ICP algorithm with bidirectional maximum correntropy. According to the authors, this algorithm helped improve registration and alignment, maximize the overlap between two point clouds, and avoid registration outcomes trapped in local minima. Furthermore, it also removed non-Gaussian noise, impulse noise, and outliers and provided high registration accuracy, helping in accurate visualization and navigation of the knee anatomy [13]. Recently, Su et al. [44] published a report on MR technology in preoperative planning and intraoperative navigation guidance for TKA.
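For readers unfamiliar with ICP, the following sketch shows a basic point-to-point ICP registration using the Open3D library (not used in any of the cited works; chosen here purely for illustration), aligning an intraoperatively sampled point set to a preoperative bone model.

```python
import numpy as np
import open3d as o3d

def icp_register(source_points, target_points, max_corr_dist=5.0, init=np.eye(4)):
    """Rigidly align source_points (e.g., intraoperative surface samples) to
    target_points (e.g., vertices of the preoperative CT model) with point-to-point ICP."""
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_points))
    target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(target_points))
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation, result.inlier_rmse  # 4x4 rigid transform and residual error
```

In practice, a good initial estimate (here simply the identity) and robust outlier handling are precisely what variants such as the bidirectional maximum-correntropy ICP discussed above aim to improve.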
Immersive learning outside the operating room is increasingly recognized as a valuable method of surgical training [45]. It is changing the model of surgical education credited to Dr W. Halsted of Johns Hopkins University in the nineteenth century [46] (a dogmatic model which requires trainees to spend long hours in the hospital to acquire the knowledge and mastery of skills necessary to advance in their training and autonomy as a surgeon) [9]. Some authors have used XR technologies for training in replacement surgery. Thus, Edwards et al. [47] have shown that immersive VR training improves scrub nurses’ understanding, technical skills, and efficiency in complex revision knee arthroplasty surgery. Zaid et al. [48] carried out a randomized controlled trial of trainees at a single, large academic centre performing a relatively complex and unfamiliar procedure (unicompartmental knee replacement); VR training demonstrated equivalent SawBone model surgical competence compared with the use of traditional technique guides, as measured by surgical time and the Objective Structured Assessment of Technical Skills scores. The authors suggested that VR technology could be considered an adjunct to traditional surgical preparation/training methods.
Studies on the applicability of XR in non-replacement knee surgery are scarce. As mentioned above, the study by Fallavollita et al. [43] aimed to evaluate a new AR technology for determining lower extremity alignment. The authors obtained a Pearson’s r correlation coefficient of 0.979, demonstrating a strong positive correlation between the camera-augmented mobile C-arm (AR technology) and the ground-truth CT data. The analysis of variance showed that differences in clinician experience were not significant.
AR has begun to be used to improve the accuracy of knee ligament repair surgeries. Bone tunnel placement is critical in avoiding complications and improving functional outcomes in knee ligament surgery. Preoperative planning can improve tunnel positioning [49]. However, translating the preoperative plan into the surgical procedure can be challenging. Guo et al. [50] proposed a bone-characteristics-based 2D–3D registration method for an anterior cruciate ligament (ACL) reconstruction navigation system, which could provide better guidance for the accuracy of tunnel placement. The authors concluded that a combination of splatting (for digitally reconstructed radiograph generation), Spearman’s rank correlation coefficient (for measurement of similarity between two images), and gradient descent (in iterative optimization for finding the maximum similarity of two images), provided the best composite performance for the AR ACL reconstruction navigation system. The accuracy of the navigation system could fulfil the clinical needs of ACL reconstruction, with an end pose error of 2.5 mm and an angle error of 2.74° [50].
Chen et al. [51] described an AR navigation assistance for arthroscopic knee surgery. The authors proposed a model deformation method based on tissue properties to update the preoperative 3D tissue structure according to the intraoperative arthroscopic view. In addition, they generated virtual arthroscopic images from the updated preoperative model to provide the anatomical information of the operation site [51].
AR has also been used to enable interconnectivity between different actors in distant locations during the same surgery. Thus, van der Putten et al. [52] demonstrated the feasibility of providing remote telesurgical support in TKA surgery with the HoloLens 2 head-mounted display and the Microsoft Dynamics 365 Remote Assist software (Microsoft, Redmond, WA, USA). A proof-of-concept team consisting of an orthopaedic surgeon, the manufacturer’s account manager, and a proof-of-concept lead assisted the surgical team. Through MR, they resolved a TKA surgery that required an implant with which the surgical team was unfamiliar, providing real-time, step-by-step support for the intervention and the difficulties encountered.
Another area related to knee surgery in which the potential usefulness of immersive technologies has been reported is rehabilitation. For example, Blasco et al. [53] published a systematic review that included six trials to assess the effects of training with virtual reality tools during the rehabilitation of patients after knee surgery. The authors highlighted the usefulness for improving balance and the lack of advantages over conventional rehabilitation in improving function, resolving pain, or increasing patient satisfaction after surgery [53]. Fung et al. [54] determined in a preliminary randomized controlled trial whether the Nintendo Wii Fit™ (Nintendo of America Inc., Redmond, WA, USA) is an acceptable adjunct to physiotherapy treatment in the rehabilitation of balance, lower extremity movement, strength, and function in outpatients following TKA. The Nintendo Wii Fit™ includes a balance board similar in concept to a force plate. There were no significant differences in pain, knee range of motion, walking speed, timed standing tasks, self-perceived balance confidence, or self-perceived lower extremity function between the groups. Piqueras et al. [55] compared, in a single-blind, randomized, controlled non-inferiority trial, the effectiveness of a new interactive virtual telerehabilitation system with that of a conventional program following TKA. The virtual telerehabilitation system consists of interactive software with a 3D avatar demonstrating the exercises to be undertaken, sensors (3-axis accelerometers and gyroscopes) attached to the patient allowing calculation of their movement trajectories, and a web portal for the therapist. The physiotherapist receives data and records them for evaluation, with the option to modify the therapy as the rehabilitation evolves. The authors concluded that the use of interactive virtual telerehabilitation following TKA surgery was as effective as conventional treatment during a specific rehabilitation period [55]. Christiansen et al. [56] performed a randomized controlled trial with blinded evaluators. The authors examined the effects of weight-bearing (WB) biofeedback training on WB symmetry and functional joint moments following unilateral total knee arthroplasty. They used the Nintendo Wii Fit Plus game with the Wii Balance Board (Nintendo of America Inc., Redmond, WA, USA). The addition of a six-week intervention of WB biofeedback training to standard rehabilitation post-TKA did not improve functional WB symmetry. The feasibility and safety of the Nintendo Wii game console were also evaluated by Ficklscherer et al. [57]. A prospective, randomized, controlled study was conducted comparing standard physiotherapy vs. physiotherapy plus game console training in patients with ACL repair or TKA. The authors demonstrated that physiotherapy using the Nintendo Wii gaming console did not negatively influence the outcome and saw an opportunity whereby additional training with a gaming console for an extended time could lead to even better results. Chung-Ho Su [58] developed a Kinect-based rehabilitation system using the Microsoft Kinect™ platform (a motion-sensing input device produced by Microsoft for the Xbox 360 video game console and Windows PCs) to assist total knee replacement (TKR) patients during rehabilitation. The author performed a study with a quasi-experimental design. The experimental group showed significant improvement in knee flexion. Chung-Ho Su concluded that motivation influences the effectiveness of rehabilitation and that patients’ acceptance of the Kinect-based rehabilitation system strengthened their motivation and improved their rehabilitation outcomes [58]. Roig-Casasús et al. [59] evaluated the influence of specific balance-targeted training using a dynamometric platform (Balance System™ SD, Biodex Medical Systems Inc., Shirley, NY, USA) on the overall state of balance in patients undergoing TKA. The authors carried out a prospective, randomized, two-arm clinical trial. They demonstrated that a four-week functional training protocol that included a dynamometric platform enhanced the balance performance of older adults in the early TKR postoperative stage to a greater extent than a traditional functional training protocol [59].
Recently, Berton et al. [60] described the state of the art in VR, AR, gamification, and telerehabilitation for orthopaedic rehabilitation. The authors underlined the need for future research to demonstrate the advantages of these technologies compared to face-to-face orthopaedic rehabilitation. Lingfeng Li [61] randomly divided 88 patients who underwent knee joint injury surgery into an experimental group (treated with AR-based rehabilitation after the surgery) and a control group (treated with traditional rehabilitation). Li stated that, compared with conventional methods, the rehabilitation training method based on AR showed substantial advantages in alleviating postoperative pain and helping structural and functional recovery [61].
Another area where applications of immersive technologies are being developed is gait analysis. A conventional optical-based gait analysis laboratory employs expensive stereophotogrammetric motion capture systems. Several studies [62,63,64] proposed the application of VR/AR devices to carry out gait analysis, significantly reducing the cost, with similar performance to conventional technologies.

4. Present and Future Directions

Owing to its simple computational requirements, the expansion of optical technology, and the economic sustainability it allows, the combination of three-dimensional planning on virtual models and the real-time AR visualization of these models in different knee surgery techniques using fiducial markers from the ArUco library can be beneficial in the aspects we previously mentioned as desirable: increased precision compared with traditional surgical techniques, keeping the surgeon’s attention on the surgical field, reduction of possible radiation exposure, reduction of procedure time and costs, improvement of operating room efficiency, teaching benefit, and the creation of positive synergies between engineering and medicine that favour the development of bioengineering. All these aspects contribute to the pursuit of surgical excellence, which must positively impact outcomes and improve patients’ health-related quality of life. We can see the result of the axial correction of a tibial osteotomy in all three planes before performing it. We can project the trajectories of tunnels for multi-ligament lesions in the knee before drilling into the bone. We can know how deep an osteotome penetrates the femoral metaphysis without using the X-ray image intensifier. Today, all of this is possible.
This technological innovation has been incorporated into surgical activity and is present in our knee surgeries. Several aspects still need to be demonstrated, such as its usefulness, accuracy, efficiency, and economic sustainability. We are working on different aspects related to AR, such as the in vitro accuracy of ArUco markers. In this era of robotic knee surgery, extended realities along the continuum between virtuality and reality [25], and artificial intelligence, everything will make sense if it ultimately contributes to augmented humanity, as defined by Guerrero et al. [65]: “Augmented humanity is a human–computer integration technology that proposes to improve capacity and productivity by changing or increasing the normal ranges of human function, through the restoration or extension of human physical, intellectual and social capabilities”.

Author Contributions

Conceptualization, V.J.L.-M., J.M.-A. and A.J.L.-A.; validation, V.J.L.-M., J.M.-A., A.J.L.-A. and F.S.-M.; data curation, M.L.-L.; writing—original draft preparation, V.J.L.-M.; writing—review and editing, V.J.L.-M., J.M.-A. and M.L.-L.; visualization, V.J.L.-M. and M.L.-L.; supervision, F.M.-M. and F.S.-M.; project administration, V.J.L.-M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gao, J.; Dong, S.; Li, J.J.; Ge, L.; Xing, D.; Lin, J. New technology-based assistive techniques in total knee arthroplasty: A Bayesian network meta-analysis and systematic review. Int. J. Med. Robot. Comput. Assist. Surg. 2020, 17, e2189. [Google Scholar] [CrossRef] [PubMed]
  2. Batailler, C.; Parratte, S. Assistive technologies in knee arthroplasty: Fashion or evolution? Rate of publications and national registries prove the Scott Parabola wrong. Arch. Orthop. Trauma Surg. 2021, 141, 2027–2034. [Google Scholar] [CrossRef]
  3. Amiot, L.-P.; Poulin, F. Computed Tomography-Based Navigation for Hip, Knee, and Spine Surgery. Clin. Orthop. Relat. Res. 2004, 421, 77–86. [Google Scholar] [CrossRef] [PubMed]
  4. Lustig, S.; Fleury, C.; Goy, D.; Neyret, P.; Donell, S.T. The accuracy of acquisition of an imageless computer-assisted system and its implication for knee arthroplasty. Knee 2011, 18, 15–20. [Google Scholar] [CrossRef]
  5. Jones, C.W.; Jerabek, S.A. Current Role of Computer Navigation in Total Knee Arthroplasty. J. Arthroplast. 2018, 33, 1989–1993. [Google Scholar] [CrossRef]
  6. Nam, D.; Cody, E.A.; Nguyen, J.T.; Figgie, M.P.; Mayman, D.J. Extramedullary Guides Versus Portable, Accelerometer-Based Navigation for Tibial Alignment in Total Knee Arthroplasty: A Randomized, Controlled Trial: Winner of the 2013 HAP PAUL Award. J. Arthroplast. 2014, 29, 288–294. [Google Scholar] [CrossRef]
  7. Kim, G.B.; Lee, S.; Kim, H.; Yang, D.H.; Kim, Y.-H.; Kyung, Y.S.; Kim, C.-S.; Choi, S.H.; Kim, B.J.; Ha, H.; et al. Three-Dimensional Printing: Basic Principles and Applications in Medicine and Radiology. Korean J. Radiol. 2016, 17, 182. [Google Scholar] [CrossRef]
  8. Batailler, C.; Shatrov, J.; Sappey-Marinier, E.; Servien, E.; Parratte, S.; Lustig, S. Artificial intelligence in knee arthroplasty: Current concept of the available clinical applications. Arthroplasty 2022, 4, 17. [Google Scholar] [CrossRef] [PubMed]
  9. Alpaugh, K.; Ast, M.P.; Haas, S.B. Immersive technologies for total knee arthroplasty surgical education. Arch. Orthop. Trauma Surg. 2021, 141, 2331–2335. [Google Scholar] [CrossRef]
  10. Tsukada, S.; Ogawa, H.; Nishino, M.; Kurosaka, K.; Hirasawa, N. Augmented Reality-Assisted Femoral Bone Resection in Total Knee Arthroplasty. JB JS Open Access 2021, 6, e21.00001. [Google Scholar] [CrossRef] [PubMed]
  11. Iacono, V.; Farinelli, L.; Natali, S.; Piovan, G.; Screpis, D.; Gigante, A.; Zorzi, C. The use of augmented reality for limb and component alignment in total knee arthroplasty: Systematic review of the literature and clinical pilot study. J. Exp. Orthop. 2021, 8, 52. [Google Scholar] [CrossRef]
  12. Goh, G.S.; Lohre, R.; Parvizi, J.; Goel, D.P. Virtual and augmented reality for surgical training and simulation in knee arthroplasty. Arch. Orthop. Trauma Surg. 2021, 141, 2303–2312. [Google Scholar] [CrossRef]
  13. Maharjan, N.; Alsadoon, A.; Prasad, P.W.C.; Abdullah, S.; Rashid, T.A. A novel visualization system of using augmented reality in knee replacement surgery: Enhanced bidirectional maximum correntropy algorithm. Int. J. Med. Robot. 2021, 17, e2223. [Google Scholar] [CrossRef] [PubMed]
  14. Fucentese, S.F.; Koch, P.P. A novel augmented reality-based surgical guidance system for total knee arthroplasty. Arch. Orthop. Trauma Surg. 2021, 141, 2227–2233. [Google Scholar] [CrossRef]
  15. Tsukada, S.; Ogawa, H.; Nishino, M.; Kurosaka, K.; Hirasawa, N. Augmented reality-based navigation system applied to tibial bone resection in total knee arthroplasty. J. Exp. Orthop. 2019, 6, 44. [Google Scholar] [CrossRef]
  16. Bagaria, V.; Sadigale, O.S.; Pawar, P.P.; Bashyal, R.K.; Achalare, A.; Poduval, M. Robotic-Assisted Knee Arthroplasty (RAKA): The Technique, the Technology and the Transition. Indian J. Orthop. 2020, 54, 745–756. [Google Scholar] [CrossRef] [PubMed]
  17. Ha, J.; Parekh, P.; Gamble, D.; Masters, J.; Jun, P.; Hester, T.; Daniels, T.; Halai, M. Opportunities and challenges of using augmented reality and heads-up display in orthopaedic surgery: A narrative review. J. Clin. Orthop. Trauma 2021, 18, 209–215. [Google Scholar] [CrossRef] [PubMed]
  18. Casari, F.A.; Navab, N.; Hruby, L.A.; Kriechling, P.; Nakamura, R.; Tori, R.; de Lourdes Dos Santos Nunes, F.; Queiroz, M.C.; Fürnstahl, P.; Farshad, M. Augmented Reality in Orthopedic Surgery Is Emerging from Proof of Concept Towards Clinical Studies: A Literature Review Explaining the Technology and Current State of the Art. Curr. Rev. Musculoskelet. Med. 2021, 14, 192–203. [Google Scholar] [CrossRef] [PubMed]
  19. Furman, A.A.; Hsu, W.K. Augmented Reality (AR) in Orthopedics: Current Applications and Future Directions. Curr. Rev. Musculoskelet. Med. 2021, 14, 397–405. [Google Scholar] [CrossRef]
  20. Jud, L.; Fotouhi, J.; Andronic, O.; Aichmair, A.; Osgood, G.; Navab, N.; Farshad, M. Applicability of augmented reality in orthopedic surgery—A systematic review. BMC Musculoskelet. Disord. 2020, 21, 103. [Google Scholar] [CrossRef]
  21. Matthews, J.H.; Shields, J.S. The Clinical Application of Augmented Reality in Orthopaedics: Where Do We Stand? Curr. Rev. Musculoskelet. Med. 2021, 14, 316–319. [Google Scholar] [CrossRef] [PubMed]
  22. Laverdière, C.; Corban, J.; Khoury, J.; Ge, S.M.; Schupbach, J.; Harvey, E.J.; Reindl, R.; Martineau, P.A. Augmented reality in orthopaedics: A systematic review and a window on future possibilities. Bone Jt. J. 2019, 101-B, 1479–1488. [Google Scholar] [CrossRef]
  23. Lee, K. Augmented Reality in Education and Training. TechTrends 2012, 56, 13–21. [Google Scholar] [CrossRef]
  24. Caudell, T.P.; Mizell, D.W. Augmented reality: An application of heads-up display technology to manual manufacturing processes. In Proceedings of the Twenty-Fifth Hawaii International Conference on System Sciences, Kauai, HI, USA, 7–10 January 1992; IEEE: Manhattan, NY, USA, 1992; Volume 2, pp. 659–669. [Google Scholar]
  25. Milgram, P.; Kishino, F. A taxonomy of mixed reality visual displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329. [Google Scholar]
  26. Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.J.; Marín-Jiménez, M.J. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit. 2014, 47, 2280–2292. [Google Scholar] [CrossRef]
  27. About OpenCV. Available online: https://opencv.org/about/ (accessed on 21 November 2022).
  28. Ortega, G.; Wolff, A.; Baumgaertner, M.; Kendoff, D. Usefulness of a head mounted monitor device for viewing intraoperative fluoroscopy during orthopaedic procedures. Arch. Orthop. Trauma Surg. 2008, 128, 1123–1126. [Google Scholar] [CrossRef]
  29. Chimenti, P.C.; Mitten, D.J. Google Glass as an Alternative to Standard Fluoroscopic Visualization for Percutaneous Fixation of Hand Fractures: A Pilot Study. Plast. Reconstr. Surg. 2015, 136, 328–330. [Google Scholar] [CrossRef]
  30. Keating, T.C.; Jacobs, J.J. Augmented Reality in Orthopedic Practice and Education. Orthop. Clin. N. Am. 2021, 52, 15–26. [Google Scholar] [CrossRef]
  31. Ma, L.; Fan, Z.; Ning, G.; Zhang, X.; Liao, H. 3D Visualization and Augmented Reality for Orthopedics. Adv. Exp. Med. Biol. 2018, 1093, 193–205. [Google Scholar] [CrossRef] [PubMed]
  32. Song, P.; Fan, Z.; Zhi, X.; Cao, Z.; Min, S.; Liu, X.; Zhang, Y.; Kong, X.; Chai, W. Study on the accuracy of automatic segmentation of knee CT images based on deep learning. Zhongguo Xiu Fu Chong Jian Wai Ke Za Zhi (Chin. J. Reparative Reconstr. Surg.) 2022, 36, 534–539. [Google Scholar] [CrossRef]
  33. Deng, Y.; Wang, L.; Zhao, C.; Tang, S.; Cheng, X.; Deng, H.-W.; Zhou, W. A deep learning-based approach to automatic proximal femur segmentation in quantitative CT images. Med. Biol. Eng. Comput. 2022, 60, 1417–1429. [Google Scholar] [CrossRef]
  34. Tang, X.; Li, X.; Gu, X.; Zhao, Y.; Liu, A.; Liu, Y.; Tao, Y. Automatic modeling of the knee joint based on artificial intelligence. Zhongguo Xiu Fu Chong Jian Wai Ke Za Zhi (Chin. J. Reparative Reconstr. Surg.) 2023, 37, 348–352. [Google Scholar] [CrossRef]
  35. León-Muñoz, V.J.; Manca, S.; López-López, M.; Martínez-Martínez, F.; Santonja-Medina, F. Coronal and axial alignment relationship in Caucasian patients with osteoarthritis of the knee. Sci. Rep. 2021, 11, 7836. [Google Scholar] [CrossRef]
  36. León-Muñoz, V.J.; López-López, M.; Martínez-Martínez, F.; Santonja-Medina, F. Comparison of weight-bearing full-length radiographs and computed-tomography-scan-based three-dimensional models in the assessment of knee joint coronal alignment. Knee 2020, 27, 543–551. [Google Scholar] [CrossRef] [PubMed]
  37. Liu, D.; Li, Y.; Li, T.; Yu, Y.; Cai, G.; Yang, G.; Wang, G. The use of a 3D-printed individualized navigation template to assist in the anatomical reconstruction surgery of the anterior cruciate ligament. Ann. Transl. Med. 2020, 8, 1656. [Google Scholar] [CrossRef]
  38. Zhou, F.; Duh, H.B.-L.; Billinghurst, M. Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. In Proceedings of the 2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, Washington, DC, USA, 15–18 September 2008; pp. 193–202. [Google Scholar]
  39. Daniel, C.; Ramos, O. Augmented Reality for Assistance of Total Knee Replacement. J. Electr. Comput. Eng. 2016, 2016, 9358369. [Google Scholar] [CrossRef]
  40. Pokhrel, S.; Alsadoon, A.; Prasad, P.W.C.; Paul, M. A novel augmented reality (AR) scheme for knee replacement surgery by considering cutting error accuracy. Int. J. Med. Robot. 2019, 15, e1958. [Google Scholar] [CrossRef]
  41. Wang, L.; Sun, Z.; Zhang, X.; Sun, Z.; Wang, J. A HoloLens Based Augmented Reality Navigation System for Minimally Invasive Total Knee Arthroplasty BT—Intelligent Robotics and Applications; Yu, H., Liu, J., Liu, L., Ju, Z., Liu, Y., Zhou, D., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 519–530. [Google Scholar]
  42. Auvinet, E.; Maillot, C.; Uzoho, C. Augmented Reality Technology for Joint Replacement. In Personalized Hip and Knee Joint Replacement; Springer International Publishing: Cham, Switzerland, 2020; pp. 321–328. [Google Scholar]
  43. Fallavollita, P.; Brand, A.; Wang, L.; Euler, E.; Thaller, P.; Navab, N.; Weidert, S. An augmented reality C-arm for intraoperative assessment of the mechanical axis: A preclinical study. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 2111–2117. [Google Scholar] [CrossRef]
  44. Su, S.; Lei, P.; Wang, C.; Gao, F.; Zhong, D.; Hu, Y. Mixed Reality Technology in Total Knee Arthroplasty: An Updated Review With a Preliminary Case Report. Front. Surg. 2022, 9, 804029. [Google Scholar] [CrossRef] [PubMed]
  45. McKnight, R.R.; Pean, C.A.; Buck, J.S.; Hwang, J.S.; Hsu, J.R.; Pierrie, S.N. Virtual Reality and Augmented Reality-Translating Surgical Training into Surgical Technique. Curr. Rev. Musculoskelet. Med. 2020, 13, 663–674. [Google Scholar] [CrossRef]
  46. Eberlein, T.J. A new paradigm in surgical training. J. Am. Coll. Surg. 2014, 218, 511–518. [Google Scholar] [CrossRef]
  47. Edwards, T.C.; Patel, A.; Szyszka, B.; Coombs, A.W.; Liddle, A.D.; Kucheria, R.; Cobb, J.P.; Logishetty, K. Immersive virtual reality enables technical skill acquisition for scrub nurses in complex revision total knee arthroplasty. Arch. Orthop. Trauma Surg. 2021, 141, 2313–2321. [Google Scholar] [CrossRef]
  48. Zaid, M.B.; Dilallo, M.; Shau, D.; Ward, D.T.; Barry, J.J. Virtual Reality as a Learning Tool for Trainees in Unicompartmental Knee Arthroplasty: A Randomized Controlled Trial. J. Am. Acad. Orthop. Surg. 2022, 30, 84–90. [Google Scholar] [CrossRef] [PubMed]
  49. Morita, K.; Nii, M.; Koh, M.-S.; Kashiwa, K.; Nakayama, H.; Kambara, S.; Yoshiya, S.; Kobashi, S. Bone Tunnel Placement Determination Method for 3D Images and Its Evaluation for Anterior Cruciate Ligament Reconstruction. Curr. Med. Imaging 2020, 16, 491–498. [Google Scholar] [CrossRef]
  50. Guo, N.; Yang, B.; Ji, X.; Wang, Y.; Hu, L.; Wang, T. Intensity-based 2D-3D registration for an ACL reconstruction navigation system. Int. J. Med. Robot. 2019, 15, e2008. [Google Scholar] [CrossRef] [PubMed]
  51. Chen, F.; Cui, X.; Han, B.; Liu, J.; Zhang, X.; Liao, H. Augmented reality navigation for minimally invasive knee surgery using enhanced arthroscopy. Comput. Methods Programs Biomed. 2021, 201, 105952. [Google Scholar] [CrossRef]
  52. van der Putten, K.; Anderson, M.B.; van Geenen, R.C. Looking through the Lens: The Reality of Telesurgical Support with Interactive Technology Using Microsoft’s HoloLens 2. Case Rep. Orthop. 2022, 2022, 5766340. [Google Scholar] [CrossRef] [PubMed]
  53. Blasco, J.; Igual-Camacho, C.; Blasco, M.; Antón-Antón, V.; Ortiz-Llueca, L.; Roig-Casasús, S. The efficacy of virtual reality tools for total knee replacement rehabilitation: A systematic review. Physiother. Theory Pract. 2021, 37, 682–692. [Google Scholar] [CrossRef]
  54. Fung, V.; Ho, A.; Shaffer, J.; Chung, E.; Gomez, M. Use of Nintendo Wii FitTM in the rehabilitation of outpatients following total knee replacement: A preliminary randomised controlled trial. Physiotherapy 2012, 98, 183–188. [Google Scholar] [CrossRef]
  55. Piqueras, M.; Marco, E.; Coll, M.; Escalada, F.; Ballester, A.; Cinca, C.; Belmonte, R.; Muniesa, J.M. Effectiveness of an interactive virtual telerehabilitation system in patients after total knee arthoplasty: A randomized controlled trial. J. Rehabil. Med. 2013, 45, 392–396. [Google Scholar] [CrossRef]
  56. Christiansen, C.L.; Bade, M.J.; Davidson, B.S.; Dayton, M.R.; Stevens-Lapsley, J.E. Effects of Weight-Bearing Biofeedback Training on Functional Movement Patterns Following Total Knee Arthroplasty: A Randomized Controlled Trial. J. Orthop. Sport. Phys. Ther. 2015, 45, 647–655. [Google Scholar] [CrossRef] [PubMed]
  57. Ficklscherer, A.; Stapf, J.; Meissner, K.M.; Niethammer, T.; Lahner, M.; Wagenhäuser, M.; Müller, P.E.; Pietschmann, M.F. Testing the feasibility and safety of the Nintendo Wii gaming console in orthopedic rehabilitation: A pilot randomized controlled study. Arch. Med. Sci. 2016, 12, 1273–1278. [Google Scholar] [CrossRef] [PubMed]
  58. Su, C.-H. Developing and evaluating effectiveness of 3D game-based rehabilitation system for Total Knee Replacement Rehabilitation patients. Multimed. Tools Appl. 2016, 75, 10037–10057. [Google Scholar] [CrossRef]
  59. Roig-Casasús, S.; Blasco, J.M.; López-Bueno, L.; Blasco-Igual, M.C. Balance Training With a Dynamometric Platform Following Total Knee Replacement: A Randomized Controlled Trial. J. Geriatr. Phys. Ther. 2018, 41, 204–209. [Google Scholar] [CrossRef]
  60. Berton, A.; Longo, U.G.; Candela, V.; Fioravanti, S.; Giannone, L.; Arcangeli, V.; Alciati, V.; Berton, C.; Facchinetti, G.; Marchetti, A.; et al. Virtual Reality, Augmented Reality, Gamification, and Telerehabilitation: Psychological Impact on Orthopedic Patients’ Rehabilitation. J. Clin. Med. 2020, 9, 2567. [Google Scholar] [CrossRef]
  61. Li, L. Effect of Remote Control Augmented Reality Multimedia Technology for Postoperative Rehabilitation of Knee Joint Injury. Comput. Math. Methods Med. 2022, 2022, 9320063. [Google Scholar] [CrossRef] [PubMed]
  62. Chan, Z.Y.S.; MacPhail, A.J.C.; Au, I.P.H.; Zhang, J.H.; Lam, B.M.F.; Ferber, R.; Cheung, R.T.H. Walking with head-mounted virtual and augmented reality devices: Effects on position control and gait biomechanics. PLoS ONE 2019, 14, e0225972. [Google Scholar] [CrossRef]
  63. Nagymáté, G.; Kiss, R.M. Affordable gait analysis using augmented reality markers. PLoS ONE 2019, 14, e0212319. [Google Scholar] [CrossRef]
  64. Braga Rodrigues, T.; Ó Catháin, C.; O’Connor, N.E.; Murray, N. A Quality of Experience assessment of haptic and augmented reality feedback modalities in a gait analysis system. PLoS ONE 2020, 15, e0230570. [Google Scholar] [CrossRef]
  65. Guerrero, G.; da Silva, F.J.M.; Fernández-Caballero, A.; Pereira, A. Augmented Humanity: A Systematic Mapping Review. Sensors 2022, 22, 514. [Google Scholar] [CrossRef]
Figure 1. Images of the virtual model with the planning of the tibial osteotomy and the tunnels for the anterior and posterior anchorage of the medial meniscal transplant.
Figure 2. Detail of the jig design to perform the osteotomy with the ArUco markers.
Figure 3. Virtual modelling of the plate and screw placement avoiding the tunnel trajectory for anchoring the medial meniscal transplant.
Figure 4. Details of the analysis of the rotational distortion on the virtual model and of the proposed corrections.
Figure 5. Cutting jig with ArUco markers for guiding femoral osteotomy under AR control.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
