In orthopaedic oncology, surgical planning for tumor-free resection of a bone tumor and reconstruction of the skeletal defect requires detailed analysis of preoperative images. Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) scans are essential preoperative imaging studies before contemplating bone tumor surgery. CT images provide the bony details, while MRI better indicates the extent of tumor involvement. Tumor surgeons have to mentally integrate all the preoperative two-dimensional (2D) imaging and formulate a three-dimensional (3D) plan for tumor resection with a negative margin along the desired plane. Errors in surgical planning or intraoperative execution may result in positive tumor resection margins that increase the risk of local recurrence1–3 and thus adversely affect patients’ survival.
Translating the surgical plan to the actual surgical field is challenging, particularly when the tumor lies in complex anatomical regions such as the pelvis, sacrum, and spine, or when technically demanding resections such as joint-preserving or multi-planar resections are contemplated. Computer-assisted tools like computer navigation4–11 and 3D-printed surgical resection guides12–16 have been reported to address surgical inaccuracy by replicating the surgical plans. By reducing the number of positive-margin resections and increasing the accuracy of implant placement, navigation-assisted surgery was shown to improve survival or implant alignment in patients with pelvic bone sarcoma.17,18 Criticisms of navigation technology affect its adoption in clinical routine. They include the intrinsic incompatibility between the 2D display of computer-generated content and the 3D representation of the real physical world; the technology only works without line-of-sight interruption between navigation trackers and the navigation camera;19 surgeons shift their attention and are distracted from the surgical field when viewing the external navigation display;20 expensive navigation facilities;21 and a steep learning curve with a complicated surgical workflow.22 On the other hand, 3D-printed patient-specific guides have been described as a more intuitive and simpler instrument providing equivalent surgical accuracy with less bone resection time.23,24 However, the lack of real-time visual feedback of preoperative images and the lead-time in manufacturing 3D-printed objects are practical concerns.25 Therefore, neither technology is routinely utilized in simple bone resections; both are limited to more difficult bone tumor surgery involving complex anatomic sites or more technically demanding resections.21,26
Mixed Reality (MR) is a new technology that merges real and virtual worlds to produce new environments with enhanced visualizations, where physical and digital objects coexist and users can interact with both in real time. Digital medical information, such as images with processed 3D models, surgical planning, or relevant patient data, can be superimposed and visualized in the user’s immediate physical environment. In the operating room, real-time digital data is available in the surgeon’s line of sight, allowing surgeons to remain focused on their tasks in the surgical field. Combining modern computing power and advanced imaging modalities may provide a disruptive technology to improve patient care in surgical disciplines. As MR technology is less costly and more readily available with less lead-time than the existing assistive tools (computer navigation or 3D-printed guides), it shows great promise to be more readily utilized in bone tumor surgery that requires image and 3D model visualization. Early application of MR technology has been reported in neurosurgical, oral maxillofacial, and orthopaedic procedures, but its role in orthopaedic oncology remains unclear. As MR technology grows and continues to evolve, orthopaedic oncology surgeons should embrace the latest knowledge of the emerging technology for possible clinical benefits to tumor patients. This review aims to provide orthopaedic tumor surgeons with up-to-date knowledge of the emerging MR technology. The paper 1) presents the background, application features, and clinical workflow of MR technology, 2) reviews the current literature and potential clinical applications with case illustrations, 3) discusses the challenges and limitations, and 4) suggests directions for future development in orthopaedic oncology.
Basic Principles of Mixed Reality Technology
Extended Reality (XR) technology refers to any situation in which real life is augmented by computer technology. It encompasses a set of reality-creating technologies such as Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) (Figure 1A and B). It offers users an immersive and engaging experience in their real physical environment through the integration or manipulation of computer-generated digital content. In VR, users are fully immersed in a simulated digital environment and are isolated from the physical world. In AR, virtual digital information is added to and superimposed on the physical world to enhance users’ experience. MR, first defined in 1994 by Paul Milgram and Fumio Kishino,27 refers to merging real and virtual worlds to produce hybrid new environments and visualizations. Virtual and real objects coexist, and users can interact with both in real time.
Figure 1 (A) shows the spectrum of Extended Reality (XR) that consists of Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). (B) In AR and MR, computer-generated virtual contents are projected onto the users’ retina via a transparent screen with a half-silvered mirror, enabling a free, unhindered view of the real scene. Therefore, users can concurrently view the real objects and the virtual information overlaid in the immediate physical environment. However, MR allows users to interact with virtual content by hand gestures (*), which is not possible in AR. In VR, virtual contents are projected onto the users’ retina via an opaque screen, and a hand-held controller (**) is used to interact with virtual contents. Therefore, users are completely immersed in the virtual environment.
Instead of viewing digital medical information on a 2D computer display, surgeons can use MR technology to visualize virtual digital contents via Head-Mounted Displays (HMDs) as holograms that overlay their immediate physical environment. The surgeon wears the HMD, which transmits digital information directly to the surgeon’s retina (Figure 2A–C). Holograms are virtual 3D objects generated by the interference of light beams that reflect real physical objects. Holograms preserve the depth and parallax perception of the original objects, which are absent when viewing the same 3D objects on a 2D computer display. Instead of relying on printouts or computer stations to access medical information, users perceive patients’ data as virtual holograms like real objects in the physical world. The intuitive interaction with holographic medical data via the MR HMD facilitates the clinical point-of-care during patient assessment and communication. It allows surgeons to stay focused on the surgical field without the distraction of constantly shifting their visual focus to an imaging monitor.28–31 Surgeons can operate with improved ergonomics compared to the existing computer-assisted navigation procedures.
Figure 2 (A) shows the MR headset with a transparent near-eye-display, HoloLens 2 (Microsoft Corporation, Redmond, WA, USA). The Head-Mounted Display (HMD) has eye-tracking and hand-tracking technology, with two cameras (white arrow) per eye. Eye-tracking is a technology that tracks the area the user is focusing on. The gaze can work as an input device like the mouse of a computer. Therefore, the technology allows interaction with holograms via gaze and hand gestures, and the user’s hands can still focus on the task. The HMD also includes an RGB camera (white arrowhead) for photo and video capture and for live-streaming the user’s point of view for remote assistance. (B) The author wears HoloLens 2 during the preoperative clinical assessment of a patient with a right calcaneal chondrosarcoma. The holographic contents are overlaid on the surgeon’s immediate real environment and are visualized from the surgeon’s point of view (red arrow). (C) depicts the User Interface of the authors’ developed MR platform. The MR HMD provides real-time, on-demand holographic medical information (CT/MRI medical images in DICOM/PDF format or 3D bone model) at the clinical point of care while the surgeon can examine the patient with their hands. The system avoids the need for attention shift and eliminates line-of-sight disruption, as in computer navigation surgery.
Holographic contents can be patients’ data like text documents, clinical photos, videos, 2D medical images, 3D models generated by image processing, virtual surgical planning, or orthopaedic implants in CAD format. Users can interact with the holograms by a pair of motion controllers, hand gestures, voice commands, or eye gaze (Figure 3A and B, Video S1). Holograms can be enlarged, rotated, or moved to suitable physical spaces to facilitate access to medical information while surgeons perform their tasks. One distinct advantage of MR technology is that the holograms and the primary surgeon’s field of view (FoV) can be shared in real time with other users wearing MR HMDs or with computer users via the internet.32 Multiple users can work together more efficiently if they view and directly interact with the same holographic information.
Figure 3 (A) shows the 3D-printed physical model (red arrow) and holographic virtual model in a patient with a pelvic giant cell tumor of bone. When viewing holograms, surgeons use hand gestures or voice commands to call up information instead of touching a keyboard or mouse, keeping their hands free for the clinical tasks. Although the 3D-printed physical model gives tactile feedback, the surgeon needs to hold it by hand, and it provides no image feedback. Users can also analyze different sections of holographic bone models by enlarging, moving, and rotating the holographic contents with hand gestures. (B) shows the coronal view of the holographic CT bone model in a patient with a giant cell tumor involving the left femoral head and neck. The surgeon can view the different slices of the virtual model’s coronal, sagittal, and axial views by controlling the virtual buttons (red arrows). Video S1.
In complex orthopaedic oncology surgery, surgeons may call up and get advice from seniors or experts in remote locations. The point-of-view live video feed of the operating field provides more and better information than the audio description by surgeons who traditionally communicate by phone (Figure 4). On the other hand, remote users can bring critical information into the surgeon’s line of sight via an MR HMD for immediate reference, including images, implant operative details, 3D models, or other helpful information. Meanwhile, surgeons keep their hands free to perform surgery. Therefore, MR technology may provide on-demand critical information for intraoperative reference and interactive, intuitive remote assistance or discussion. The real-time immersive discussion and sharing of holographic 2D/3D medical information in MR technology may create a new dimension for telemedicine. It is in stark contrast to the traditional telemedicine systems that require separate modules for image processing, audio transmission, and video capture.33 It potentially connects healthcare providers worldwide for collaboration. Therefore, surgeons may have the ability to go beyond the confines of the computer screen and the real surgical environment for remote connection. It may contribute an essential element to the future development of the Metaverse in the orthopaedic community, a computer-generated 3D digital world in which surgeons can virtually interact with other people and objects.
Figure 4 Depicts the conceptual design of the proposed function of remote assistance (red arrow) in the User Interface of the authors’ developed MR platform for a patient with left scapular osteosarcoma undergoing tumor resection with the assistance of a 3D-printed cutting guide. The remote surgeon/expert can view the same holographic contents and, indirectly, the operative field via the RGB camera of the surgeon’s MR Head-Mounted Display (HMD). The remote surgeon can text, annotate the operative photos, or bring in critical medical information. Therefore, the function facilitates instant, intuitive visual and audio communication for timely operative decisions or support, in contrast to traditional communication.
Clinical Workflow of Mixed Reality Technology
CT and MRI are complementary preoperative images for planning complex orthopaedic bone tumor resection and reconstruction.34 In conventional orthopaedic tumor surgery, surgeons mentally integrate all preoperative 2D images, formulate a 3D surgical plan and then translate the mental surgical plan in the operating room. Like computer-assisted tumor surgery (CATS)21 and 3D printing-assisted surgical procedures,26 mixed reality technology shares the same initial clinical workflow of patient-specific surgical planning that includes image acquisition and image processing to create digital 3D models (Figure 5).
Figure 5 Summarizes the proposed clinical workflow of creating mixed reality applications from medical image acquisition to holographic applications in Orthopaedic Oncology.
High-quality digital medical images are first acquired. A digital medical image is a 2D array of pixels. Modern multidetector computed tomography (MDCT) and MRI provide accurate image data with high resolution. All acquired medical images are saved in Digital Imaging and Communications in Medicine (DICOM), a standard data format to store, exchange, and transmit medical images. As bone has a high-contrast signal in CT images and MDCT can produce axial images with a thin slice thickness of less than 1 mm, CT images are ideal for image processing in 3D orthopaedic applications.35 MRI better delineates the soft tissue anatomy like skeletal muscle, neural structures, and medullary or extraosseous extension in orthopaedic bone tumors. However, thin MRI slices are not routinely performed as patients’ movement artifacts compromise the image quality during the long image acquisition time.34 Therefore, 3D image processing is based mainly on CT images, with MRI supplementing the soft tissue information.
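As an illustration of how CT pixel data become physically meaningful values, the sketch below converts stored DICOM pixel values to Hounsfield Units (HU) using the standard RescaleSlope and RescaleIntercept tags. This is a minimal, hedged example on synthetic data; in practice the values would come from a DICOM reader such as pydicom, and the specific numbers here are illustrative only.

```python
import numpy as np

def to_hounsfield(stored_pixels, rescale_slope, rescale_intercept):
    """Convert stored CT pixel values to Hounsfield Units (HU) using the
    DICOM RescaleSlope (0028,1053) and RescaleIntercept (0028,1052) tags."""
    return stored_pixels.astype(np.float32) * rescale_slope + rescale_intercept

# Synthetic 2D slice of stored values; real data would come from a DICOM
# file (e.g. pydicom's dcmread().pixel_array).
stored = np.array([[0, 1000], [2000, 3000]], dtype=np.int16)
hu = to_hounsfield(stored, rescale_slope=1.0, rescale_intercept=-1024.0)
print(hu)  # air is about -1000 HU, water about 0 HU, cortical bone > +700 HU
```

The HU scale is what makes bone stand out so clearly in CT and underpins the threshold-based segmentation used later in the workflow.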
Computer-aided Design (CAD) software is used for virtual surgical planning (VSP) in patient-specific orthopaedic applications.26 VSP defines surgical problems in 3D and creates a surgical plan that can be precisely replicated in the operating room. DICOM images are reformatted to generate multi-planar images with axial, coronal, and sagittal views for better spatial image interpretation. CT is co-registered with MRI if necessary to provide additional soft tissue information. Image segmentation is performed semi-automatically to generate 3D models from the regions of interest in the image data. Tumor extent is mapped on CT or MRI. The geometry of the 3D model is transformed into a series of smaller components, triangles or polygons, the number of which directly correlates with the model resolution.36 Complicated 3D models of any shape are thus created out of the polygons from 2D digital medical images. These polygon-based CAD models are easier to render and visualize. The CAD models can be edited for surgical simulation, biomechanical analysis, implant, or cutting guide design. They need to be optimized using 3D computer graphics software before creating the holographic applications in a 3D engine.
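The segmentation step described above can be sketched in its simplest form as HU thresholding: voxels denser than a bone-level HU cutoff are flagged as bone. This is a simplified stand-in for the semi-automatic segmentation the workflow actually uses (which adds region growing and manual editing before meshing), and the threshold value is illustrative.

```python
import numpy as np

def segment_bone(hu_volume, threshold_hu=250.0):
    """Return a binary mask of voxels above a bone-density HU threshold.
    Real pipelines refine such masks semi-automatically before converting
    them to polygon meshes (e.g. with a marching-cubes step)."""
    return hu_volume > threshold_hu

# Synthetic CT volume: a dense "bone" cube inside soft-tissue background.
volume = np.full((16, 16, 16), 40.0)   # soft tissue, roughly 40 HU
volume[4:12, 4:12, 4:12] = 700.0       # cortical bone, roughly 700 HU
mask = segment_bone(volume)
print(int(mask.sum()))  # 512 voxels (8 x 8 x 8) flagged as bone
```

The binary mask is the intermediate between the raw images and the triangle mesh: a surface-extraction algorithm then turns the mask boundary into the polygons that form the CAD model.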
A 3D engine, also called a game engine, is a technology used for virtual computer simulations and for creating the User Interface (UI) of the holographic contents. With the rendering component of the 3D engine, which calculates the visual appearance of a scene, computer graphics convert CAD models into 2D images with photorealistic, or near-photorealistic, effects. This conversion to the Cinematic-Rendered (CR) model is based on random-sampling computational algorithms. They use different lightmaps to simulate the actual way light works to generate a photorealistic depiction of 3D models (Figure 6A–D, Video S2).37 The interaction of the CR models, 2D DICOM images, and patient information documents is handled by the physics component of the 3D engine. The 3D rendering of the holographic contents is currently performed offline as a pre-rendering process. The 3D engine integrates the holograms’ 3D rendering and interactive features before generating a patient-specific holographic application, which is loaded and run on the software platform of MR HMDs. The optical see-through HMDs allow surgeons to concurrently view the real scene and digital medical contents as interactive holograms.
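The handoff from the CAD pipeline to the 3D engine typically happens through a mesh interchange file. As an illustrative sketch (not the authors' specific toolchain), the snippet below serializes a triangle mesh to Wavefront OBJ, one of the simple text formats that 3D engines commonly import; note that OBJ face indices are 1-based.

```python
def write_obj(vertices, faces):
    """Serialize a triangle mesh to Wavefront OBJ text, a simple
    interchange format that 3D engines can import. OBJ "v" lines list
    vertex coordinates; "f" lines reference vertices with 1-based indices."""
    lines = [f"v {x} {y} {z}" for x, y, z in vertices]
    lines += [f"f {a + 1} {b + 1} {c + 1}" for a, b, c in faces]
    return "\n".join(lines) + "\n"

# A single triangle as a minimal mesh (vertices, then one face).
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tris = [(0, 1, 2)]
obj_text = write_obj(verts, tris)
print(obj_text)
```

Because the HMD has limited processing power, the mesh exported this way is usually decimated (the polygon count reduced) before being packaged into the holographic application.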
Optical see-through HMDs became commercially available after 2014 and were reported for potential AR-assisted orthopaedic surgery.38,39 Google Glass (Alphabet Inc., Mountain View, CA, US) was described in AR-assisted pedicle screw insertion.31 The device attracted much attention in 2014, but interest decreased from 2015 to 2017 due to reduced support from Google.39 Moverio BT-200 (Seiko Epson Corp., Nagano, Japan) was released in 2014, and Moverio BT-300, an upgraded version, in 2016.39 These HMDs were used to assist free fibular graft harvesting40 and percutaneous endoscopic lumbar discectomy.41 nVisor ST 600 (NVIS Inc., Reston, VA, USA) was investigated for sacroiliac joint screw insertion.42 The two MR HMDs most commonly used for surgical applications are HoloLens 2 (Microsoft Corporation, Redmond, WA, US) and Magic Leap One (Magic Leap Inc., Plantation, Florida, US) (Table 1). Both commercially available MR HMDs are low-end computing devices; they cannot render holographic content in real time with the very low latency common in video games or interactive graphics.
Table 1 Compares the Commercially Available Mixed Reality Head-Mounted Displays for Surgical Applications
Figure 6 Shows the Computer-Aided Design (CAD) models (A and B) and the Cinematic-Rendered (CR) models (C and D) of a man with a left pelvic giant cell tumor and a woman with a left scapular osteosarcoma, respectively. Computer graphics software generates the CR models according to data dictating the image’s color, material, and texture. It also determines the appropriate light source to simulate the natural way light works on the polygon-based CAD models. As a result, the CR models give a more photorealistic representation of the 3D CAD models and better identify the pathological anatomy of bone tumors. Before creating the holographic application in the 3D engine, the CR models need optimization by software, like reducing the polygon face count and refining the polygons’ quality for better visualization performance in the MR Head-Mounted Display (HMD) with its low-end computing processing power. Video S2.
Literature Review and Potential Applications in Orthopaedic Oncology
Clinical studies of XR technology, either AR or MR, are limited in surgical applications. XR technology has been reported to directly visualize patients’ internal anatomy with overlaid virtual models and to position operative steps in real time during neurosurgical procedures, like skull base surgery,43,44 cerebral aneurysm45 or vascular malformation surgery,46 and guiding ventricular drain insertion.47 The technology has also been applied to oral and maxillofacial surgeries.48 It may help localize vital nerve structures49 or improve surgical planning and execution in complex facial transplantation.50 A case report of MR-guided liver cancer resection showed that the holographic models provided real-time reference of the critical surgical anatomy and facilitated complex liver tumor resection.51
Reports on XR technology, either AR or MR, using HMDs in orthopaedic surgery are rare as orthopaedic surgeons are still in the early phase of realizing the technology and exploiting its potential. The currently available preclinical studies, case reports, or small case series mainly investigated the feasibility of pedicle screw insertion in spine surgery,31,52–55 guide wire insertion in fracture fixation42,56–58 or placement of implant components in hip resurfacing59 and reverse shoulder arthroplasty.32,60 Table 2 summarizes the literature on XR technology using HMDs in orthopaedic surgical applications. Early evidence suggested that XR-guided pedicle screw insertion was technically feasible in the thoracolumbar spine with acceptable surgical accuracy, similar to navigation guidance.53,54 It potentially reduces operative time by eliminating attention shifts so that surgeons remain focused on the surgical field.31,54 To date, only one AR spinal navigation platform has been approved for pedicle screw insertion by the United States Food and Drug Administration.54 Studies on XR-guided fracture fixation or arthroplasty remain preliminary. One distinct advantage is that the technology gave surgeons real-time access to patient data and online discussion with remote colleagues via an MR HMD while the surgeons’ hands could stay sterile during the operation.32
Table 2 Summarizes Studies Reporting Augmented Reality (AR)- or Mixed Reality (MR)-Guided Orthopaedic Surgery Using Head-Mounted Displays (HMDs)
Only one tumor case has been reported using MR technology to guide bone resection.61 An en-bloc spondylectomy of an L1 chordoma was performed using osteotomes under the guidance of holographic 2D axial CT images. The workflow did not involve image processing with tumor segmentation or 3D model generation, unlike computer navigation tumor surgery. MR technology has not yet been clinically reported for extremity or pelvic tumors in orthopaedic oncology. However, two preclinical studies using AR-based navigation on a tablet Personal Computer (PC) may give insights into its potential.62,63 Two groups of surgeons performed simulated tumor resections in pig pelvis or femur models, using either AR-based navigation on a tablet PC or a conventional manual method. The mean deviation error of less than 2 mm in the bone resection margin suggested that AR assistance could help achieve the planned margin in a simpler and less costly manner when compared with conventional navigation systems. However, the system only adopted a simple CAD virtual resection template for AR-guided resection instead of medical images of the simulated bone models.
In orthopaedic oncology, the role of the emerging MR technology is not defined, and no dedicated platform is commercially available. Currently, all reported orthopaedic studies tend to investigate XR technology as an image-guided navigation system. However, no studies have systematically explored how the unique MR features, like real-time interaction with holographic 3D models and remote connection with other colleagues, may impact traditional orthopaedic oncology practice. The potential applications may comprise:
Point of Care Surgical Planning and Procedures
Unlike the fixed directions of movement or rotation when manipulating 3D models with the mouse on a PC, MR technology enhances surgeons’ experience through intuitive interaction with the virtual models by hand. Surgeons can view holographic 3D models in their immediate environment, inside or outside the operating room, at the clinical point of care. Surgeons can access and manipulate virtual models with hands in sterile gloves as the operation proceeds. Instead of relying on mental pictures projected onto the patient’s anatomy, the technology facilitates surgical planning during preoperative assessment and intraoperative execution for tumor patients (Figure 7A–C). The natural MR setup with headsets eliminates surgeons’ attention shifts with improved ergonomics, as surgeons do not need to constantly shift their view between the monitor and the operative field as in computer navigation surgery. Also, the handy setup may improve operating room efficiency in contrast to the bulky facilities of the computer navigation system. No additional lead-time is required in preparing the virtual models for surgical planning or procedures compared with 3D-printed physical models. These advantages may improve surgeons’ access to and adoption of the technology even in less complex tumor cases.
Figure 7 Shows that the MR technology provides instant and on-demand critical medical information at the clinical point of care inside or outside the operating rooms for surgical planning and intraoperative reference. (A) Implant information like screw dimensions for implant fixation in a patient with a pelvic giant cell tumor undergoing tumor resection and 3D-printed custom pelvic prosthetic reconstruction. (B) The representative MRI images in a patient with a solitary T2 bone metastasis undergoing combined anterior and posterior spinal tumor resection and instrumented fixation. (C) The surgical resection planning in PDF file format in a patient with a low-grade bone sarcoma of the left tibia undergoing intercalary tumor resection under the assistance of a 3D-printed resection guide and reconstruction with a vascularized fibular graft transfer.
Better Tumor Localization and Surgical Access
Direct overlay of the holographic models on the patient enables surgeons to directly “see through” the patient’s internal anatomy while still in contact with the physical environment. It helps surgeons understand the tumor’s exact spatial localization underneath the skin (Figure 8A and B). Even with unfamiliar anatomy, a more precise skin incision and surgical access to the targeted tumor can be achieved. It may mitigate surgical invasiveness. No invasive tracker attached to the patient’s bone is needed, as in computer navigation-guided surgery.
Figure 8 (A) The holographic bone model was overlaid on the left hip region in a patient with a femoral head and neck giant cell tumor of bone. Surgeons could directly “see through” the patient’s internal anatomy, facilitating precise skin marking for the surgical access (yellow arrows). It did not involve invasive placement of a bone tracker on the patient. (B) shows the intraoperative picture of a patient with a right calcaneal chondrosarcoma. After surgical exposure of the calcaneal tumor, the surgeon overlaid the holographic bone model onto the patient’s actual calcaneus. Together with the reference of the tumor information on the holographic MRI images, the surgeon marked the osteotomy line (yellow arrow) and performed the guided osteotomy to preserve the bone insertion of the Achilles tendon for better functional reconstruction.
Real-Time and on-Demand 2D/3D Medical Information
Traditionally, surgeons have little access to patient data or other resources once surgery begins. MR technology potentially brings critical medical information into the surgeon’s point of view in real time and on demand (Figure 4). Remote collaborators can provide reference images, 3D models, operative or implant manuals, and other helpful information to the surgeon’s physical space so that surgeons can make timely decisions while working heads-up and hands-free on MR HMDs. It may increase the cleanliness of the operating rooms as traditional access to medical information via printouts or computer stations is not required. Remote expert surgeons can be contacted online for advice in complex operative situations. It indirectly brings expertise to the operating room while the surgeons giving the advice feel as if they were in the operating room. The overall surgical time may be reduced with more efficient procedures and improved workflow in the operating room.31 The Metaverse, a virtual 3D digital operating room, may be a new strategy to facilitate surgeries and improve patient outcomes in the future.
Training and Education
Bone and soft tissue sarcoma can occur in various musculoskeletal regions with different sizes and involvement. Sarcoma surgery to achieve a negative resection margin is challenging when the tumor involves complex anatomical structures or nearby vital neurovascular structures. Due to the rarity of sarcoma, orthopaedic oncology surgeons may not gain vast experience, and the relevant expertise often requires a long period of training. MR technology may help young surgeons understand the spatial relations of tumors and anatomical structures and simulate the planned operations with different surgical approaches and methods before the actual surgeries in the operating rooms.64 It may improve operating room workflow and shorten the training period of orthopaedic tumor surgeons. Combined with 3D-printed models, MR technology was applied to the simulation training of neurosurgical procedures like lateral ventriculocentesis,65 craniofacial procedures, and endoscopic sinus surgery.66 The surgeons who received MR training models could achieve a higher success rate than those who received conventional training. Telementoring is feasible as instructors can also supervise trainees via this MR training model.67 Therefore, this new MR training approach may allow simulated, task-oriented surgical training in a dedicated virtual environment. At the same time, surgeons still retain the visual and tactile feedback of the surgery with 3D-printed patient models and real-life surgical instruments in the real physical world. It has excellent potential in orthopaedic oncology training.
Limitations and Challenges of MR in Orthopaedic Oncology
Although MR technology offers great promise in orthopaedic oncology through its point-of-care capability with enhanced visualization of medical images, real-time on-demand provision of medical information, and remote assistance, the limitations of this emerging technology should be noted (Table 3). As MR technology is still in the early development stage, we anticipate great room for improvement, with its role to be determined in future preclinical and clinical studies. The concerns about the technology include the following.
Table 3 Compares the Mixed Reality Technology with Existing Computer Navigation and 3D Printing in Orthopaedic Oncology
An MR HMD has an optimal view distance and angle for viewing clear holograms. For example, HoloLens 2 (Microsoft Corporation, Redmond, WA, USA) has a view distance of approximately 2 m from the user and a view angle of 52°. This frontal and non-vertical view angle may not be ideal for surgical procedures that require surgeons to look head-down at the operative field. The optimal view distance is also unknown for different procedures in various regions. Holographic application developers may design where surgeons’ eyes converge by placing content and holograms at various depths for different surgical settings.
The bright ambient light of operative lamps reduces the viewing quality of holograms. An MR HMD with dynamic dimming capability may be developed to work in the varied lighting conditions of clinical settings. The low-end computer processors in existing MR HMDs do not support advanced real-time rendering of 3D holograms as a PC does. Hardware development may include advances in mobile batteries for prolonged use, simplified volume rendering of 3D models, and developments in HMD equipment such as cordless HMDs. The biggest problem with 3D holograms is the mismatch between the virtual model and the real world: the current accuracy of manually registering virtual holograms onto the patient’s actual anatomy is suboptimal for surgical tasks that require extreme precision. Studies are necessary to determine if MR technology can improve the efficiency of the operating rooms and patients’ surgical outcomes. MR technology has been utilized during the coronavirus disease (COVID-19) pandemic to minimize staff exposure to nosocomial infection and improve access to and quality of patient care.68 A standardized infection control protocol for wearing the MR device with Personal Protection Equipment and for device decontamination is mandatory.
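The registration mismatch noted above is commonly quantified with point-based rigid registration: paired fiducial points on the virtual model and the patient are aligned, and the residual misalignment is reported as a fiducial registration error. The sketch below is an illustrative implementation of this general technique (the Kabsch algorithm via SVD) on synthetic, noise-free points; it is not the registration method of any specific MR platform.

```python
import numpy as np

def kabsch_register(source, target):
    """Find rotation R and translation t minimizing the misfit of
    R @ source + t against target (Kabsch algorithm via SVD), as used
    in point-based rigid registration."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def fiducial_registration_error(source, target, R, t):
    """Root-mean-square distance between registered and target fiducials."""
    residual = (R @ source.T).T + t - target
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

# Synthetic fiducials: target is the source rotated 90 degrees about z
# and translated, so a perfect registration should recover FRE near zero.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
tgt = (Rz @ src.T).T + np.array([5.0, 2.0, 1.0])
R, t = kabsch_register(src, tgt)
fre = fiducial_registration_error(src, tgt, R, t)
print(fre)  # near zero for noise-free points
```

With real anatomy, fiducial localization noise and manual hologram placement inflate this error, which is why current manual registration remains suboptimal for high-precision tasks.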
The current workflow of MR technology involves multiple software packages and processing steps, from medical image acquisition to the final MR application. Surgeons, radiologists, and biomedical engineers may not be familiar with the entire process of the new technology. An integrated, unified software platform is crucial for seamless collaboration among healthcare providers. A dedicated MR platform built on an existing MR HMD is essential to investigate the technology's potential to address the limitations of computer navigation and 3D printing in orthopaedic oncology. The platform development should also be led by the clinical end-users in orthopaedic oncology so that integration into the clinical workflow is addressed at the outset. A user-friendly interface with stepwise data input and automated generation of the holographic application is preferable to reduce the lead-time in case preparation. Surgeons and engineers should work together to customize the holographic user interface to facilitate surgical applications. Machine learning algorithms that enhance the quality of raw medical image data can be explored to further improve 3D model creation.69 As sensitive patient data may be shared electronically for clinical care, appropriate information security and governance should be deployed according to the information technology policies of the individual hospital and country.68 High initial costs and time-consuming staff training can also be obstacles when adopting new technologies.
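As one concrete example of the processing steps mentioned above, 3D model creation typically begins by segmenting bone from the CT data, most simply by thresholding Hounsfield units (cortical bone typically measures several hundred HU). The sketch below uses a tiny synthetic volume and an assumed threshold of 300 HU; real pipelines use dedicated segmentation software, with manual refinement, before surface meshing.

```python
BONE_HU_THRESHOLD = 300  # assumed cutoff; cortical bone lies well above it

def segment_bone(volume):
    """Return a binary mask (nested lists) marking voxels at or above
    the bone threshold in a slice-by-row-by-voxel CT volume."""
    return [[[1 if hu >= BONE_HU_THRESHOLD else 0 for hu in row]
             for row in slice_] for slice_ in volume]

# Tiny synthetic 2x2x2 CT volume in Hounsfield units
# (air ~ -1000, soft tissue ~ 0-100, bone > 300).
ct = [[[-1000, 40], [350, 1200]],
      [[60, 500], [-100, 700]]]
mask = segment_bone(ct)
bone_voxels = sum(v for s in mask for r in s for v in r)
print(bone_voxels)  # → 4
```

The resulting binary mask is what downstream steps convert into the surface mesh rendered as a hologram, which is why image quality at this stage propagates into the final 3D model.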
Users may experience discomfort when using MR technology with HMDs, including temporary nausea, motion sickness, dizziness, disorientation, headache, fatigue, eye strain, or dry eyes. In a study evaluating simulation sickness in 142 subjects who performed specific tasks with the Microsoft HoloLens in three industries (aviation, medicine, and space), most users had no symptoms and only a few experienced minimal discomfort.70 With the advent of newer MR HMDs, users' discomfort may be further reduced. As with the introduction of any new technology, a steep learning curve is a concern, and inexperienced users may be unable to cope with system failures during surgery. However, tumor surgeons have been shown to shorten computer navigation procedure time as they gain experience.22 Therefore, the initial learning curve may not hinder the adoption of MR technology in orthopaedic oncology.
As MR technology is still at an early stage, data on users' workloads during the learning curve and on the extent of discomfort caused by wearing HMDs for extended periods are limited. The surgical task load associated with MR technology has yet to be investigated. It can be evaluated with validated questionnaires such as the System Usability Scale or the National Aeronautics and Space Administration Task Load Index (NASA-TLX).
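For reference, the weighted NASA-TLX combines six subscale ratings (0 to 100) with weights derived from 15 pairwise comparisons between the subscales; the overall workload is the weighted sum of ratings divided by 15. A minimal sketch with entirely hypothetical ratings and weights:

```python
def nasa_tlx(ratings, weights):
    """Weighted NASA-TLX score: ratings are 0-100 per subscale; weights
    count how often each subscale won its 15 pairwise comparisons."""
    assert sum(weights.values()) == 15, "weights must total 15 comparisons"
    return sum(ratings[k] * weights[k] for k in ratings) / 15.0

# Hypothetical responses from one user after an MR-assisted task.
ratings = {"mental": 70, "physical": 30, "temporal": 55,
           "performance": 40, "effort": 65, "frustration": 25}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
print(round(nasa_tlx(ratings, weights), 1))  # → 56.3
```

Scores collected this way before and after training sessions would quantify how the workload of MR-assisted tasks changes along the learning curve.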
Preclinical and clinical evidence for MR applications is too limited to support their routine use. It remains to be determined whether the potential benefits of MR technology translate into better clinical results in orthopaedic oncology. The initial workflow of image acquisition, processing, and virtual CAD surgical simulation in MR technology is essentially the same as in computer navigation21 and 3D printing.26 Studies may therefore focus on the unique features of MR technology, such as the use of cinematic-rendered versus CAD models in virtual surgical planning; virtual navigation planning on a PC or surgical planning with 3D-printed models versus MR planning with HMDs; and the role of clinical point-of-care with remote assistance. The current technology does not support MR-guided bone resection with bone target accuracy equivalent to that of computer navigation and 3D printing. Accurate hologram-to-patient registration and tool tracking in the MR setting should be developed and tested in preclinical studies. Before MR technology can be widely adopted in orthopaedic oncology, careful clinical research is required to prove its safety, clinical effectiveness, and cost-effectiveness. MR technology may even integrate computer navigation and 3D-printed surgery into the existing clinical workflow for orthopaedic tumors of varying complexity.
The emerging MR technology disrupts the traditional way digital medical information is visualized and shared in real time in surgical practice. It adds a new dimension to digital assistive tools as a more accessible and less costly alternative in orthopaedic oncology that may address the limitations of existing computer navigation and 3D printing techniques. The unique MR features of enhanced medical image visualization and interaction with holograms give surgeons real-time, on-demand medical information and remote assistance within their immediate working environment. The MR HMD with hands-free control may deliver clinical point-of-care inside or outside the operating room and potentially improve service efficiency and patient safety. MR and other computer-assisted technologies are complementary rather than mutually exclusive for the same surgical goals. Therefore, utilization and integration of MR technology into the existing computer-assisted workflow may be a critical component in building the next generation of orthopaedic oncology tools. However, the absence of an accurate hologram-to-patient registration method, the lack of an MR platform dedicated to orthopaedic oncology, and the scarcity of clinical results may hinder wide adoption of the technology. Industry-academic partnerships are essential to advance the technology, with its clinical role to be determined through future clinical studies.
We thank the biomedical engineers, Mr. Christopher Bünning and Mr. Mirko Steffen (implantcast GmbH, Buxtehude, Germany), for designing the 3D-printed patient-specific resection guides and implants for the two patients described in this article.
This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
The authors have no conflicts of interest relevant to this article.
1. Fuchs B, Hoekzema N, Larson DR, Inwards CY, Sim FH. Osteosarcoma of the pelvis: outcome analysis of surgical treatment. Clin Orthop Relat Res. 2009;467:510–518. doi:10.1007/s11999-008-0495-x
2. Bertrand TE, Cruz A, Binitie O, Cheong D, Letson GD. Do surgical margins affect local recurrence and survival in extremity, nonmetastatic, high-grade osteosarcoma? Clin Orthop Relat Res. 2016;474(3):677–683. doi:10.1007/s11999-015-4359-x
3. He F, Zhang W, Shen Y, et al. Effects of resection margins on local recurrence of osteosarcoma in extremity and pelvis: systematic review and meta-analysis. Int J Surg. 2016;36(Pt A):283–292. doi:10.1016/j.ijsu.2016.11.016
4. Cho HS, Oh JH, Han I, Kim HS. The outcomes of navigation-assisted bone tumour surgery: minimum three-year follow-up. J Bone Joint Surg Br. 2012;94-B(10):1414–1420.
5. Wong KC, Kumta SM. Computer-assisted tumor surgery in malignant bone tumors. Clin Orthop Relat Res. 2013;471(3):750–761. doi:10.1007/s11999-012-2557-3
6. Jeys L, Matharu GS, Nandra RS, Grimer RJ. Can computer navigation-assisted surgery reduce the risk of an intralesional margin and reduce the rate of local recurrence in patients with a tumour of the pelvis or sacrum? Bone Joint Lett J. 2013;95-B(10):1417–1424. doi:10.1302/0301-620X.95B10.31734
7. Wong KC, Kumta SM. Joint-preserving tumor resection and reconstruction using image-guided computer navigation. Clin Orthop Relat Res. 2013;471(3):762–773. doi:10.1007/s11999-012-2536-8
8. Li J, Wang Z, Guo Z, Chen GJ, Yang M, Pei GX. Precise resection and biological reconstruction under navigation guidance for young patients with juxta-articular bone sarcoma in lower extremity: preliminary report. J Pediatr Orthop. 2014;34(1):101–108. doi:10.1097/BPO.0b013e31829b2f23
9. Gerbers JG, Stevens M, Ploegmakers JJW, Bulstra SK, Jutte PC. Computer-assisted surgery in orthopedic oncology: technique, indications, and a descriptive study of 130 cases. Acta Orthop. 2014;85(6):663–669. doi:10.3109/17453674.2014.950800
10. Abraham JA, Kenneally B, Amer KBS, Geller DS. Can navigation-assisted surgery help achieve negative margins in resection of pelvic and sacral tumors? Clin Orthop Relat Res. 2018;476(3):499–508. doi:10.1007/s11999.0000000000000064
11. Bosma SE, Cleven AHG, Dijkstra PDS. Can navigation improve the ability to achieve tumor-free margins in pelvic and sacral primary bone sarcoma resections? A historically controlled study. Clin Orthop Relat Res. 2019;477(7):1548–1559. doi:10.1097/CORR.0000000000000766
12. Wong KC, Kumta SM, Sze KY, Wong CM. Use of a patient specific CAD/CAM surgical jig in extremity bone tumor resection and custom prosthetic reconstruction. Comput Aided Surg. 2012;17(6):284–293. doi:10.3109/10929088.2012.725771
13. Gouin F, Paul L, Odri GA, Cartiaux O. Computer-assisted planning and patient specific instruments for bone tumor resection within the pelvis: a series of 11 patients. Sarcoma. 2014;842709. doi:10.1155/2014/842709
14. Jentzsch T, Vlachopoulos L, Fürnstahl P, Müller DA, Fuchs B. Tumor resection at the pelvis using three-dimensional planning and patient-specific instruments: a case series. World J Surg Oncol. 2016;14(1):249. doi:10.1186/s12957-016-1006-2
15. Evrard R, Schubert T, Paul L, Docquier PL. Resection margins obtained with patient specific instruments for resecting primary pelvic bone sarcomas: a case-control study. Orthop Traumatol Surg Res. 2019;105(4):781–787. doi:10.1016/j.otsr.2018.12.016
16. Jud L, Müller DA, Fürnstahl P, Fucentese SF, Vlachopoulos L. Joint-preserving tumour resection around the knee with allograft reconstruction using three dimensional preoperative planning and patient-specific instruments. Knee. 2019;26(3):787–793. doi:10.1016/j.knee.2019.02.015
17. Fujiwara T, Kaneuchi Y, Stevenson J, et al. Navigation-assisted pelvic resections and reconstructions for periacetabular chondrosarcomas. Eur J Surg Oncol. 2021;47(2):416–423. doi:10.1016/j.ejso.2020.05.025
18. Fujiwara T, Sree DV, Stevenson J, et al. Acetabular reconstruction with an ice-cream cone prosthesis following resection of pelvic tumors: does computer navigation improve surgical outcome? J Surg Oncol. 2020;121(7):1104–1114. doi:10.1002/jso.25882
19. Wong KC. Computer-assisted musculoskeletal surgery: thinking and executing in 3D. In: Chapter 6, Introduction to Surgical Navigation. Springer; 2016:59–70. doi:10.1007/978-3-319-12943-3_6
20. Léger É, Drouin S, Collins DL, Popa T, Kersten-Oertel M. Quantifying attention shifts in augmented reality image-guided neurosurgery. Healthc Technol Lett. 2017;4(5):188–192. doi:10.1049/htl.2017.0062
21. Wong KC, Kumta SM. Use of computer navigation in orthopedic oncology. Curr Surg Rep. 2014;2:47. doi:10.1007/s40137-014-0047-0
22. Farfalli GL, Albergo JI, Ritacco LE, et al. What is the expected learning curve in computer-assisted navigation for bone tumor resection? Clin Orthop Relat Res. 2017;475:668–675. doi:10.1007/s11999-016-4761-z
23. Wong KC, Sze KY, Wong IO, Wong CM, Kumta SM. Patient-specific instrument can achieve same accuracy with less resection time than navigation assistance in periacetabular pelvic tumor surgery: a cadaveric study. Int J Comput Assist Radiol Surg. 2016;11(2):307–316. doi:10.1007/s11548-015-1250-x
24. Bosma SE, Wong KC, Paul L, Gerbers JG, Jutte PC. A cadaveric comparative study on the surgical accuracy of freehand, computer navigation, and patient-specific instruments in joint-preserving bone tumor resections. Sarcoma. 2018;2018:4065846. doi:10.1155/2018/4065846
25. McCulloch RA, Frisoni T, Kurunskal V, Maria Donati D, Jeys L. Computer navigation and 3D printing in the surgical management of bone sarcoma. Cells. 2021;10(2):195. doi:10.3390/cells10020195
26. Wong KC. 3D-printed patient-specific applications in orthopedics. Orthop Res Rev. 2016;8:57–66. doi:10.2147/ORR.S99614
27. Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Trans Inform Syst. 1994;E77-D(12):1321–1329.
28. Sielhorst T, Feuerstein M, Navab N. Advanced medical displays: a literature review of augmented Reality. J Display Technol. 2008;4(4):451–467. doi:10.1109/JDT.2008.2001575
29. Eckardt C, Paulo EB. Heads-up surgery for vitreoretinal procedures: an experimental and clinical study. Retina. 2016;36(1):137–147. doi:10.1097/IAE.0000000000000689
30. Chang JYC, Tsui LY, Yeung KSK, Yip SWY, Leung GKK. Surgical vision: Google Glass and surgery. Surg Innov. 2016;23(4):422–426. doi:10.1177/1553350616646477
31. Yoon JW, Chen RE, Han PK, Si P, Freeman WD, Pirris SM. Technical feasibility and safety of an intraoperative head-up display device during spine instrumentation. Int J Med Robot Comput Assist Surg. 2016;13(3):e1770. doi:10.1002/rcs.1770
32. Gregory TM, Gregory J, Sledge J, Allard R, Mir O. Surgery guided by mixed reality: presentation of a proof of concept. Acta Orthop. 2018;89(5):480–483. doi:10.1080/17453674.2018.1506974
33. Lu L, Wang H, Liu P, et al. Applications of mixed reality technology in orthopedics surgery: a pilot study. Front Bioeng Biotechnol. 2022;10:740507. doi:10.3389/fbioe.2022.740507
34. Wong KC, Kumta SM, Antonio GE, Tse LF. Image fusion for computer-assisted bone tumor surgery. Clin Orthop Relat Res. 2008;466(10):2533–2541. doi:10.1007/s11999-008-0374-5
35. Dalrymple NC, Prasad SR, Freckleton MW, Chintapalli KN. Informatics in radiology (info-RAD): introduction to the language of three-dimensional imaging with multidetector CT. RadioGraphics. 2005;25:1409–1428.
36. Mangrulkar A, Rane S, Sunnapwar V. Image-based bio-CAD modeling: overview, scope, and challenges. J Phys Conf Ser. 2020;1706:012189.
37. Dappa E, Higashigaito K, Fornaro J, Leschka S, Wildermuth S, Alkadhi H. Cinematic rendering – an alternative to volume rendering for 3D computed tomography imaging. Insights Imaging. 2016;7:849–856. doi:10.1007/s13244-016-0518-1
38. Morimoto T, Kobayashi T, Hirata H, et al. XR (Extended Reality: virtual Reality, Augmented Reality, Mixed Reality) technology in spine medicine: status quo and quo vadis. J Clin Med. 2022;11(2):470. doi:10.3390/jcm11020470
39. Birlo M, Edwards PJE, Clarkson M, Stoyanov D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: a systematic review. Med Image Anal. 2022;77:102361. doi:10.1016/j.media.2022.102361
40. Pietruski P, Majak M, Świątek-Najwer E, et al. Supporting fibula free flap harvest with augmented reality: a proof-of-concept study. Laryngoscope. 2020;130(5):1173–1179. doi:10.1002/lary.28090
41. Liounakos JI, Urakov T, Wang MY. Head-up display assisted endoscopic lumbar discectomy – a technical note. Int J Med Robot Comput Assist Surg. 2020;16(3):e2089. doi:10.1002/rcs.2089
42. Wang H, Wang F, Leong AP, Xu L, Chen X, Wang Q. Precision insertion of percutaneous sacroiliac screws using a novel augmented reality-based navigation system: a pilot study. Int Orthop. 2016;40(9):1941–1947. doi:10.1007/s00264-015-3028-8
43. Cabrilo I, Sarrafzadeh A, Bijlenga P, et al. Augmented reality-assisted skull base surgery. Neuro-Chirurgie. 2014;60:304–306. doi:10.1016/j.neuchi.2014.07.001
44. Ling X, Yan S, Liang C, et al. Application of mixed reality technology in the resection of benign lateral skull base tumors. J Shandong Univ. 2020;58:37–44.
45. Cabrilo I, Bijlenga P, Schaller K. Augmented reality in the surgery of cerebral aneurysms: a technical report. Neurosurgery. 2014;10(Suppl 2):252–261. doi:10.1227/NEU.0000000000000328
46. Cabrilo I, Bijlenga P, Schaller K. Augmented reality in the surgery of cerebral arteriovenous malformations: technique assessment and considerations. Acta Neurochir. 2014;156:1769–1774. doi:10.1007/s00701-014-2183-9
47. Li Y, Chen X, Wang N, et al. A wearable mixed-reality holographic computer for guiding external ventricular drain insertion at the bedside. J Neurosurg. 2018;1:1–8.
48. Bose R, Fitoussi A, Hersant B, et al. Intraoperative augmented reality with heads-up displays in maxillofacial surgery: a systemic review of the literature and a classification of relevant technologies. Int J Oral Maxillofac Surg. 2019;48(1):132–139. doi:10.1016/j.ijom.2018.09.010
49. Zhu M, Liu F, Chai G, et al. A novel augmented reality system for displaying inferior alveolar nerve bundles in maxillofacial surgery. Sci Rep. 2017;7:42365. doi:10.1038/srep42365
50. Cho KH, Papay FA, Yanof J, et al. Mixed reality and 3D printed models for planning and execution of face transplantation. Ann Surg. 2021;274(6):e1238–e1246. doi:10.1097/SLA.0000000000003794
51. Saito Y, Sugimoto M, Imura A, et al. Intraoperative 3D hologram support with mixed reality techniques in liver surgery. Ann Surg. 2020;271(1):e4–e7. doi:10.1097/SLA.0000000000003552
52. Abe Y, Sato S, Kato K, et al. A novel 3D guidance system using augmented reality for percutaneous vertebroplasty: technical note. J Neurosurg Spine. 2013;19(4):492–501. doi:10.3171/2013.7.SPINE12917
53. Molina CA, Phillips FM, Colman MW, et al. A cadaveric precision and accuracy analysis of augmented reality-mediated percutaneous pedicle implant insertion. J Neurosurg Spine. 2020;2020:1–9.
54. Molina CA, Sciubba DM, Greenberg JK, Khan M, Witham T. Clinical accuracy, technical precision, and workflow of the first in human use of an augmented-reality head-mounted display stereotactic navigation system for spine surgery. Oper Neurosurg. 2021;20(3):300–309. doi:10.1093/ons/opaa398
55. Li J, Zhang H, Li Q, et al. Treating lumbar fracture using the mixed reality technique. Biomed Res Int. 2021;2021:6620746.
56. Hiranaka T, Fujishiro T, Hida Y, et al. Augmented reality: the use of the PicoLinker smart glasses improves wire insertion under fluoroscopy. World J Orthop. 2017;8(12):891–894. doi:10.5312/wjo.v8.i12.891
57. Laguna B, Livingston K, Brar R, et al. Assessing the value of a novel augmented reality application for presurgical planning in adolescent elbow fractures. Front Virtual Real. 2020;1:19. doi:10.3389/frvir.2020.528810
58. Gregory T, Hurst SA, Moslemi A. Mixed reality assisted percutaneous scaphoid fixation. Tech Hand Up Extrem Surg. 2021;26(1):32–36. doi:10.1097/BTH.0000000000000353
59. Liu H, Auvinet E, Giles J, Rodriguez Y, Baena F. Augmented reality based navigation for computer assisted hip resurfacing: a proof of concept study. Ann Biomed Eng. 2018;46(10):1595–1605. doi:10.1007/s10439-018-2055-1
60. Kriechling P, Roner S, Liebmann F, et al. Augmented reality for base plate component placement in reverse total shoulder arthroplasty: a feasibility study. Arch Orthop Trauma Surg. 2021;141:1447–1453. doi:10.1007/s00402-020-03542-z
61. Molina CA, Dibble CF, Lo SL, Witham T, Sciubba DM. Augmented reality-mediated stereotactic navigation for execution of en bloc lumbar spondylectomy osteotomies. J Neurosurg Spine. 2021;2021:1–6.
62. Cho HS, Park YK, Gupta S, et al. Augmented reality in bone tumour resection: an experimental study. Bone Joint Res. 2017;6(3):137–143. doi:10.1302/2046-3758.63.BJR-2016-0289.R1
63. Cho HS, Park MS, Gupta S, et al. Can augmented reality be helpful in pelvic bone cancer surgery? An in vitro study. Clin Orthop Relat Res. 2018;476(9):1719–1725. doi:10.1007/s11999.0000000000000233
64. Marcus HJ, Pratt P, Hughes-Hallett A, et al. Comparative effectiveness and safety of image guidance systems in surgery: a preclinical randomised study. Lancet. 2015;385(Suppl 1):S64. doi:10.1016/S0140-6736(15)60379-8
65. Chen ZY, Liu YQ, He BW, et al. Application of ventricle puncture training system based on mixed Reality in medical education and training. Electron J Trauma Emerg. 2019;7:5–10.
66. Barber SR, Jain S, Son YJ, Chang EH. Virtual functional endoscopic sinus surgery simulation with 3d-printed models for mixed-reality nasal endoscopy. Otolaryngol Head Neck Surg. 2018;159:933–937. doi:10.1177/0194599818797586
67. Mitsuno D, Hirota Y, Akamatsu J, et al. Telementoring demonstration in craniofacial surgery with HoloLens, Skype, and three-layer facial models. J Craniofac Surg. 2019;30(1):28–32. doi:10.1097/SCS.0000000000004899
68. Martin G, Koizia L, Kooner A, et al. PanSurg collaborative use of the HoloLens2 mixed reality headset for protecting health care workers during the COVID-19 pandemic: prospective, observational evaluation. J Med Internet Res. 2020;22(8):e21486. doi:10.2196/21486
69. Li S, Qian P, Zhang X, Chen A. Research on image denoising and super-resolution reconstruction technology of multiscale-fusion images. Mobile Inform Syst. 2021. doi:10.1155/2021/5184688
70. Vovk A, Wild F, Guest W, Kuula T. Simulator sickness in augmented reality training using the Microsoft HoloLens.