Augmenting the views of endoscopic cameras with virtual information has a major advantage: the development of a surgical Augmented Reality system can start from medical devices that are already certified and used on a daily basis in today’s operating theatres as standard tools in many minimally invasive applications.
The most important remaining tasks for developers of augmented endoscopes then address the integration of a tracking system that accurately and robustly locates the endoscope, the patient and, optionally, additional minimally invasive instruments within the operating area; the calibration of the intrinsic parameters of the endoscope camera optics, so that virtual objects are correctly positioned in the real world captured by the camera (usually a one-time task); and the design of the augmented scene so that it is useful for the designated surgical application.
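Once the intrinsic parameters are known, positioning a virtual object in the camera image reduces to projecting its 3D coordinates through the standard pinhole camera model. The following minimal sketch illustrates this step; the function name and all parameter values are hypothetical placeholders, not part of any actual navigation system, and lens distortion is ignored for brevity.

```python
# Illustrative pinhole projection: map a 3D point given in camera
# coordinates to pixel coordinates using the intrinsic parameters
# (focal lengths fx, fy and principal point cx, cy) obtained from a
# one-time calibration. All values below are hypothetical.

def project_point(p_cam, fx, fy, cx, cy):
    """Project a 3D point (camera frame, meters) to pixel coordinates."""
    x, y, z = p_cam
    if z <= 0:
        raise ValueError("point must lie in front of the camera")
    u = fx * x / z + cx  # perspective division, then focal scaling and shift
    v = fy * y / z + cy
    return u, v

# Hypothetical intrinsics for a 1280x720 endoscope image
fx, fy, cx, cy = 800.0, 800.0, 640.0, 360.0
u, v = project_point((0.01, -0.02, 0.10), fx, fy, cx, cy)
print(u, v)  # prints 720.0 200.0
```

Because the endoscope optics do not change between surgeries, this calibration and the resulting parameters can be reused, which is why it is usually a one-time task.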
Scopis GmbH is a spin-off company from a research cooperation between Charité Berlin and Fraunhofer IPK, hosted by the Berliner Zentrum für Mechatronische Medizintechnik (BZMM). The main product of Scopis GmbH is an endoscopic navigation system using Augmented Reality as a core technology. According to Dr. Christopher Özbek (CTO Software), the Scopis® Navigation system has succeeded in solving one of the most cumbersome issues of surgical navigation systems: the usually time-consuming intraoperative calibration. The calibration guarantees accurate tracking of the patient and the instrumentation and, as a consequence, accurate registration of the visualization of imaging data and navigational information with the real anatomy.
Dr. Christopher Özbek emphasizes: “In contrast to our competitors, with our system we are able to calibrate the system within only 20 seconds to take advantage of Augmented Reality visualization.” Dr. Özbek further explains: “The most striking change for our Scopis Navigation System in contrast to classic navigation is the large increase in navigation availability. In contrast to pointer-based conventional navigated surgery with an average of 48 pointer uses in a two-hour surgery (totaling on average less than 10 minutes of navigation in a two-hour surgery), we see increased navigation availability using our endoscopic tracking and Augmented Reality technology. In typical FESS (Functional Endoscopic Sinus Surgery), the availability of the endoscopic navigation was always above 74% of surgery time in a study on 12 cases. Pointer use drops to 15 uses on average, saving the time needed to perform instrument changes.”
One exemplary application of their navigation system Scopis® Navigation is the marking and visualization of anatomical spots in the endoscopic view.
These spots can be defined in a planning phase and then used intraoperatively to help the surgeon target the operating site, such as a tumor or entry points to cells or cavities. In addition, the system supports the surgeon in navigating around risk structures such as arteries or nerves.
In addition to these virtual landmarks, the system allows superimposing a virtual guidance aid onto the endoscopic video images for moving the instrumentation along the planned trajectory to the operating site. Here, the trajectory is visualized as a wireframe cylinder around the planned path, while color-coded rings of this cylinder and a displayed number provide information about the distance of the instrument to the operating site.
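The core of such a guidance aid can be sketched in a few lines: compute the distance from the tracked instrument tip to the planned target and map it to a ring color. The sketch below is purely illustrative and is not Scopis’ actual rendering code; the color scheme and distance thresholds are hypothetical assumptions.

```python
# Illustrative guidance-aid logic (not the actual Scopis implementation):
# map the instrument-tip-to-target distance onto a traffic-light color for
# the rings of the guidance cylinder. Thresholds are hypothetical.

def tip_to_target_distance(tip, target):
    """Euclidean distance between tracked tip and planned target (mm)."""
    return sum((a - b) ** 2 for a, b in zip(tip, target)) ** 0.5

def ring_color(distance_mm):
    """Return an RGB color for a guidance ring given the distance in mm."""
    if distance_mm > 20.0:
        return (255, 0, 0)    # red: still far from the operating site
    if distance_mm > 5.0:
        return (255, 255, 0)  # yellow: approaching
    return (0, 255, 0)        # green: close to the target

d = tip_to_target_distance((10.0, 0.0, 0.0), (0.0, 0.0, 0.0))
print(ring_color(d), d)  # prints (255, 255, 0) 10.0
```

The displayed number mentioned above would simply be this distance, updated on every tracking frame.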
Dr. Christopher Özbek points out two major advantages of using Augmented Reality in surgical applications: “First, the presentation of depth and position of objects is much more intuitive than in standard axial slice views. Second, the augmented endoscope provides navigational information for 90% of the operation time, in contrast to classic navigation systems, which show such information only 5–20% of the time.”
One of the most important tasks in the development of surgical navigation systems is the design of a user interface that allows the surgeon to control the navigation system during surgery. Here, one needs to consider keeping equipment and surgical staff sterile in the area around the patient. Many navigation systems require a trained assistant to control the system, e.g. via classic mouse and keyboard interaction. This involves additional communication between the surgeon and the assistant, causing time delays due to possible misinterpretation (and maybe even stress and frustration). At best, the surgeon has the option to control the system autonomously.
Dr. Christopher Özbek describes their approach to interaction with an exemplary workflow using Scopis® Navigation: “Most of the user interaction with the Scopis® Navigation system is performed using gesture commands with the sterile navigated instruments. For instance, if the surgeon wants to verify the accuracy of a navigated instrument, the surgeon puts the tip of the instrument into a calibration cone of the sterile patient head tracker. The system will detect the position of the instrument, verify its accuracy and open a wizard to show the measured results. If the surgeon removes the tool, the dialog closes. No handling of the non-sterile monitor or base navigation system is required. For interactions with the system that are not covered by gesture commands, the surgeon must either ask a nurse or assistant or use a sterile instrument to operate the touch screen.”
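The trigger behind such a gesture command can be understood as a simple geometric test: the system checks on every tracking frame whether the tracked instrument tip lies inside the calibration cone. The following sketch shows one plausible way to implement that test; the geometry parameters are hypothetical and this is not Scopis’ actual detection code.

```python
# Illustrative gesture trigger (assumed logic, not Scopis' implementation):
# test whether the tracked instrument tip lies inside a cone defined by its
# apex, axis direction, half opening angle, and height. All values in mm.
import math

def tip_in_cone(tip, apex, axis, half_angle_deg, height_mm):
    """Return True if the tip position lies inside the calibration cone."""
    v = [t - a for t, a in zip(tip, apex)]          # apex -> tip vector
    axis_len = math.sqrt(sum(a * a for a in axis))
    u = [a / axis_len for a in axis]                # unit axis direction
    proj = sum(vi * ui for vi, ui in zip(v, u))     # distance along the axis
    if proj <= 0 or proj > height_mm:
        return False                                # behind apex or too deep
    dist = math.sqrt(sum(vi * vi for vi in v))
    angle = math.degrees(math.acos(min(1.0, proj / dist)))
    return angle <= half_angle_deg                  # within the opening angle

# Cone at the origin, opening along +z, 30 degree half angle, 50 mm deep
print(tip_in_cone((0.0, 0.0, 10.0), (0.0, 0.0, 0.0),
                  (0.0, 0.0, 1.0), 30.0, 50.0))  # prints True
```

Once this test returns True for a short, stable period, the system could open the verification wizard; when it returns False again, the dialog closes, so the surgeon never touches a non-sterile control.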
Up to now, more than 30 Scopis® Navigation systems have been sold; they are mostly used in ENT (ear, nose, and throat) surgery. Applications in CMF (cranio-maxillofacial) surgery and neurosurgery are becoming more and more interesting for the users of the system. No measurements of patient outcome are available yet. However, Dr. Christopher Özbek expects “better or best-in-class results in comparison to other navigation systems”.