Sensing the 3D geometry of a minimally invasive operation site through an endoscope camera is a difficult task. Relying on color information from the camera alone, combined with automatic or semi-automatic identification and tracking of natural or artificial landmarks inside the body, is hard and error-prone.
Sven Haase of the Pattern Recognition Lab, Department of Computer Science, Friedrich-Alexander-Universität Erlangen-Nürnberg recently presented a time-efficient way to prepare sensor fusion of a Time-of-Flight (ToF) and an RGB sensor at BVM 2012 – Bildverarbeitung für die Medizin.
In addition to the color information delivered by an RGB sensor, ToF cameras are able to measure depth information of a scene: for every pixel, the ToF camera (or accompanying software) provides, besides a grey value, a radial distance to the surface viewed by the camera.
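To make the per-pixel radial distances usable as 3D geometry, they have to be converted into Cartesian points. A minimal sketch of this conversion, assuming a simple pinhole camera model with hypothetical intrinsics (focal lengths `fx`, `fy` and principal point `cx`, `cy` — these are illustrative parameters, not values from the presented system):

```python
import numpy as np

def radial_to_points(radial, fx, fy, cx, cy):
    """Convert a ToF image of radial distances into a (H, W, 3) point cloud.

    Each pixel's viewing ray is computed from the pinhole model, normalized,
    and scaled by the measured radial distance (the distance along the ray,
    not the perpendicular depth).
    """
    h, w = radial.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Per-pixel ray directions in camera coordinates
    dirs = np.stack([(u - cx) / fx, (v - cy) / fy, np.ones((h, w))], axis=-1)
    # Normalize so that scaling by the radial distance lands on the surface
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    return dirs * radial[..., None]

# Example: a flat 3x3 distance image, principal point at the center pixel
points = radial_to_points(np.full((3, 3), 2.0), fx=100.0, fy=100.0, cx=1.0, cy=1.0)
```

The pixel at the principal point maps straight down the optical axis to (0, 0, r); off-center pixels end up with a slightly smaller z, since the radial distance is measured along the (longer) oblique ray.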
According to Sven Haase, "the system is still in a prototypical stage and a subject of fundamental research". However, the researchers plan to equip endoscopes with this system to enhance visual information for improved intra-operative decision making. Thanks to ToF cameras, the surgeon would have access not only to a simple color projection of the camera image but also to views of the 3D topology of the operation site. The resulting 3D elevation profile can then be combined with color information from the endoscope camera and viewed from different perspectives. A screenshot of an experimental scene imitating an operation site is shown in the following picture. Here, the color image information is acquired by the endoscope camera. The image on the bottom right shows the result of Sven Haase's system after processing the raw data of the ToF camera in order to remove noise and improve image quality.
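Raw ToF distance images are notoriously noisy, so some preprocessing step is needed before the geometry is usable. The article does not detail Haase's processing pipeline; as an illustration only, the sketch below shows one common denoising technique for depth images, a simple median filter, which suppresses isolated outlier pixels while preserving depth edges:

```python
import numpy as np

def median_denoise(depth, k=3):
    """Apply a k x k median filter to a 2D depth image.

    A plain-Python/NumPy sketch of a standard denoising step for ToF data;
    real systems would use an optimized library routine instead.
    """
    pad = k // 2
    # Replicate border values so edge pixels get a full window
    padded = np.pad(depth, pad, mode="edge")
    out = np.empty_like(depth)
    h, w = depth.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

# Example: a flat surface with one noisy spike
depth = np.ones((5, 5))
depth[2, 2] = 10.0          # single outlier, as ToF sensors often produce
clean = median_denoise(depth)
```

In the example, the lone spike is outvoted by its eight neighbors inside the 3x3 window, so the filtered image is flat again.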