Sony Releases Head-Mounted Display for Endoscopic Applications

Minimally invasive procedures have shown many advantages over open surgery. The small incisions used to insert instruments and access the surgical site cause fewer lesions and therefore less pain for the patient. Consequently, recovery times are shorter and cosmetic results are better. In addition, the surgeon often gets a better direct view of deep-seated surgical sites, i.e. views are less obstructed by non-relevant tissue and blood than in conventional, open surgery.

However, the workspace setup and instrumentation used in endoscopic surgery have largely been adopted from open surgery, which can result in ergonomic problems for the surgeon. For instance, Park et al. (Patients benefit while surgeons suffer: An impending epidemic. Journal of the American College of Surgeons, 2009) report that 86.9 % of surgeons performing laparoscopy on a daily basis experience physical symptoms and discomfort, which the authors attribute to unsophisticated instrument interface design as well as unfavorable display position and technology. Sony is one of the technology leaders now addressing this problem with a new product.

Recently, Sony Corporation announced the release of a “Head-Mounted Display for Endoscopic Surgery” on August 1, 2013. Currently, the device is only available on the Japanese market. It is intended to serve as the main display for presenting endoscopic images generated in minimally invasive surgical procedures. The endoscopic video data is shown on two OLED (organic light-emitting diode) panels (1280 × 720 pixels), one for each eye of the wearer of the head-mounted monitor. According to Sony’s press release, the use of OLED panels allows for “superb reproduction of blacks, excellent video image response times, and precise color reproduction”.

Because the head-mounted monitor contains two separate panels, images from stereoscopic endoscope cameras can also be projected into the surgeon’s optical path, enhancing 3D perception of the anatomical situation at the operative site. Currently, two head-mounted monitors can be used simultaneously to support collaborative work. The head-worn device allows for a much better ergonomic working position: the surgeon stands upright and turns his or her head to the most comfortable direction while keeping the most important information source, the endoscopic images, right in front of the eyes.

While the head-mounted monitor does not look too heavy (“Approx. 490g”) or too bulky (approx. 191 mm × 173 mm × 271 mm), and appears comfortable enough to be worn for longer periods during surgery, the device still seems to unfavorably block the surgeon’s visual perception of his or her physical environment. Considering that the surgeon cannot simply remove the device by hand without risking contamination around the operating table, there may be room to improve the device’s integration as a mobile and flexible piece of OR infrastructure. It may also be that the current press release simply does not address this issue, and that a solution for the “exit strategy”, i.e. quickly removing the device from the working arena and the surgeon’s head, for instance in emergency cases, is already part of the product. Related to this, Sony mentions that surgeons retain a small but limited direct view of the patient thanks to a gap at the bottom of the head-mounted monitor. This view might suffice to interact with surgical assistants and insert instruments through ports into the patient.

According to Sony, the price of a package combining one image processing unit feeding one head-mounted monitor with endoscopic images will be around JPY 1,500,000 (approx. EUR 11,500).

The device is not yet capable of presenting the endoscopic video data in combination with virtual 3D information registered to the 3D anatomy, such as data from preoperative planning or 3D imaging. For this reason, it is not an Augmented Reality device, at least not yet. A ‘Picture in Picture’ mode is already available that allows a secondary 2D image to be superimposed onto the real-time imaging data captured by the endoscope camera. The step toward using real-time image analysis or external tracking data, e.g. to compute a depth map of the scene recorded by the stereo endoscopes and/or to project planning data or imaging data such as US, MRI or CT into the image, is not too big. We are looking forward to future releases of this product that might come with Augmented Reality features.