During the demo session at ISMAR 2011 in Basel, a Medical Augmented Reality system was presented by developers of Space Applications Services N.V./S.A. The objective of the Computer Aided Medical Diagnostics and Surgery System (CAMDASS) project is to develop ultrasound augmentation to be used by astronauts on long-term spaceflights. We are happy to present an interview with Dr. Keshav Chintamani, who is a member of the Space Applications Services project team together with Tangi Meyer, Yashodhan Nevatia and Michel Ilzkovitz. The ESA project officer is Arnaud Runge. The interview was held on December 12, 2011.
Dr. Keshav Chintamani obtained his PhD from Wayne State University, Detroit MI in 2010. Keshav has a background in robotics, human factors/ergonomics, user-interface design (VR/AR) and systems engineering. He has participated in robotics and Augmented Reality (AR) technology projects in the USA as well as in astronaut support technology projects such as CAMDASS and FITS for ESA. He is also experienced in human-computer and human-robot interaction design for space systems, including robotic control interfaces and wearable AR/VR interfaces for medical applications. He participated as a lead developer on a NASA project developing AR/VR interfaces for International Space Station (ISS) robotic operations and conducted extensive human factors evaluations within that project. In his previous work, he participated in surgical robotics projects involving the development of interface technologies for laparoscopic surgery.
How did you get in touch with Augmented Reality?
During my dissertation, my professor was running a NASA project on Augmented Reality support for space station manipulators. The project developed Augmented Reality interfaces to improve the remote control of robotic arms. There, you can see the robot arm only through cameras, and you run into a classical problem also known from laparoscopy: you have to deal with visual–motor misalignments. So, I got interested in the Augmented Reality/human factors aspect at that point. Later, when I was looking for a job, Space Applications Services N.V./S.A. hired me because they were running a number of Augmented Reality projects with a very similar theme to my previous work.
How is your team organized?
Space Applications Services N.V./S.A. is an SME (small and medium-sized enterprise) in Belgium working on a range of technologies. Our main customers are from the space industry, and we mainly work with ESA. We also work on European Commission projects and with other clients such as Astrium and Thales. The company is basically divided into five main groups. I’m part of the “Systems & Ground Segment” group. Under this group, we also have a robotics division and a Virtual Reality/Augmented Reality division. Because of my background, I work with these two divisions. The Systems & Ground Segment team currently has 15 members. In addition, we have a software group, a knowledge management group, a space operations group and an avionics systems group.
When did the CAMDASS project start?
CAMDASS stands for Computer Aided Medical Diagnostics and Surgery System. In 2008/2009, ESA became very interested in using Augmented Reality technologies for space applications that can provide support with medical issues on the International Space Station. The project started in 2009/2010, and we began to develop the ground prototype.
Why do space stations such as ISS need to be equipped with high-tech medical devices?
We need to take a look at the background of space missions in order to better understand the objectives of the project. Today, astronauts on the ISS spend extended periods of time in space. The duration of the astronauts’ stay in space has increased, and one working period on the ISS can take up to six months. There is usually no medical expert on board, but there is the option to contact ground control to get support from a medical expert in case of emergencies. Astronauts spend a lot of time in microgravity, which has side effects such as muscle mass degradation and reduction of bone density. So, there are two major aspects. The first is that sometimes no medical crew officer is available to treat injuries or diseases of crewmembers, and second, there is a need for physiological measurements in space to explore how humans perform in microgravity.
Why is there no astronaut with a medical background on board, e.g. on the ISS?
Well, crewmembers are trained to perform basic medical procedures, but you cannot expect them to be skilled in treating a wide range of medical conditions, for example internal bleeding, or in performing laparoscopic surgery. They usually have other priorities, such as experiments to run and many other flight-related tasks.
Why did you choose ultrasound as the imaging modality for CAMDASS?
The ISS is already equipped with an ultrasound device. Ultrasound is an imaging modality allowing quick and low-cost diagnostic procedures that can be space-qualified. For most of the other imaging modalities, such as CT, MRI or standard x-ray, space qualification is a major challenge. On a space station, ultrasound can be used first as a diagnostic tool but also as a measurement device, as I mentioned before.
What are typical emergency situations on a space station in which the CAMDASS system would be used?
Trauma is one of the first applications for the system on the space station. Astronauts have to deal with heavy objects. In microgravity you can push a heavy object with just your hand, but that object still has momentum. So, the object can still hit you and easily break your hand.
Why did you decide to enhance such medical devices in space working environments with Augmented Reality technology?
Augmented Reality techniques have progressed rapidly in the last few years, and ESA was very interested in addressing some of the challenges, identifying the needs for such systems and developing some prototypes. This is how we got involved with our Augmented Reality expertise. There were a couple of Augmented Reality systems that we developed in parallel, such as the WEAR project. WEAR was an Augmented Reality system tested on the ISS in 2009 by the Belgian astronaut Frank De Winne. The WEAR system was targeted at inspection tasks, e.g. maintenance procedures. Here, the idea was to provide a system that allows users to perform tasks autonomously and independently of experts at ground control. The ISS is a nice test bed to start working on such platforms and to start preparing the system for long-duration space missions. Such missions require a more autonomous working system, since contact with medical experts at ground control can suffer from communication time delays. This is where Augmented Reality plays a major role in the CAMDASS system.
What does the user see in terms of Augmentation when looking through the head mounted display (HMD)?
The system shows simple but informative 3D graphics. With respect to muscle measurement when assessing muscle degeneration in microgravity, the CAMDASS system helps the user guide the ultrasound probe to certain locations on the patient’s body to acquire ultrasound images. For this reason, the system uses information from an internal database hosting reference images and positions to show the user what the image should look like and where it should be taken. There are different intuitive virtual cues that support the user in guiding the probe to the correct location. The user always gets visual feedback on what he is doing to help him position the probe. Please also check our posted videos to get a better picture of the Augmented Reality scene shown to the user.
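The guidance loop described here, comparing the tracked probe pose against a reference pose from the database and displaying the remaining offset as a cue, can be sketched roughly as follows. This is a minimal illustration, not CAMDASS code: the function name, the pose representation (position plus probe-axis direction) and the tolerances are all assumptions.

```python
import numpy as np

def probe_guidance_cue(probe_pos, probe_dir, target_pos, target_dir,
                       pos_tol_mm=5.0, angle_tol_deg=5.0):
    """Compare a tracked probe pose against a reference pose and return
    the offsets that a visual guidance cue would display to the user."""
    offset = target_pos - probe_pos          # translation still needed (mm)
    dist = float(np.linalg.norm(offset))
    # Angle between the current and the reference probe axis
    cos_a = np.dot(probe_dir, target_dir) / (
        np.linalg.norm(probe_dir) * np.linalg.norm(target_dir))
    angle = float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))
    aligned = dist <= pos_tol_mm and angle <= angle_tol_deg
    return {"offset_mm": offset, "distance_mm": dist,
            "angle_deg": angle, "aligned": aligned}
```

A renderer would then draw, for instance, an arrow along `offset_mm` and change the cue's color once `aligned` is true, which matches the continuous visual feedback described above.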
Please tell us more about your hands-free interaction approach in the system!
You don’t want the user to interact with secondary devices such as a mouse or keyboard. You want to keep the user as functional as possible. The user has to deal with wearing the head mounted display (HMD), manipulating the ultrasound device and following additional procedures to make the system work, such as calibrating the system to the patient. All this can increase workload. We tried to keep usability very high, and this is why we chose voice as the main interaction interface.
Did you evaluate the system with a user study?
Yes, we did a high-level usability evaluation. Here, we got some interesting results. We had participants who had never used an ultrasound device or a head mounted display (HMD) before, but they got used to the system after a very short time, i.e. within about 15–20 minutes. After this, they were able to perform such procedures by themselves. Given this, it should not be difficult for astronauts to work with similar technology in a very autonomous way on long-term space journeys.
What are the next steps and where do you need support?
One problem we faced during the project was related to the integration of an ultrasound system. It was very difficult to find manufacturers who were ready to give us an open application programming interface (API) to their devices. The current CAMDASS solution works with a video grabber interface. However, working with a video grabber causes problems, e.g. reduced resolution, that could easily be overcome with an ultrasound device and a corresponding API. Finding a manufacturer to work with us on this issue is therefore one of our priorities. There were also challenges to solve regarding the tracking system, which tracks the patient, the ultrasound probe and the head mounted display (HMD). The NDI Polaris system we currently use is very nice in terms of accuracy; however, in our scenario it has line-of-sight problems. The available head mounted displays (HMDs) are still in a technical state where they can easily be disqualified for medical applications. Weight and portability are major issues. There are additional challenges in terms of space qualification of these devices.
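One common way to soften the line-of-sight problem mentioned above is to coast on the last valid pose for a short grace period when the optical tracker briefly loses a marker, and only warn the user when the dropout persists. The following sketch assumes this approach; the class, the grace period and the pose format are illustrative, not part of the Polaris API or of CAMDASS.

```python
class PoseFilter:
    """Hold the last valid pose briefly when an optical tracker
    loses line of sight; flag a real tracking loss afterwards."""

    def __init__(self, grace_frames=10):
        self.grace_frames = grace_frames  # assumed dropout tolerance
        self.last_pose = None
        self.missed = 0

    def update(self, pose):
        """pose is the tracker reading for one frame, or None on dropout.
        Returns (pose_to_use, tracking_ok)."""
        if pose is not None:
            self.last_pose = pose
            self.missed = 0
            return pose, True
        self.missed += 1
        if self.last_pose is not None and self.missed <= self.grace_frames:
            return self.last_pose, True   # coast on the last known pose
        return None, False                # persistent loss: warn the user
```

This keeps the augmentation stable through momentary occlusions (a hand passing between camera and marker) while still surfacing genuine line-of-sight failures.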
How was the demo of your AR system at ISMAR?
Yashodhan Nevatia, who was a developer on the project, took the entire working CAMDASS prototype to ISMAR in Basel and presented the system with demos on a phantom head. He showed how the system can guide users in performing a retinal diameter measurement using ultrasound.
When do you expect the CAMDASS system to be used in space?
This strongly depends on the priorities and interests of ESA. Flights and space missions have long planning phases. But we hope that we can install the system as soon as possible!
Thank you for the interview!