See-through head-mounted displays (HMDs) allow the user of an Augmented Reality system to immerse themselves fully in the augmented world by observing the virtually extended scene from their natural perspective. They further support the hand-eye coordination that we use in our daily tasks and that we have trained and intuitively applied since infancy. In December 2012, I had the honor of interviewing Prof. Henry Fuchs, University of North Carolina at Chapel Hill, USA. Prof. Fuchs is one of the pioneers of Medical Augmented Reality. In the early nineties, he worked on a project whose objective was to display ultrasound data in situ, i.e. at the place where it was acquired. The final Augmented Reality scene was then shown on a head-mounted display (HMD). Many of the problems identified in the course of this project have still not been solved and remain the subject of ongoing research. The interview was held on December 20, 2012. (Images/screenshots were taken from videos accessible at http://www.cs.unc.edu/Research/us/web/quicktime.htm )
Prof. Henry Fuchs is the Federico Gil Distinguished Professor of Computer Science and Adjunct Professor of Biomedical Engineering at UNC Chapel Hill. He has been active in computer graphics since the early 1970s, with rendering algorithms (BSP Trees), hardware (Pixel-Planes and PixelFlow), virtual environments, tele-immersion systems and medical applications. He received a Ph.D. in 1975 from the University of Utah. He was a member of the faculty of the University of Texas at Dallas from 1975 to 1978. He joined the faculty at the University of North Carolina at Chapel Hill in 1978. He is a member of the National Academy of Engineering, a fellow of the American Academy of Arts and Sciences, the recipient of the 1992 ACM-SIGGRAPH Achievement Award, the 1992 Academic Award of the National Computer Graphics Association, and the 1997 Satava Award of the Medicine Meets Virtual Reality Conference.
(Biography taken from http://www.cs.unc.edu/~fuchs/)
Dear Prof. Fuchs, I recently found a video on YouTube from 1992 – “The Machine That Changed the World” – about the history of computer science. In that video, you were talking about HMDs. Can you describe the spirit of that time in computer science?
What was special about the early nineties was that the time had finally come when the subtechnologies had developed enough that there could be a breakthrough in total system solutions. In particular, interactive 3D graphics engines were finally fast enough. And they were cheap enough that many laboratories could afford them. Before the nineties, you had to have a really large lab to do graphics in real time. But by the early nineties, it was possible to do it with small machines, machines that a lot of people could afford. This, combined with other technologies such as tracking and small displays, made it possible to get what we now call Virtual Reality. Virtual Reality was then no longer just a research topic at one or two places in a country, but really began to flourish. An indication of this was that Virtual Reality found its way into the public consciousness.
Was there a particular person or group who pushed Virtual Reality at that time?
I think the person who should be most credited with this is really Jaron Lanier. In the early nineties, his little company came out with the first Virtual Reality system that you could just buy, turn on, and use. Before that, you had to build it yourself. When he started to talk about this, the general press got hold of it and there was a lot of interest in the topic. I remember that we got many more calls from the press in the early nineties than we had before, even though we had been working in this area since the 1970s. I thought that the nineties would see a breakthrough in Virtual Reality. But I think there was an overenthusiasm for it, and by around 2000 people became somewhat disenchanted that it did not deliver what they had expected.
What was the missing part so that the breakthrough did not happen?
Different people might have different opinions about this. In my estimation, there were several missing parts. The most obvious was the lack of an effective and comfortable head-mounted display. It was possible to do the other things reasonably well. The graphics, I think, were fine. The head tracking was sort of OK. Interaction and model generation were sort of OK as well. But there was nothing really good that you could wear. So the head-mounted display was the major thing that was lacking, and I think it is still lacking right now. That is why Virtual Reality, and in particular Augmented Reality, has not taken off. I’m hoping that Google’s forthcoming Project Glass will start a renaissance in wearable displays.
Around 1992 you published a few papers about an Augmented Reality system designed for ultrasound-supported patient examination. What inspired you to start a research project on this topic?
My wife was having our first child in 1989. And it was so striking to me that the physician, who was an excellent physician, was looking to the side at a TV screen while he was scanning her belly. Since I had been interested in head-mounted displays since 1969, it was just obvious to me that he should be looking underneath the ultrasound probe. That’s where the data is acquired. I remember telling her that we probably wouldn’t be able to have anything ready for her at this pregnancy, but that maybe for another child it might be ready. That was then 1991, and it was not ready even then.
Did you develop the system together with surgeons?
Yes, initially we worked with an obstetrician and gynecologist. Then we worked with radiologists and four to five different doctors on different applications of this system, ranging all the way from obstetrics to breast biopsy to liver biopsy.
Why is it so important to have surgeons on the team when developing an advanced medical device?
What is important is to get an understanding of what the physician actually needs and what the medical possibilities are. It’s not that physicians need to be around in every meeting, but I think we need to be close enough to them that, as we develop technology, we know we are heading in the right direction and not solving a mere abstraction of a problem. Their presence helps a lot in not losing track of the original conception of what the problem is. Occasionally, on the way to a solution, we manage, one way or the other, to solve the physician’s problem. And they go away happy with that solution while we are still working on some aspect of the technology that fascinates us for another reason.
What were the major challenges during the development of that ultrasound-based Augmented Reality system?
From my recollection, all problems were technical. Various parts of the technology were not working well. A lot had to do with the visualization of the ultrasound data. Unfortunately, we could not see the ultrasound data in enough detail to be able to make medical decisions. That was the biggest problem.
At one point, we decided to apply for research funding to develop our own HMD, because all the ones we could find were so rudimentary. So we spent some time developing what we thought would be an adequate see-through HMD. Here, we were in close collaboration with our physician, trying different prototypes of the HMD for training the acquisition of ultrasound images of the human breast. There were these silicone breast models, either solid or water-filled. They are used for training physicians to do ultrasound examinations and to accurately take samples of breast tumors. The AR system, as I recall, worked well even with those training phantoms. Then we started testing it on living patients, which took a lot of preparation to get permission for clinical trials. We had to have a place that could be kept sterile, that was a private room, and that was next to our lab. We needed an assistant for the physician. We had to have emergency people in place in case the patient had some kind of allergic reaction or something else went wrong. So there was a lot to do. When everything was arranged for the first patient, I remember that this physician, who was very experienced, tried the HMD with the first human patient. She said: “Everything is fine, but – you know, I don’t see the tumor well with this.”
We were all trying it then, right there. The ultrasound image taken with the same ultrasound scanner that she had used with the breast models before was just not as good. The distinction between normal breast tissue and tumor was much vaguer than it had been in the training phantoms. In retrospect, I should have checked that before… But we didn’t. So this failure set us back a lot. I felt so bad about it. I remember not wanting to go to the funding agency for another round of money – which was a real mistake! The problem was that we had gotten to the point of an actual patient study, we did not succeed, and I could not then ask for more money just to try to do the same thing again…. It was a real setback. But there was nothing better that we could find at that time.
What alternative medical applications of HMDs and Augmented Reality in general can you think of?
Training, of course, is a very broad field, ranging from medical students to continuing medical education and recertification. I think there are a lot of possibilities, if we get a reasonable optical see-through HMD in the next few years. There are lots of possibilities for training even in the basic sciences, in anatomy, in various beginning and more advanced surgical procedures, and especially for advanced training in procedures so specialized and cases so unusual that physicians won’t see them very often. I’m personally interested in telepresence, so I think remote medical consultation might be useful. This is one of the things that we will get to with our current project “BeingThere”. “BeingThere” is basically a collaborative research center with three sites: NTU Singapore, ETH Zürich, and UNC Chapel Hill. There are five projects within the center right now.
So you stopped working on Medical Augmented Reality topics for the moment?
The reason I currently hesitate to work on Augmented Reality is that I think the HMDs are such an impediment. But I want to get back into it. There is not only Google’s Project Glass; there are several new designs which are pretty good. I’m particularly excited by the one from Lumus, but I have not received it yet. I saw this technology in a research lab about 12 years ago, and they have now started a spin-off company. It is a waveguide optical see-through HMD, and its optical path is perfectly clear.
Is the HMD for you the main interface for Augmented Reality?
Not necessarily. We tried other things, but they are so much less exciting for me. With them, you do not have the same hand-eye coordination. There is a spin-off company of my lab (http://www.inneroptic.com/). They are not using HMDs because these devices are still not good enough; instead they use standard large-format stereo displays. And that’s probably the best that can be done at the moment.
Do you see HMDs being used in operating theatres, say in 10 years?
Yes, I think so. There is a good chance that by then there will be much more sophistication in guiding light through displays; Lumus is perhaps the first example of it. These new devices might also solve the problem of lacking hand-eye coordination that you have now with alternative AR system setups. People could make excuses here and say that when you drive a car you don’t pay attention to the steering wheel but to the road, and that’s essentially what happens now with AR: medical people move their hands near the patient and look at a display in a different part of the room. But in most of the manual tasks of our daily life, we expect to see the things we are interacting with right where our hands are. That’s what we were built for.
I think there are plenty of inspirations for this kind of thing. Think about the initial ideas of personal computers, at least as they were introduced to me by Alan Kay. In about 1969, Alan started talking about a Dynabook, a personal computer the size of a book. The first one of these was called the “Interim Dynabook”, which Kay and a bunch of people built. That first one in fact wasn’t a laptop at all; it was rather what we now call a desktop. So we had a whole generation of Macintoshes, Windows PCs, and others, which were basically desktop machines. But as soon as the technology became advanced enough that you could put all the capability into a laptop, people of course wanted a laptop, because it is much better for personal computing to take the computer with you. So, as far as I know, most people needing a personal machine don’t buy desktops anymore; they buy laptops. Think of displays in the same way. If you could have a display in your eyeglasses as a personal display, you would not have one that is large and sits on the other side of the table. Well, for some specialized reasons the large display will still be preferred, e.g. when you want to share displayed content with others, just as people still buy desktops today for specialized reasons. But for most general applications you buy laptops, and I think the same thing is going to be true for HMDs. I have this iPhone that sits in my pocket all the time, and I take it out just to look at the time. Why can’t this information be displayed on my eyeglasses, since I wear my eyeglasses all day? If you want a personal display, you want it in your eyeglasses.
Thanks a lot for your time!