Sharing My Experience With the HoloLens

Yesterday, I finally got the chance to test the optical see-through Augmented Reality head-mounted display Microsoft HoloLens at the open day event of the Navigated Augmented Reality Visualization System (NARVIS) Lab, a research laboratory of the Chair for Computer Aided Medical Procedures & Augmented Reality, TUM, and the Klinik für Allgemeine, Unfall-, Hand- und Plastische Chirurgie, LMU, in Munich, Germany.


I have to say that my expectations were not too high. First, because I knew that it is an optical see-through device, i.e. you see the real environment through semi-transparent glasses while the virtual content is projected onto this glass layer. This does not allow for seamless integration of virtual objects into the real environment and reduces the sense of immersion in the Augmented Reality scene. Second, because I had been really disappointed by Google Glass.

But I have to say, the live test with the HoloLens definitely made me happy. The HoloLens is lightweight, looks solid, and is also quite stylish. It is more comfortable than any other stereo head-worn AR device that I have tried before (mostly prototypes in a research phase, of course), even when worn for a longer period of time.

The virtual 3D objects are indeed semi-transparent, so the experience differs from the demo videos, which present completely opaque objects.


But the robustness of the registration of virtual objects is really great. A virtual object stays at its designated position in space without any jittering, and there is no noticeable time lag that would cause a delay and make the object swim in space. This may of course change once more complex scenes need to be rendered.

The effect of semi-transparency that I had in mind before the test gradually lost its presence; actually, I did not notice it anymore after a while. I need to do further testing on this once demos are available that allow looking through real objects, e.g. through a wall or through the skin of a patient.

Then I noticed that I could use my fingers and hands to interact with the AR scene using gestures. With a certain gesture, the system showed the reconstructed virtual geometry of my surrounding environment, making the power of the device's sensors and the huge amount of spatial information it collects in real time impressively visible.

I’m really looking forward to seeing this device get into the hands of all those creative people out there. The possibilities are endless. This is finally the very first device that allows the creation of AR worlds presented from the natural perspective of the user, not just for a small group of researchers but for all of us.

Let’s see what Magic Leap is going to introduce soon. Hopefully they are playing in the same league as the HoloLens.