In my view (others may see this differently), the full immersion of a user into an Augmented Reality world requires two levels of user interfaces. The first is designed along with the hardware that technically enables the fusion of virtual and real objects. The second level allows the user to interact with this composed world. Control devices of this second level can themselves become part of the Augmented Reality scene, either as virtual or as real input devices. The design of both UIs needs to be driven by the required interactive tasks, the workflow, and the working environment of the intended application.
When Augmented Reality supports surgical applications, the designers of these user interfaces need to take sterility into account. Around the surgeon lies the sterile area of the operating theatre, and anything inside or entering this area needs to remain sterile. The introduction of a navigation system that is partially outside this sterile area, e.g. the tracking cameras or the computer running the navigation software, becomes a severe problem when the surgeon is supposed to take full control of the system. Pressing a button on the PC's keyboard to change a certain view of the navigation, for example, can be an impossible task for the operating surgeon, who would need to ask a nurse working in the non-sterile area of the operating theatre to make that adjustment. When the system allows for more complex configurations, frustration due to communication problems becomes inevitable: “Press that button! NO! The other one!”, “The left one?”, “Right!”, “Now, ok?”, “NO, I meant the right button!”
Researchers have introduced pedals, voice recognition, touch interfaces, gesture tracking (e.g. with the Kinect) and more, but each of these interface types has its drawbacks. In most cases they require space, or the hands or voice of the surgeon, which are usually already occupied by other tasks.
Tomas Brusell, dental surgeon and CEO of Brusell Communications, is developing an input device that belongs to this second level of user interfaces once it becomes part of an Augmented Reality application.
Tomas, can you describe your input device?
Lipit is a patented, truly hands-free, voice-free and mobile input device and would fit the bill for Health Tech biosafety improvements. It’s an extra-oral mini touchpad, touched by the lip, that gives full control of ICT devices.
Tell me more about your ideas on how to apply it in clinical scenarios and in combination with Augmented Reality!
I know what I can do with Lipit, behind the face mask, in the rather safe dental environment, and I have a vivid vision of what could make Lipit a preferred UI in emergency situations. Post mortem, the surface of an Ebola patient’s bed is high-risk material, and touching anything in the vicinity of an infected person is a question of high-alert mobility planning. In combination with emerging Augmented Reality devices, Lipit will make information and communication more intuitively accessible. Many apps are conceivable for Augmented Reality goggle technologies, but input is obviously a hassle. With our device there is no need to finger a touchpad or screen, and no embarrassing gestures or foolish talking to the machine, to achieve 100% biosafety. Regarding the treatment of Ebola patients, with Lipit combined with Augmented Reality, e.g. transparent (HUD) screens, you can still concentrate on where you touch things and the humans around you, and you can talk to colleagues in a normal way. You can fully focus on what’s best for patient treatment and hygiene control. I’m convinced that our technology in combination with AR technology can help medical personnel do a better and more biosafe job.
Did you test it already in a clinical environment? Is it already available on the market?
At my clinic I have tested Lipit for 15 months – the technology is stable, with no Bluetooth or software bugs. Android and Windows PC applications are available, and the prototype is ready for production, both as a standalone device and in combination with Augmented Reality technologies. It took us almost 10 years to take the project from my idea to a working prototype, and I’m just about to reach out and declare that our technology is now ready for industrialization and that we are ready to talk licensing/cooperation with an AR/VR player. Currently I’m looking for the right individuals at the emerging Augmented Reality forefront: people who understand how crucial the UI is for optimal convenience and precision in Augmented Reality control, people designing and developing protection gear, and people supplying the Augmented Reality technology.
Thank you for your time!
Recently, a video in a LinkedIn post grabbed my attention: it introduced a tablet-based Augmented Reality system for training clinical staff in how to interact with patients.
Simulating complex working situations that require a combination of social, technical and team skills is a highly interesting application domain for Augmented Reality. Nurses and other members of the clinical staff in charge of patient care are confronted with this type of scenario on a daily basis. Augmented Reality gives trainers full design freedom to generate endless training scenarios that reflect real working conditions.
Katy Russell, Business Development Manager at Campus Interactive has kindly answered my small list of questions about this application and the related Augmented Reality App.
Katy, can you describe the intention of your app and the target audience in the medical field?
The purpose is to create empathy and enhance caring whilst delivering the technical skills nurses must provide. By introducing patient scenarios using AR, nurses feel much more connected to what they are doing. It’s much better than a teacher explaining a patient scenario. The target audience is medical, nursing and healthcare schools where training is required on Sim Man.
How do you achieve the overlay/registration of the dummy and the videos showing the actors?
We started by editing the films to remove the backgrounds so the real background could be seen in the AR view. This is important so the nurses can see the monitors etc. Originally we set up a point cloud and aligned the films to the dummy this way. However, we are now using face tracking technology.
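The interview does not describe how Campus Interactive implements the background removal, but the general idea of keying an actor film over the live camera view can be sketched in a few lines. The following is a minimal, assumed illustration using a simple green-screen (chroma-key) test on synthetic frames; the function name, threshold and pixel values are hypothetical, and a production AR pipeline would add proper keying, spill suppression and tracked registration.

```python
import numpy as np

def chroma_key_composite(actor, background, green_thresh=60):
    """Overlay an actor frame onto a background frame by treating
    clearly green pixels as transparent. Both inputs are HxWx3 uint8
    RGB arrays of the same shape. A sketch, not a production keyer."""
    a = actor.astype(np.int16)
    r, g, b = a[..., 0], a[..., 1], a[..., 2]
    # A pixel belongs to the green screen where green clearly
    # dominates both red and blue.
    is_green = (g - np.maximum(r, b)) > green_thresh
    out = np.where(is_green[..., None], background, actor)
    return out.astype(np.uint8)

# Tiny synthetic example: a 2x2 "film" frame where the left column
# is green screen and the right column is the actor.
actor = np.array([[[0, 255, 0], [200, 50, 50]],
                  [[10, 240, 5], [50, 50, 200]]], dtype=np.uint8)
background = np.full((2, 2, 3), 128, dtype=np.uint8)  # stand-in camera view
result = chroma_key_composite(actor, background)
# Green-screen pixels now show the background; actor pixels are kept.
```

In a real system the `background` would be the live tablet camera image, so removing the film's backdrop lets the nurses see the actual monitors and room, as described above.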
Did you do user studies, evaluations that prove the benefit of your approach?
Yes – Mel Lindley, Deborah Clark & Robin Gissing from the University conducted an evaluation. (Author info: the following figure shows the outcome of the study.)
What are the important aspects that can be better learned with your app?
This augmented reality use case is available to all our partner institutions and can be adapted for all medical scenarios, dentistry scenarios, sports therapy scenarios, first aid scenarios and even veterinary scenarios! We tailor-make the videos to suit the cultural circumstances of the area where the nurses are working. Accents, use of language, slang etc. are incorporated for authenticity.
Thank you for your time!