Using Artificial Intelligence to Create Virtual Reality Medical Simulations

Posted by Ashleigh Benfield | September 19, 2017

Overview
The Children’s Hospital Los Angeles VR Medical Simulation project was initiated to create training and learning modules that are interactive and non-linear, so medical students get multiple chances to make the right or wrong decisions within a virtual learning environment. The characters are programmed to be responsive and to offer alternative solutions to medical students while they are in VR. Further, these modules were built to be expandable, so that additional procedures could be designed and developed from the same master framework. This met the sponsors’ (Facebook and Oculus) request that the modules be usable and expandable by other medical professionals. The CHLA VR Medical Simulation project is currently in use at Children’s Hospital Los Angeles, and data on user performance is being gathered and analyzed.

The Beginning
Walk through the halls of Children’s Hospital Los Angeles and it becomes an unforgettable place. Hope and fear, healing and pain are all around in equal measure. Down every corridor, in every room, diagnoses are made, test results delivered, blessings counted.

From late 2016 into February 2017, AiSolve worked with Children’s Hospital Los Angeles (CHLA) to create life-saving medical VR simulations to help train the med students, residents and doctors who make diagnoses and administer treatment to medicine’s youngest and most vulnerable patients.

While we’d love to share many compelling side stories and learnings, in this article we will focus on components from AiSolve’s platform VRSims, including artificial intelligence for non-linear VR learning and our data analytics platform.

We also share the journey from problem to solution in medical simulation and training, and how AiSolve’s AI-embedded content platform, VRSims, was used to develop medical VR content for CHLA and AI and VR tools for the future of med tech.

Solving A Problem in Medical Simulation and Training
In real life, med students, residents and healthcare providers train in a variety of ways. Some use mannequins, others take part in role-playing scenarios and others still use computer-based applications. Often it is a combination of all three that delivers the lessons on how a real procedure or code will unfold in the hospital. These training methods all have their uses, but they come at a price, literally and figuratively.

Traditional medical training and simulation costs hundreds of thousands of dollars per hospital per year; CHLA alone spends $430,000 annually. Beyond the expense, traditional sims lack psychological fidelity: the feeling that students are truly experiencing and participating in a procedure. Plus, typical training and simulation is not highly customizable from one student to the next. VR offers a more affordable, more realistic and more customizable alternative to traditional medical simulation and training.

At AiSolve, we saw the power of virtual simulation when we shipped our first 3D training modules back in 2011. At the time we didn’t have the inexpensive, higher-resolution VR headsets available today, but even so, our vocational skills plumbing sim resulted in faster, better learning by students. Naturally we were eager to help CHLA develop something similar for their institution, and we would collaborate with some exceptional organizations to help make it a reality.

The Partners
Prior to development kick-off we were brought on board this special VR project and paired with doctors from CHLA, plus Facebook and Oculus, who supported this initiative with funding and hardware. In addition, we worked with BioflightVR, a front-end content developer with a specialty in forensic visual effects and Hollywood CG (computer graphics) for television shows such as CBS’s Crime Scene Investigation.

Each partner brought special expertise, passion and commitment that eased the development process, which would take place in a rapid seven-month window.

Together, using AiSolve’s proprietary VRSims simulation platform as the foundation, the partners worked toward a common goal of creating two advanced VR training modules that teach med students and residents what real life emergency conditions are like inside a simulated virtual pediatric resuscitation room.

The Development Process
The goal for us all was simple: create modules as gamified experiences based on exactly what happens in a real hospital resuscitation room, translated to a virtual world where students and residents can practice over and over again, make mistakes and learn until they master the science and art of diagnosis and treatment. Although the goal was simple, achieving it could be done either the simple way or the complex way.

Many training simulations choose the easy road and as a result lack depth, offering only one fixed route to completion. In doing so they miss one of the most vital parts of the way humans learn: making mistakes and experiencing the consequences. By taking the hard path, AiSolve’s VRSims offered a more natural way to learn: the ability to be creative with your approach, and the chance to make mistakes in a safe environment, see the consequences of those mistakes and learn from them. Doing this is no small effort. It requires an understanding of how all the moving parts of a simulation interact with each other and what individual roles they play in achieving a learning outcome. It’s not easy, but we believe it makes an incredible difference.

While pre-production, previsualisation and asset creation took place in Los Angeles, AiSolve’s engineering team in the UK got to work customizing the VRSims platform to accommodate the CHLA medical VR simulations. This included in-depth planning sessions to prepare VRSims for CHLA’s specific requirements: simulating a six-month-old having a seizure (module 1) and a seven-year-old in anaphylactic shock (module 2).

In parallel, our partners CHLA and BioflightVR created scripts and assets and handled motion capture of real doctors and nurses, whose actions our AiSolve team then translated to character models in the Unity engine. Every detail, down to the actual medical equipment and instruments, was recreated in LA and handed over to us to integrate and program into our VRSims platform.

Artificial Intelligence For Non-Linear Learning
The virtual characters would be the most important feature of the modules. Their ability to realistically react and respond to the med students’ actions in real time would be critical to the simulation’s success. AiSolve’s artificial intelligence framework, which we called GAIA and which is built on top of Unity, would form the backbone for character interaction, as well as for other aspects of responsive interactivity within the modules.

For the characters in the room, AiSolve used a tagging system we developed inside GAIA, covering every available action the user could take and the various states the patient could be in. These tags feed the analytics and also let the characters in the room build up a picture of what’s happening. The characters are coded to constantly analyse the situation and react accordingly.
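
To make this concrete, here is a minimal sketch of what such a tagging system could look like. It is illustrative only: the real GAIA framework is built on top of Unity, and the names below (Tag, TagStream, on_tag) are assumptions made for the example, written in Python for brevity.

    # Illustrative sketch only; not AiSolve's actual GAIA code.
    from dataclasses import dataclass, field
    from time import time

    @dataclass
    class Tag:
        name: str                                  # e.g. "ACTION_ADMINISTER_OXYGEN" or "STATE_SEIZING"
        timestamp: float = field(default_factory=time)
        data: dict = field(default_factory=dict)   # optional context (dose, tool used, ...)

    class TagStream:
        """Every user action and patient state change is published as a tag.
        Both the analytics layer and the in-room characters subscribe to it."""
        def __init__(self):
            self.history = []        # full record, later consumed by analytics
            self.listeners = []      # e.g. the virtual nurse, the patient model

        def subscribe(self, listener):
            self.listeners.append(listener)

        def publish(self, tag):
            self.history.append(tag)
            for listener in self.listeners:
                listener.on_tag(tag)   # characters update their picture of events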

For example, AiSolve programmed the nurse with knowledge of the correct protocol for each module, so on easier levels she can give hints about what needs to be done next after a period of inaction or a wrong decision by the user. The nurse also uses her awareness of the correct protocol and the current situation in the resuscitation room, together with the difficulty level, to determine whether certain actions by the user are too early, too late or inappropriate. This is essential to a non-linear simulation module, as the user is able to make many choices, not just the correct steps of the protocol.
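
A hedged sketch of how that protocol awareness might sit on top of the tag stream above. The step names, hint delay and difficulty handling are illustrative assumptions, not CHLA’s actual protocol or AiSolve’s implementation.

    # Hypothetical nurse logic built on the TagStream/Tag sketch above.
    from time import time

    class NurseAI:
        def __init__(self, protocol, difficulty="easy", hint_delay=20.0):
            self.protocol = protocol          # ordered list of expected action tag names
            self.next_step = 0
            self.difficulty = difficulty
            self.hint_delay = hint_delay      # seconds of inaction before a hint
            self.last_progress = time()

        def on_tag(self, tag):                # called by TagStream.publish()
            if self.next_step < len(self.protocol) and tag.name == self.protocol[self.next_step]:
                self.next_step += 1           # correct step: advance the protocol
                self.last_progress = tag.timestamp
            elif tag.name in self.protocol[self.next_step:]:
                self.say("It's a little early for that.")
            elif tag.name in self.protocol[:self.next_step]:
                self.say("We've already dealt with that step.")
            elif tag.name.startswith("ACTION_"):
                self.say("That isn't part of the protocol right now.")

        def update(self):
            # On easier levels, hint at the next step after a period of inaction.
            overdue = time() - self.last_progress > self.hint_delay
            if self.difficulty == "easy" and overdue and self.next_step < len(self.protocol):
                self.say(f"Maybe consider: {self.protocol[self.next_step]}")
                self.last_progress = time()   # avoid repeating the hint every frame

        def say(self, text):
            print(f"Nurse: {text}")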

Another area where AiSolve implemented AI using GAIA in the CHLA Medical VR Simulation project was in the way we simulated the bodies of the patients in modules 1 and 2. Although there are some scripted events in the modules (some impossible to avoid, some for the sake of keeping to the tight schedule we had), most things that happen to the patient are modeled as direct consequences of the effects of medications and tools. For example:

· The colour of the patient’s skin directly correlates with oxygen saturation; it is not artificially influenced by pre-scripted events.
· Stabilising the patient could require reducing the patient’s heart rate to normal levels, which can be done in many different ways. A doctor may choose a specific medication that, in this context, reduces the heart rate enough to return it to the desired level, thereby stabilising the patient.
· Blockages in the throat make standard airway tools (i.e. an oxygen mask) near useless, so oxygen saturation will only increase slightly and temporarily with their use. Only advanced airway tools will permanently stabilise the patient by clearing or bypassing the blockage.

In the modules, tools and medications don’t directly trigger actions and symptoms; instead they alter the patient’s vital signs, and those vital signs at certain levels trigger specific symptoms, just as they would in real life. A minimal sketch of this vitals-driven approach follows below.
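
The sketch below illustrates the idea under stated assumptions: the parameter values, tool names and drug response are made up for the example and are not the clinically tuned behaviours used in the actual modules.

    # Toy vitals-driven patient model; numbers and names are illustrative only.
    class VirtualPatient:
        def __init__(self):
            self.heart_rate = 170          # bpm, elevated at scenario start
            self.oxygen_saturation = 82.0  # percent
            self.airway_blocked = True

        def apply_tool(self, tool):
            if tool == "oxygen_mask" and self.airway_blocked:
                # Basic airway tools give only a small improvement while the
                # blockage remains, and it decays again in update().
                self.oxygen_saturation = min(100.0, self.oxygen_saturation + 3)
            elif tool in ("oxygen_mask", "intubation_kit"):
                if tool == "intubation_kit":
                    self.airway_blocked = False    # advanced tool bypasses the blockage
                self.oxygen_saturation = min(100.0, self.oxygen_saturation + 12)

        def apply_medication(self, drug, dose_mg):
            if drug == "example_rate_control_drug":  # hypothetical dose response
                self.heart_rate = max(60, self.heart_rate - 0.5 * dose_mg)

        def update(self):
            # While the airway is blocked the patient keeps desaturating,
            # so mask-only gains are short-lived.
            if self.airway_blocked:
                self.oxygen_saturation = max(60.0, self.oxygen_saturation - 1)

        def symptoms(self):
            # Symptoms derive from vital signs rather than being scripted.
            if self.oxygen_saturation < 85:
                skin = "cyanotic"
            elif self.oxygen_saturation < 92:
                skin = "pale"
            else:
                skin = "normal"
            stable = self.heart_rate < 120 and self.oxygen_saturation > 94
            return {"skin_colour": skin, "stable": stable}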

This approach allows the CHLA Medical VR Simulation to operate at a much more intelligent level, as evidenced by the fact that we built two modules from the same set of medications and tools without having to program specific behaviours for each. Translation: for future modules we develop with CHLA or other medical partners, significant work is already done, because the VRSims platform AiSolve built follows realistic behaviours based on real medical science rather than being custom coded to fit a specific module or its learning objectives.

Using Analytics To Customize Learning
One of the core components of VRSims is the analytics framework AiSolve has spent a tremendous amount of time refining since the company began. Pure software is compelling, but software that captures and drives analytics on user engagement levels and rates of learning: that’s software at its most powerful.

For the CHLA Medical VR Simulation project, AiSolve integrated our powerful analytics framework, which is also powered by GAIA, so that it hooks into any areas of code we (or the doctors) choose, gathering raw data. The framework then computes relationships between events, as well as their context and timing.

For example, at any one time there are dozens of events being fired off on the software side, causing changes in behaviours and values, meaning the simulation is dynamic and adaptive to the user. Perhaps patient two has to cough, scratch and struggle to breathe all at the same time and the med student/user has to select the right medication to abate the symptoms. As they look for, say, an intubation kit, VRSims is tracking the time it takes for them to select the kit, whether that kit is the right action in the treatment protocol and what the virtual physical response is from the patient.
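
As a rough illustration, response times and protocol correctness could be derived from the tag history described earlier along these lines. The event-name prefixes and the scoring rule are assumptions made for the sketch, not the actual VRSims analytics code.

    # Illustrative post-session analysis over the Tag history sketched above.
    def analyse_session(history, protocol):
        """For each user action, record whether it was in protocol and how long
        the user took to react to the most recent change in the patient's state."""
        records = []
        prompt_time = None
        for tag in history:
            if tag.name.startswith("STATE_"):
                prompt_time = tag.timestamp        # symptom the user should react to
            elif tag.name.startswith("ACTION_") and prompt_time is not None:
                records.append({
                    "action": tag.name,
                    "correct": tag.name in protocol,
                    "response_time_s": tag.timestamp - prompt_time,
                })
        return records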

Judgment, knowledge, accuracy and response times are critical to being a competent physician, and the VRSims analytics system tracks a data-centric version of these, then provides personalised feedback to each user and their teacher so they can improve their performance. Along the way, VRSims gathers large amounts of data that power a study being conducted by CHLA.
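
Rolling those per-action records up into the kind of personalised feedback described above might look something like the following; the thresholds and wording are, again, illustrative assumptions.

    # Hypothetical per-user feedback summary built from analyse_session() output.
    def feedback_report(records):
        if not records:
            return "No actions recorded."
        accuracy = sum(r["correct"] for r in records) / len(records)
        avg_rt = sum(r["response_time_s"] for r in records) / len(records)
        lines = [f"Accuracy: {accuracy:.0%}", f"Average response time: {avg_rt:.1f} s"]
        if accuracy < 0.8:
            lines.append("Review the treatment protocol steps you missed.")
        if avg_rt > 15:
            lines.append("Work on recognising symptoms and acting sooner.")
        return "\n".join(lines)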

The Next Generation of Virtual Learning
This article is just a brief overview of the CHLA Medical VR Simulation project. Our entire AiSolve team had great fun throughout the development process and loved being able to apply VRSims and GAIA to such an impactful application.

At AiSolve, we believe in the power of VR for working, playing and most importantly learning. We’ve been at the forefront of interactive virtual learning since 2010 and are developing VRSims into a tool set for other devs and institutions to use. In this way, AiSolve can continue to expand VR learning beyond our own development and help evolve VR as a learning tool in schools, hospitals, businesses and enterprises.

It’s important to acknowledge, though, that the work has just begun. With partners like Facebook and Oculus supporting projects like the CHLA Medical VR Simulation pilot and using their products to shine a spotlight on this area of VR development, we know that the future of learning in VR is bright.

It was an honor to collaborate with all our partners on the CHLA Medical VR Simulation project, and we look forward to working with them in the future. If you’re an organization interested in building and deploying medical VR or MR content, or enterprise VR and MR content, we encourage you to get in touch to discuss how AiSolve can support your development through VRSims and GAIA.
