Projects Archive

KneeVIEW: A Virtual Education Window for musculoskeletal training (Fall 2017)

Collaborators: Mariana Kersh, PhD, Scott Barrows, MA, FAMI, Dr. Thomas Santoro, David Dominguese, PhD, Anthony Dwyer, Joel Baber, Grace I-Hsuan Hsu, B.Sc., ALM, MS, Meenakshy Aiyer, MD, FACP
Despite the increasing prevalence of orthopedic injuries, clinicians are often poorly equipped to treat musculoskeletal problems. Musculoskeletal training is hampered by limited exposure to clinical patients and a lack of organized clinical instruction. This project aims to develop a realistic knee simulator model, supported by virtual reality and augmented reality educational modules, to enhance clinician training and improve patient outcomes. The biomechanically accurate model will replicate the stiffness of each individual component of the human knee to simulate both normal and pathological cases.

Multi-modal Skin Lesion Identification & Education Simulator (Fall 2017)

Collaborators: Scott Barrows, MA, FAMI, Stephen A. Boppart, M.D., Ph.D., Thomas Golemon, MD, Brent Cross, BS, MS
Current simulated skin and models of skin lesions used in education are unrealistic in both visual and tactile characteristics. This project aims to create a skin simulation model with realistic appearance and texture. In the project’s first phase, the model will consist of 2D surface images of skin lesions displayed on a tablet computer with a translucent elastomer overlay replicating the surface topography of the lesion. Future efforts will seek to extend the model to 3D and incorporate additional features.

Interactive Technology Support for Patient Medication Self-Management (continued funding Fall 2017)

Collaborators: Dan Morrow, PhD, Suma Pallathadka Bhat, PhD, Mark Hasegawa-Johnson, PhD, Thomas Huang, BS, MS, ScD, James Graumlich, MD, Ann Willemsen-Dunlap, PhD, Don Halpin, EMBA, MS
Electronic health record (EHR) systems are underutilized by chronically ill adult patients. A barrier to patient/provider collaboration and self-care via EHR systems is that information in EHRs is technical, not patient-specific. This project aims to develop a natural language processing tool to translate technical information in the EHR into patient-centered language. A prototype translation algorithm has been created, with preliminary results showing the translation is both accurate and easier to understand. Development of a conversational agent (CA) system using an animated avatar to deliver the patient-centered language is also underway. Goals for further development are refinement and expansion of the translation tool and CA capabilities, including making the CA interactive and able to ask and respond to questions.
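The core idea of the translation tool — mapping technical EHR terminology to patient-centered language — can be illustrated with a minimal sketch. This is not the project's actual algorithm; the glossary entries and the `translate` function are hypothetical stand-ins showing the general technique of glossary-driven term rewriting.

```python
# Illustrative sketch only: a minimal glossary-based rewriter for turning
# technical EHR terms into patient-centered language. The glossary and
# function are hypothetical, not the project's actual translation algorithm.
import re

GLOSSARY = {
    "hypertension": "high blood pressure",
    "bid": "twice a day",
    "po": "by mouth",
    "prn": "as needed",
    "myocardial infarction": "heart attack",
}

def translate(text: str) -> str:
    """Replace technical terms with plain-language equivalents."""
    # Replace longer terms first so multi-word phrases win over substrings.
    for term, plain in sorted(GLOSSARY.items(), key=lambda kv: -len(kv[0])):
        # Whole-word, case-insensitive replacement.
        text = re.sub(rf"\b{re.escape(term)}\b", plain, text,
                      flags=re.IGNORECASE)
    return text

print(translate("Take lisinopril PO BID for hypertension."))
# Take lisinopril by mouth twice a day for high blood pressure.
```

A real system would need much more than term substitution (context, dosage reasoning, readability checks), which is where the natural language processing and conversational agent work described above comes in.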

AirwayVR Virtual Reality based trainer for Endotracheal Intubation (Fall 2017)

Collaborators: Pavithra Rajeswaran, Praveen Kumar, MBBS, DCH, MD, Eric Bugaieski, MD, Priti Jani, MD, MPH
Endotracheal intubation is a procedure with risks of severe complications, and this risk has been shown to be associated with operator experience. This project seeks to develop a stable, immersive, high-quality, low-cost VR simulation trainer for learning and practicing intubation. It will create a curriculum for intubation training that uses a VR trainer featuring 3D models of the head and neck and other interactive learning tools. VR input will be provided by a 3D-printed laryngoscope serving as a VR controller. Validation studies will be performed to assess the impact of the VR trainer on intubation training.

Simulation Training for Mechanical Circulatory Support using Extra-Corporeal Membrane Oxygenation (ECMO) in Adult Patients (continued funding Fall 2017)

Collaborators: Pramod Chembrammel, PhD, Matt Bramlet, MD, Pavithra Rajeswaran
Widespread adoption of extra-corporeal membrane oxygenation (ECMO) in adults is limited by the difficulty of deploying cannulae. To address this deficiency, this project aims to build a physical simulator for ECMO training. The trainer will use customized mannequins with flexible vasculature and a programmable pump to simulate the circulatory system. This artificial vasculature will be integrated with the BioGears physiology engine to control simulated physiological parameters. The physical components will be manufactured by 3D printing. Simulation experts and ECMO-experienced surgeons will evaluate the simulator’s performance.

A Natural Language Powered Platform for Post-Operative Care for Long Distance Caregiving (Fall 2017)

Collaborators: Ramavarapu Sreenivas, MS, PhD, Sarah De Ramirez, MD, MSc, Kesh T. Kesavadas, PhD
A 2011 study found that severe (grade IV) postoperative complications cost an average of $159,345 per patient in the US. This project aims to diminish these costs by using a Natural Language Powered Platform that patients can interface with verbally. The project consists of three phases: coding voice commands to fulfill postoperative protocols and testing them in a VR environment; connecting the platform to sensors to process motion assessments and testing these in the VR environment; and conducting studies with test patients at OSF.

Heart Failure & Behavior Change: Patient/Provider Interactive Clinical Education App for Mobile Devices (Fall 2017)

Collaborators: Scott Barrows, MA, FAMI, Wawrzyniec Dobrucki, MS, PhD, Barry Clemson, MD, Kyle Formella, Don Halpin, EMBA, MS, Ann Willemsen-Dunlap, PhD
Heart failure (HF) is a complex physiological ailment that requires high-cost interventions to manage. However, clear communication during the care process has been shown to improve patient outcomes and decrease human and financial burdens. This study aims to use a mobile app to support patients with Stage A, B, or C HF. The aims of this project are to use a literature search and needs analysis to determine gaps and barriers, revise and add interactive 3D visual assets for the application, develop a repository of HF information to be housed in the app, and begin integration of conversational agents developed through previously funded ARCHES projects. Desired outcomes for the project are improved patient communication and understanding of HF and improved adherence to treatment.

Flexible, low-cost, Single Port Minimally Invasive robotic Surgical Platform (Fall 2017)

Collaborators: Placid Ferreira, PhD, Kesh T. Kesavadas, PhD, Nicholas Toombs, Fanxin Wang, Xiao Li, Jorge Correa
Minimally invasive robotic Single Port Laparoscopic Surgery (SPLS) has allowed surgeons to perform various complex procedures with less burden on patients. The downside of these robotic systems is an increased economic, maintenance, and operational burden, resulting in limited hospital access. This project aims to improve upon the team's SPLS prototype to develop a cheaper, more portable, and more flexible device that addresses those issues. Three advancements in the field have already been demonstrated with the prototype. Adding three more improvements will increase the adaptability of the device and lower its price to a point affordable for mid-sized hospitals.

Interactive Mixed Reality (IMR) based Medical Curriculum for Medical Education (Fall 2017)

Collaborators: Kesh T. Kesavadas, PhD, David Crawford, MD, Meenakshy Aiyer, MD, FACP, Jessica Hanks, MD, John Vozenilek, MD
Clinical education and training is a highly complex area, and strides have been taken to improve upon pre-existing methods of teaching. This project aims to combine the strengths of Jump and HCESC to develop a highly interactive learning platform that uses Interactive Mixed Reality, a combination of virtual reality and 360-degree video. The hope is to remove the technical-skills barrier to simulation so that instructors can easily develop educational content. Future goals for the platform are to provide an easy, immersive, and portable way for adult professional learners to maintain, acquire, and improve knowledge while staying in communication with healthcare education centers.

Simulation of postural dysfunction in Parkinson’s disease (Summer 2017)

Led by: Manuel Hernandez from U of I, Dronacharya Lamichhane, MD from OSF HealthCare and UICOMP, and Richard Sowers from U of I.
Falls are a prevalent and significant problem in people with Parkinson’s disease and are associated with gait and balance impairment. Balance impairment in Parkinson’s disease, and the unique contribution of anxiety to it, are poorly understood and difficult to treat.
This team is using a unique test of balance to gain a greater understanding of the coordinated activity of the body and brain, the disruption of this coupling that results from Parkinson’s disease and the influence of dopaminergic therapy.
Using virtual reality, this work will provide health care practitioners with a new tool for long-term monitoring of disease progression and drug-treatment efficacy relevant to a wide range of motor disorders. In addition, it will serve as a platform for simulating the effects of altered sensory and motor integration for the health care practitioners of tomorrow.

Movement impairment characterization and rehabilitation for dystonic cerebral palsy using robotic haptic feedback in virtual reality (Summer 2017)

Led by: Citlali Lopez from U of I and Julian Lin, MD from OSF HealthCare and UICOMP.

Cerebral palsy (CP) is the most common movement problem in children. About 10% of children with CP have dystonia and seek medical assistance at higher rates than those with other forms of CP. Dystonia is a movement disorder with involuntary muscle contractions that cause twisting and repetitive movements, abnormal postures, or both. There is no cure for dystonia, and effective rehabilitation exercises are unknown.

The team working on this project is developing a non-invasive, game-like intervention for patients with dystonic-CP using virtual reality and haptic feedback. The goal is to improve clinical motor scores.

This game-like tool will also double as a training implement for medical practitioners in the identification of complex presentations of motor disorders, not limited to CP.

Multi-modal medical image segmentation, registration and abnormality detection for clinical applications (Fall 2016)

Led by: Thomas Huang from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

This team is developing an automatic 3D segmentation method, making it easier to separate out images of particular organs from an entire 3D rendering. As a result, physicians will be able to better detect abnormalities in medical images.

Developing MRI acquisitions and protocols to enable automated segmentation of cardiac and brain images (Fall 2016)

Led by: Brad Sutton from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

In this project, researchers will develop an imaging protocol that will help physicians get a better picture of the heart and brain. Work will focus on providing maximal differentiation of different tissue types in the brain and heart of patients undergoing MRI diagnostics. This will result in several acquisitions that, when combined, provide maximal tissue separation in a multidimensional histogram. Using open-source algorithms, they will develop processing scripts that automatically create segmented and labeled models of the tissue types and states in a 3D structure of the heart.
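The tissue-separation idea above — treating each voxel's intensities across several acquisitions as a point in a multidimensional histogram and grouping those points into tissue classes — can be sketched briefly. This is an illustration only: the synthetic two-contrast data and the plain k-means clustering loop are assumptions standing in for real MRI acquisitions and the project's open-source processing scripts.

```python
# Illustrative sketch only: clustering voxels in a multidimensional intensity
# space built from multiple MRI acquisitions (contrasts). Synthetic data and
# a minimal k-means loop stand in for real acquisitions and the project's
# actual open-source segmentation pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "acquisitions": two contrasts for three hypothetical tissue types.
tissue_means = np.array([[0.2, 0.8], [0.5, 0.3], [0.9, 0.6]])
voxels = np.vstack(
    [m + 0.05 * rng.standard_normal((500, 2)) for m in tissue_means]
)

def kmeans(x, k, iters=20):
    """Minimal k-means: each voxel's multi-contrast intensity is a point."""
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        # Assign each voxel to its nearest center in contrast space.
        labels = np.argmin(((x[:, None] - centers) ** 2).sum(-1), axis=1)
        # Recompute centers; keep the old center if a cluster empties out.
        centers = np.array([
            x[labels == j].mean(0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

labels, centers = kmeans(voxels, k=3)
print(np.round(centers, 2))  # cluster centers in two-contrast space
```

The point of acquiring several complementary contrasts is visible here: tissues that overlap in any single intensity axis can still form well-separated clusters in the joint space, which is what makes automated labeling feasible.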

Interactive technology support for patient medication self-management (Fall 2016)

Led by: Dan Morrow from U of I and James Graumlich, MD from OSF HealthCare and UICOMP

Researchers are developing a natural language processing tool that translates technical medication information into patient-centered language in electronic medical records (EMR). The group involved in this project is integrating patient-centered language into a conversational agent (CA)-based "medication adviser" system that supports collaboration and emulates best practices gleaned from face-to-face communication techniques. The researchers also will engage patients by developing interactive capabilities, such as using “teachback” when communicating with patients.

Surgical planning via preoperative surgical repair of next generation 3d, patient specific, cardiac mimic (Fall 2016)

Led by: Rashid Bashir from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

This team is working to improve care for pediatric cardiac patients. Researchers will leverage CT imaging and segmentation approaches to create new models for printing 3D infant hearts that mimic the structure, material properties and physical defects of tiny patients. Physicians will use the 3D models to practice surgical techniques and then use imaging methods to evaluate the effectiveness of the procedure.

Multi-Robot minimally invasive single port laparoscopic surgery (Fall 2016)

Led by: Placid Ferreira from U of I and Charles Aprahamian, MD from OSF HealthCare and UICOMP

This team is working to develop a new robotic platform that enables high-fidelity digital simulation, which will facilitate surgical training for clinicians. The robot will allow surgeons to translate their dexterity, torque, and triangulation capabilities into the in-vivo environment and will offer highly configurable and customizable methods for different surgical procedures. In addition, the robot will be portable and easy to use in field and emergency operations, as well as potentially low cost.

Abnormal Muscle Tone Behavior Diagnostic Device - Year 2 (Fall 2016)

Led by: Elizabeth Hsiao-Wecksler from U of I, Steven Tippett from Bradley University and UICOMP, Randy Ewoldt from U of I and Dyveke Pratt, MD from OSF HealthCare

This project will create a novel robotic training simulator that will help learners differentiate between abnormal muscle tone behaviors, which can aid in diagnosing conditions such as stroke, Parkinson’s disease, cerebral palsy, or multiple sclerosis.

Virtual reality system of Patient Specific Heart Model medical education and surgical planning (Summer 2016)

Led by: Lavelle Kesavadas from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

Currently, doctors are using 2D tools and images to visualize a child’s 3D heart and make important surgical decisions. Because of the complex intra- and extra-cardiac relationships and connections, this imperfect method makes it difficult for doctors to accurately diagnose a patient. Researchers at the Health Care Engineering Systems Center at U of I and Jump Simulation, a part of OSF Innovation, are using 3D immersive virtual reality technology to help solve this problem. They have created an intuitive model generated from patient-specific MRIs and viewed through stereoscopic 3D head-mounted displays.

Safety and Reliability of Surgical Robots via Simulation (Spring 2016)

Led by: Ravishankar Iyer from U of I and David Crawford, MD from OSF HealthCare and UICOMP

In 2015, researchers at Illinois, MIT, and Rush University Medical Center reported that surgical robots had caused 144 deaths in 14 years. Now, computer engineers at Illinois and surgeons at OSF Saint Francis Medical Center in Peoria are collaborating on new research to improve the reliability and safety of minimally invasive robotic surgery.

This research will create platforms for simulation of realistic safety-hazard scenarios in robotic surgery and develop tools and techniques for the design and evaluation of the next generation of resilient surgical robots. The work will help improve not only the safety of robotic surgical systems, but also simulation-based training of future surgeons.

Patient Discharge Process and Communications Simulation Training (Spring 2016)

Led by: Deborah Thurston from U of I and Richard Pearl, MD from OSF HealthCare and UICOMP

About 20-25% of patients discharged from hospitals are readmitted within 30 days, costing insurance providers roughly $42 billion per year, according to the Agency for Healthcare Research and Quality. These costs are now the responsibility of Accountable Care Organizations (ACOs) like OSF HealthCare.

In some cases, patients are discharged too soon or with inappropriate treatment. Or patients may not understand and/or comply with discharge instructions such as how they are supposed to take their medications and what levels of activities they are able to do. There are a variety of proposed tools and techniques available to reduce readmissions, but there is no holistic system addressing the issue.

As part of ARCHES-funded research, a framework is being developed that will help define the complexity of the total patient discharge system and allow hospitals to evaluate new technology, policy, and communication systems within the construct of training simulation strategies.

Simulation training for mechanical circulatory support using extra-corporeal membrane oxygenation (ECMO) in adult patients (Spring 2016)

Led by: Pramod Chembrammel from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

This team is developing a simulator to train surgeons in using extra-corporeal membrane oxygenation (ECMO) to provide artificial oxygenation to blood cells. This skill, which is difficult to perfect without practicing on real patients, helps save failing hearts and lungs during a surgery. The researchers are modifying the DR Doppler™ blood flow simulator, which simulates blood flow in the vasculature, to develop a working prototype where the blood flow changes colors based on oxygenation.


Simulation Training to Identify Fall Risk in the Home Environment (Spring 2016)

Led by: Rama Ratnam from U of I and Julia Biernot, MD from OSF HealthCare

Falls are a leading cause of serious injury and death in the elderly. There is a need to find a cost-efficient and easy means of evaluating fall risks, identifying muscle weaknesses, and establishing the potential for loss of balance in the home. Further, there is an equal need to train clinicians to evaluate elderly patients at risk for falling, and to better identify fall risk from postural and movement analysis.

Engineers with the U of I have developed a home-based tele-rehabilitation system that is inexpensive and capable of accurately recording and analyzing posture and balance during movement transitions. Researchers will test the validity of this system against a standardized method of determining fall risk.

The goal is for the system to allow for targeted intervention in an individual’s home and to better train clinicians in fall risk assessment, offering unparalleled opportunities to examine body dynamics in great detail and better understand postural control.
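To make the posture analysis above concrete, a minimal sketch of the kind of summary metric such a system might compute is shown below. The synthetic center-of-pressure (COP) trace and the choice of metrics (RMS sway distance and sway path length, both standard in posturography) are assumptions for illustration, not the team's actual analysis pipeline.

```python
# Illustrative sketch only: simple postural-sway summary metrics (RMS
# displacement and sway path length) from a center-of-pressure (COP) trace,
# the kind of signal a home balance-assessment system might record. The
# synthetic COP data and metric choices are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)
fs = 100.0                     # sampling rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)   # 30 s quiet-stance trial

# Synthetic anterior-posterior / medio-lateral COP in centimeters:
# slow sinusoidal sway plus small measurement noise.
cop = np.column_stack([
    0.5 * np.sin(2 * np.pi * 0.3 * t),
    0.2 * np.sin(2 * np.pi * 0.5 * t + 1.0),
]) + 0.05 * rng.standard_normal((t.size, 2))

def sway_metrics(cop):
    """RMS distance from the mean COP, and total sway path length."""
    centered = cop - cop.mean(axis=0)
    rms = np.sqrt((centered ** 2).sum(axis=1).mean())
    path = np.linalg.norm(np.diff(cop, axis=0), axis=1).sum()
    return rms, path

rms, path = sway_metrics(cop)
print(f"RMS sway: {rms:.2f} cm, path length: {path:.1f} cm")
```

Larger RMS sway or path length during quiet stance and movement transitions is one of the postural cues a clinician — or a tele-rehabilitation system — could track over time for fall-risk assessment.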

Abnormal Muscle Tone Behavior Diagnostic Device - Year 2 (Fall 2015)

Led by: Elizabeth Hsiao-Wecksler from U of I, Steven Tippett from Bradley University and UICOMP, Randy Ewoldt from U of I and Dyveke Pratt, MD from OSF HealthCare

This project will create a novel robotic training simulator that will help learners differentiate between abnormal muscle tone behaviors, which can aid in diagnosing conditions such as stroke, Parkinson’s disease, cerebral palsy, or multiple sclerosis.

Personalized Avatars In Patient Portals (Fall 2015)

Led by: Thomas Huang from U of I and Ann Willemsen-Dunlap from OSF HealthCare

This team is developing a 3D audio-visual avatar, capable of showing appropriate emotions as controlled by health care providers, that will be used in online patient portals to help patients understand their specific medical information, such as test results and medical guidance.