Past Projects Archive

January 2020

A HUMAN FACTORS APPROACH TO FOOD SECURITY

Collaborators: Dr. Sarah Stewart de Ramirez, OSF HealthCare and Abigail R. Wooldridge, University of Illinois Grainger College of Engineering

Thirty-seven million Americans experience food insecurity, which results in poor health and increased health care costs. This project will use a human-centered approach to identify barriers for individuals who are food insecure and challenges for service providers trying to meet needs in rural communities. That research will support the design of technology-based solutions to reduce food insecurity in rural areas.

A-EYE: AUTOMATED RETINOPATHY OF PREMATURITY DETECTION AND ANALYSIS

Collaborators: Dr. C. Reddy, OSF HealthCare and Thomas Huang, U of I Beckman Institute for Advanced Science and Technology

Early detection of retinopathy in premature infants is important for early interventions to prevent blindness. With a shortage of specialists, it’s critically important to develop an AI diagnostic system that autonomously analyzes images of the retina to detect retinopathy. The team will also consider how to integrate the tool into portable, user-friendly equipment, with the possibility of future expanded uses for such a medical device.

ACTIVATE CAPTURE AND DIGITAL COUNTING (AC+DC) TECHNOLOGY FOR ULTRASENSITIVE AND RAPID CHARACTERIZATION OF miRNA BLOOD BORNE BIOMARKERS FOR ALS

Collaborators: Dr. Vahid Tohidi, OSF HealthCare and Brian Cunningham, U of I Grainger College of Engineering

ALS is a devastating condition that leads to gradual muscle decline caused by loss of motor neurons in the brain and spinal cord, and it is in urgent need of new treatments. The goal of this proposal is to develop and validate nanoparticle technology that can use a small amount of blood plasma to identify miRNA biomarkers of ALS. The team will also develop an instrument that uses just a drop of blood to detect statistically significant levels of circulating biomarkers and identify genetic indicators of ALS.

AI AUGMENTED PORTABLE PHOTOACOUSTIC IMAGING SYSTEM FOR EARLY DIAGNOSIS OF BREAST CANCER

Collaborators: Dr. Kent Hoskins, OSF HealthCare and Yun-Sheng Chen, U of I Beckman Institute for Advanced Science and Technology

This research aims to harness artificial intelligence (AI) to develop an affordable, portable imaging solution for breast cancer screening and diagnosis that could be more accessible to residents in rural communities. The team is proposing to use photoacoustic (PA) imaging techniques that combine optical (photo) and ultrasound (acoustic) approaches to produce high-contrast, molecular images of breast blood vessel and lymphatic systems for early breast cancer diagnosis.

AUTONOMOUS MORPHING BED MATTRESS FOR ALS PATIENTS WITH LIMITED MOVEMENT ABILITY

Collaborators: Dr. Christopher Zallek, OSF HealthCare/University of Illinois College of Medicine Peoria and Elizabeth Hsiao-Wecksler, U of I Grainger College of Engineering

This project will address complications from limited to no movement ability of adults while lying in bed, including patients with ALS who have weak muscles and loss of ability to control them. The team will develop an innovative bed mattress consisting of an array of soft air cells that will autonomously pressurize and depressurize specific areas to provide site-specific pressure relief, tilted repositioning and assistance with transferring while the patient is lying flat or has their head elevated.

AUTOMATED ANEURYSM SEGMENTATION AND MEASUREMENT

Collaborators: Dr. Jeff Klopfenstein, OSF HealthCare and Thomas Huang, U of I Beckman Institute for Advanced Science and Technology

Cerebral aneurysms are among the deadliest types of aneurysm. This group will build a large-scale dataset to create an algorithm that identifies and segments the bulging blood vessels based on size and blood flow. The results will be used for future medical imaging instruction and to develop computer programs that help with treatment decisions.

DESIGN AND VALIDATION OF A SOFT ROBOTIC CARDIAC TRANSSEPTAL PUNCTURE SIMULATOR

Collaborators: Dr. Abraham Kocheril, OSF HealthCare and Girish Krishnan, U of I Grainger College of Engineering

This project continues work on a realistic soft heart simulator that allows early-career cardiologists and surgeons to feel what it’s like to poke and prod cardiac tissues during a common surgery for patients with an irregular heartbeat. Phase II will enhance the level of realism by fine-tuning the materials used and incorporating image-based guidance.

DEVELOPMENT OF A DIGITAL FALL RISK ASSESSMENT AND PREVENTION TOOL FOR RURAL OLDER ADULTS

Collaborators: Dr. Sarah Stewart de Ramirez, OSF HealthCare and Jacob Sosnoff, U of I Beckman Institute for Advanced Science and Technology

Falls are the number one cause of accidental injury in older adults. This project will use a machine learning algorithm to power a fall risk assessment and prevention application as part of a community health worker’s digital toolkit. Researchers will also assess the usability of the “Steady” tool.

DIGITIZING THE NEUROLOGICAL SCREENING EXAMINATION

Collaborators: Dr. Christopher Zallek, OSF HealthCare/UICOMP and George Heintz, U of I Health Care Engineering Systems Center

There’s a projected 19% national shortage of neurologists by 2025, yet 9% of primary care visits involve patients who have neurological issues. This project will pilot an integrated Digital Neurological Examination (DNE) system and develop a platform that uses the data for an AI-informed decision support assistant. The assistant will help physicians triage and care for patients with neurological symptoms regardless of exam location.

IMPROVING FEEDBACK AND EFFICIENCY: AUTOMATED GRADING OF POST SIMULATION WRITTEN CHART NOTES

Collaborators: Dr. William Bond, OSF HealthCare and Suma Bhat, U of I Grainger College of Engineering

Immediate feedback fosters the best learning. This project aims to improve Automated Short Answer Grading (ASAG) by applying Natural Language Processing (NLP) methods to previously collected and graded chart notes written after simulations with standardized participants (actor-based simulations). The tools developed will also reduce faculty grading demands and can be applied to training on other topics, including opiate use, telehealth use and patient counseling.
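As a purely illustrative sketch (the team’s actual NLP models and chart-note data are not detailed in this summary), one common ASAG baseline scores a trainee’s answer by its textual similarity to faculty-written reference answers. All answers and thresholds below are invented for illustration.

```python
# Minimal, hypothetical ASAG baseline: score a trainee's chart-note answer by its
# TF-IDF cosine similarity to faculty-written reference answers for a rubric item.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reference_answers = [
    "Patient presents with acute chest pain radiating to the left arm.",
    "Chest pain radiating to the left arm, concerning for acute coronary syndrome.",
]
trainee_answer = "Pt reports chest pain going down the left arm."

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(reference_answers + [trainee_answer])

# Similarity of the trainee answer (last row) to each reference answer.
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
best = scores.max()

# Map similarity to a coarse rubric grade (illustrative cutoffs only).
grade = "full credit" if best > 0.6 else "partial credit" if best > 0.3 else "needs review"
print(f"similarity={best:.2f} -> {grade}")
```

More sophisticated approaches replace the TF-IDF step with trained language models, but the grading loop stays the same: compare the learner’s note to exemplar answers and map the score to rubric feedback.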

IMPROVING OUTCOMES AND TRAINING OF PECTUS EXCAVATUM

Collaborators: Dr. Paul Jeziorczak, OSF HealthCare and Inki Kim, U of I Grainger College of Engineering

This team will develop a process using virtual and augmented reality to improve patient education, resident training and placement of an internal metal chest brace for patients with pectus excavatum, or sunken chest, a condition that can impair heart and lung function. The team will build on work already done with pediatric hearts and create a training model using 3D printed chest walls, along with a virtual reality module for self-study and pre-operative planning.

OPTIMIZING DEPLOYMENT OF COMMUNITY HEALTH WORKERS

Collaborators: Dr. Sarah Stewart de Ramirez, OSF HealthCare and Hyojung Kang, U of I College of Applied Health Sciences

Community Health Workers (CHWs) are effective at improving health and lowering health care costs for vulnerable populations, such as those living in rural areas where access to health care is limited and health outcomes are poor. The project will create data-driven algorithms to support optimal deployment of precision-guided, digitally enabled CHWs in rural settings.
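As an illustrative sketch only (the project’s actual data sources and algorithms are not described in this summary), one simple way to frame optimal deployment is as an assignment problem that matches each CHW to the community where the most unmet need would be addressed. The cost matrix below is made up.

```python
# Hypothetical example: assign CHWs to rural communities so that the total
# remaining unmet need is minimized (a classic assignment problem).
import numpy as np
from scipy.optimize import linear_sum_assignment

# Rows = CHWs, columns = communities; each entry is the estimated unmet need
# that would remain if that CHW were deployed there (lower is better).
remaining_need = np.array([
    [4.0, 2.5, 7.0],
    [3.5, 6.0, 2.0],
    [5.0, 4.5, 3.0],
])

chw_idx, community_idx = linear_sum_assignment(remaining_need)
for c, k in zip(chw_idx, community_idx):
    print(f"CHW {c} -> community {k} (remaining need {remaining_need[c, k]})")
```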

SKILL ASSESSMENT IN SURGERY AND MICROSURGERY

Collaborators: Dr. Heidi Phillips, U of I College of Veterinary Medicine and T. Kesavadas, U of I Health Care Engineering Systems Center

We propose applying advanced engineering and data science to develop a high-fidelity virtual simulator that provides thorough and validated microsurgical training and assessment. The team will develop an evidence-supported, automated, robust, real-time, comprehensive and quantitative (ARRCQ) assessment system by building data sets and creating algorithms that optimize learning, accuracy and cost.

VIRTUAL REALITY TO DELIVER PSYCHOTHERAPY TO LUNG CANCER PATIENTS WITH DEPRESSION

Collaborators: Dr. Rhonda L. Johnson, OSF HealthCare and Rosalba Hernandez, U of I School of Social Work

More than half of all lung cancer patients experience depression, which reduces their compliance with treatment, increases hospitalizations and ultimately decreases survival rates. With a shortage of psychotherapists across the country, especially in rural areas, this project’s virtual reality (VR) platform could fill the void. For example, VR programs could transport users to relaxing environments with guided meditation. If successful, this treatment could be used as patients receive chemotherapy before or after radiation.

Fall 2019

Using Simulation to Evaluate and Improve Team Cognition in Handoffs

Collaborators: Abigail Wooldridge, Illinois/Industrial and Enterprise Systems Engineering and Paul Jeziorczak, OSF

This project is the continuation of earlier research. It attempts to better measure the impact of improvements made to the handoff process; handoffs are important because they provide opportunities to detect and correct errors. Recent work has conceptualized handoffs as team cognition, measured using human factors techniques outside of health care. Researchers believe team cognition theory can be applied to improve handoffs with education and technology-based interventions.

Lung Cancer Radiomics and Radiogenomics

Collaborators: Minh N. Do, Illinois/Coordinated Science Laboratory and Joseph R. Evans, OSF/UICOMP

In an attempt to reduce the leading cause of cancer deaths in the United States, this project would combine imaging and genomic features to develop a radiogenomics risk signature, offering valuable information about the aggressiveness of a newly diagnosed lung cancer. Furthermore, this project takes advantage of and extends the OSF lung cancer screening program by establishing IRB-approved imaging and pathology repositories.

Mixed-Reality Based Visualization and Simulation of Nerve Conduction Study

Collaborators: Vahid Tohidi, OSF and Pramod Chembrammel, Illinois/Health Care Engineering Systems Center

This proposal attempts to use a mixed-reality technology platform to train medical students, technicians, neurology residents and fellows to better recognize pathological patterns in the results of nerve conduction studies (NCS). Researchers believe this type of education will shorten the learning curve for accurate and effective application of NCS data in diagnosing peripheral nerve disorders, which can be debilitating for those affected.

Surgical Planning Via Preoperative Surgical Repair of Next-Generation 3D, Patient-Specific, Cardiac Mimic

Collaborators: Hyunjoon Kong, Illinois/Bioengineering and Mark D. Plunkett, OSF

This project aims to 3D print realistic physical organs and tissues to help surgeons better plan for specific operations and train new surgeons. This team has developed a 3D printing approach, using materials that mimic the softness and toughness of anatomy. This work is expected to advance the field of clinical simulation to the next level.

i-AREA-p: An Intelligent Mobility-Based Augmented Reality Simulation Application for Pediatric Resuscitation Training

Collaborators: Trina Croland, OSF/UICOMP and Abigail Wooldridge, Illinois/Industrial & Enterprise Systems Engineering

Jump Simulation created an augmented reality-based Pediatric Code Cart app that allows medical students and professionals to easily learn about the contents of the cart, how it works, and how to use it in the event of a pediatric emergency. This team will work to expand this platform to include additional adult resuscitation modules as well as procedural skills elements related to pediatric resuscitation.

Robotic Arm Neurological Exam Training Simulator for Abnormal Muscle Tone

Collaborators: Elizabeth Hsiao-Wecksler, Illinois/Mechanical Science and Engineering and Christopher Zallek, OSF/UICOMP

This group is expanding its work to create multiple robotic arm simulators that mimic abnormal muscle behaviors. These training devices are expected to help medical students, interns, residents, nurses and physical/occupational therapists understand the difference between spasticity and rigidity in patients to correctly diagnose neurological conditions.

Pediatric Sepsis Guidance System

Collaborators: Lui Raymond Sha, Illinois/Computer Science and Richard Pearl, OSF/UICOMP

In an effort to help clinicians diagnose sepsis in pediatric patients sooner, this team is creating a computerized pediatric sepsis best practice guidance system. This software will allow for early detection, diagnosis and treatment of sepsis in children. The goal is to improve patient care and reduce medical errors. It will first be tested in a simulation setting.

Multi-modal Skin Lesion Identification & Education Simulator: Augmented Reality Interactive Skin Lesion App

Collaborators: Scott Barrows, OSF/Jump and Steve Boppart, Illinois/Bioengineering

This project expands on an augmented reality-based mobile app developed last year to train medical students in the identification, diagnosis and treatment of skin lesions, masses and other abnormalities. The second phase aims to give learners the ability to see beneath the skin to view skin lesions and their pathologies that cannot be seen on the surface.

Integrating Soft Actuators in a Heart Simulator to Mimic Force Feedback in Cardiac Trans-Septal Puncture

Collaborators: Girish Krishnan, Illinois/Industrial and Enterprise Systems Engineering and Abraham Kocheril, OSF

This team is creating a realistic soft heart simulator that allows learners to feel what it’s like to poke and prod cardiac tissues to make crucial operating decisions. While this simulation device targets a specific surgical process for the heart, the idea is to create more soft structures for other surgical procedures.

Virtual Heart Patch for Determining Complex Shapes for Surgical Patching

Collaborators: Arif Masud, Illinois/Civil and Environmental Engineering and Matthew Bramlet, OSF/UICOMP

This group is developing a software module that allows surgeons to simulate the creation of complexly shaped 2D heart patches in a virtual reality environment. Surgeons would use this simulation to determine the size and shape of a patch that needs to be cut from a 2D sheet of flexible, cloth-like material for use in a real heart patch surgery.

Automated and adaptive whole-body segmentation for visualization of anatomy, lesions, and intervention pathways for medical training

Collaborators: Brad Sutton, Illinois/Bioengineering and Matthew Bramlet, OSF/UICOMP

This project expands on a previous effort to develop an automated segmentation program that creates 3D models of congenital heart defects, viewable in a variety of digital formats. The current proposal seeks to develop another automated segmentation platform for the creation of 3D content of the whole body for medical training in virtual reality.

Fall 2017

KneeVIEW: A Virtual Education Window for musculoskeletal training

Collaborators: Mariana Kersh, PhD, Scott Barrows, MA, FAMI, Dr. Thomas Santoro, David Dominguese, PhD, Anthony Dwyer, Joel Baber, Grace I-Hsuan Hsu, B.Sc., ALM, MS, Meenakshy Aiyer, MD, FACP
 
Despite the increasing prevalence of orthopedic injuries, clinicians are poorly equipped to treat musculoskeletal problems. Musculoskeletal training is often ineffective due to limited exposure to clinical patients and a lack of organized clinical instruction. This project aims to develop a realistic knee simulator model, supported by virtual reality and augmented reality educational modules, to enhance clinician training and improve patient outcomes. The biomechanically accurate model will replicate the stiffness of each individual component of the human knee to simulate both normal and pathological cases.
 

Multi-modal Skin Lesion Identification & Education Simulator

Collaborators: Scott Barrows, MA, FAMI, Stephen A. Boppart, M.D., Ph.D., Thomas Golemon, MD, Brent Cross, BS, MS
 
Current simulated skin and models of skin lesions used in education are unrealistic in both visual and tactile characteristics. This project aims to create a skin simulation model with realistic appearance and texture. In the project’s first phase, the model will consist of 2D surface images of skin lesions displayed on a tablet computer with a translucent elastomer overlay replicating the surface topography of the lesion. Future efforts will seek to extend the model to 3D and incorporate additional features.
 

Interactive Technology Support for Patient Medication Self-Management (continued funding)

Collaborators: Dan Morrow, PhD, Suma Pallathadka Bhat, PhD, Mark Hasegawa-Johnson, PhD, Thomas Huang, BS, MS, ScD, James Graumlich, MD, Ann Willemsen-Dunlap, PhD, Don Halpin, EMBA, MS
 
Electronic health record (EHR) systems are underutilized by chronically ill adult patients. A barrier to patient/provider collaboration and self-care via EHR systems is that information in EHRs is technical, not patient-specific. This project aims to develop a natural language processing tool to translate technical information in the EHR into patient-centered language. A prototype translation algorithm has been created, with preliminary results showing the translation is both accurate and easier to understand. Development of a conversational agent (CA) system using an animated avatar to deliver the patient-centered language is also underway. Goals for further development are refinement and expansion of the translation tool and CA capabilities, including making the CA interactive and able to ask and respond to questions.
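As a toy illustration of the general idea only (not the prototype translation algorithm, which this summary does not describe), technical EHR terms could be mapped to plain-language equivalents using a small glossary; the entries below are examples chosen for illustration.

```python
# Hypothetical glossary-based substitution: replace technical EHR terms with
# patient-centered phrasing. Real systems use far richer NLP than this.
import re

GLOSSARY = {
    "hypertension": "high blood pressure",
    "hyperlipidemia": "high cholesterol",
    "myocardial infarction": "heart attack",
}

def patient_friendly(text: str) -> str:
    """Replace known technical terms, case-insensitively, with plain language."""
    for term, plain in GLOSSARY.items():
        text = re.sub(rf"\b{re.escape(term)}\b", plain, text, flags=re.IGNORECASE)
    return text

print(patient_friendly("History of hypertension and hyperlipidemia."))
# -> "History of high blood pressure and high cholesterol."
```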
 

AirwayVR Virtual Reality based trainer for Endotracheal Intubation

Collaborators: Pavithra Rajeswaran, Praveen Kumar, MBBS, DCH, MD, Eric Bugaieski, MD, Priti Jani, MD, MPH
 
Endotracheal intubation is a procedure with risks of severe complications, and this risk has been found to be associated with the provider’s level of experience. This project seeks to develop a stable, immersive, high-quality, low-cost VR simulation trainer for learning and practicing intubation. It will create a curriculum for intubation training that uses a VR trainer featuring 3D models of the head and neck and other interactive learning tools. VR input will be provided by a 3D-printed laryngoscope used as a VR controller. Validation studies will be performed to assess the impact of the VR trainer on intubation training.
 

Simulation Training for Mechanical Circulatory Support using Extra-Corporeal Membrane Oxygenation (ECMO) in Adult Patients (continued funding)

Collaborators: Pramod Chembrammel, PhD, Matt Bramlet, MD, Pavithra Rajeswaran
 
Widespread adoption of extra-corporeal membrane oxygenation (ECMO) in adults is limited by the difficulty of deploying cannulae. To address this deficiency, this project aims to build a physical simulator for ECMO training. The trainer will use customized mannequins with flexible vasculature and a programmable pump to simulate the circulatory system. This artificial vasculature will be integrated with the BioGears physiology engine to control simulated physiological parameters. The physical components will be manufactured by 3D printing. Simulation experts and ECMO-experienced surgeons will evaluate the simulator’s performance.
 

A Natural Language Powered Platform for Post-Operative Care for Long Distance Caregiving

Collaborators: Ramavarapu Sreenivas, MS, PhD, Sarah De Ramirez, MD, MSc, Kesh T. Kesavadas, PhD
 
A 2011 study found that a patient with severe, grade IV postoperative complications costs the US $159,345. This project aims to reduce these costs by using a Natural Language Powered Platform that patients can interact with verbally. The project consists of three phases: coding voice commands to fulfill postoperative protocols and testing them in a VR environment; connecting the platform to sensors to see whether it can process motion assessments, and testing these in the VR environment; and conducting studies with test patients at OSF.
 

Heart Failure & Behavior Change: Patient/Provider Interactive Clinical Education App for Mobile Devices

Collaborators: Scott Barrows, MA, FAMI, Wawrzyniec Dobrucki, MS, PhD, Barry Clemson, MD, Kyle Formella, Don Halpin, EMBA, MS, Ann Willemsen-Dunlap, PhD
 
Heart failure (HF) is a complex physiological ailment that requires high-cost interventions to manage. However, it has been shown that clear communication during the process improves patient outcomes and decreases human and financial burdens. This study aims to use a mobile app to support patients with Stage A, B and C HF. The aims of this project are to use a literature search and needs analysis to determine gaps and barriers, revise and add interactive 3D visual assets for the application, develop a repository of HF information to be housed in the app, and begin integration of conversational agents developed through previously funded ARCHES projects. Desired outcomes for the project are improved communication and understanding of HF for patients and improved adherence to treatment by patients.
 

Flexible, low-cost, Single Port Minimally Invasive robotic Surgical Platform

Collaborators: Placid Ferreira, PhD, Kesh T. Kesavadas, PhD, Nicholas Toombs, Fanxin Wang, Xiao Li, Jorge Correa
 
Minimally invasive robotic Single Port Laparoscopic Surgery (SPLS) has allowed surgeons to perform various complex procedures with less burden on patients. The downside to these robotic systems is an increased economic, maintenance and operational burden, resulting in limited hospital access. This project aims to improve upon the team’s SPLS prototype to develop a cheaper, more portable and more flexible device that addresses those issues. Three advancements in the field have already been demonstrated with the prototype. Adding three more improvements will increase the adaptability of the device and lower the price to a point affordable for mid-size hospitals.
 

Interactive Mixed Reality (IMR) based Medical Curriculum for Medical Education

Collaborators: Kesh T. Kesavadas, PhD, David Crawford, MD, Meenakshy Aiyer, MD, FACP, Jessica Hanks, MD, John Vozenilek, MD
 
Clinical education and training is a highly complex area, and strides have been taken to improve upon pre-existing methods of teaching. This project aims to combine the strengths of Jump and HCESC to develop a highly interactive learning platform that uses Interactive Mixed Reality, a combination of virtual reality and 360-degree video. The hope is to eliminate the technical-skill barrier of simulation development so that instructors can easily create educational content. Future goals of the platform are to provide an easy, immersive and portable method for adult professional learners to acquire, maintain and improve knowledge while staying connected to health care education centers.

Summer 2017

Simulation of postural dysfunction in Parkinson’s disease

Led by: Manuel Hernandez from U of I, Dronacharya Lamichhane, MD from OSF HealthCare and UICOMP and Richard Sowers from U of I.
 
Falls are a prevalent and significant problem in people with Parkinson’s disease and are associated with gait and balance impairment. Balance impairment in Parkinson’s disease, and the unique contributions of anxiety to it, are poorly understood and difficult to treat.
 
This team is using a unique test of balance to gain a greater understanding of the coordinated activity of the body and brain, the disruption of this coupling that results from Parkinson’s disease and the influence of dopaminergic therapy.
 
Using virtual reality, this work will provide health care practitioners with a new tool for long-term monitoring of disease progression and drug treatment efficacy relevant to a wide range of motor disorders. In addition, it will serve as a platform for simulating the effects of altered sensory and motor integration function for the health care practitioners of tomorrow.
 

Movement impairment characterization and rehabilitation for dystonic cerebral palsy using robotic haptic feedback in virtual reality

Led by: Citlali Lopez from U of I and Julian Lin, MD from OSF HealthCare and UICOMP.

Cerebral palsy (CP) is the most common movement problem in children. Ten percent of children with CP have dystonia and seek medical assistance at higher rates than those with other forms of CP. Dystonia is a movement disorder with involuntary muscle contractions that cause twisting and repetitive movements, abnormal postures, or both. There is no cure for dystonia, and effective rehabilitation exercises are unknown.

The team working on this project is developing a non-invasive, game-like intervention for patients with dystonic-CP using virtual reality and haptic feedback. The goal is to improve clinical motor scores.

This game-like tool will also double as a training implement for medical practitioners in the identification of complex presentations of motor disorders, not limited to CP.

Summer 2016

Virtual reality system of Patient Specific Heart Model medical education and surgical planning

Led by: Lavelle Kesavadas from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

Currently, doctors are using 2D tools and images to visualize a child’s 3D heart and make important surgical decisions. Because of the complex intra- and extra-cardiac relationships and connections, this imperfect method makes it difficult for doctors to accurately diagnose a patient. Researchers at the Health Care Engineering Systems Center at U of I and Jump Simulation, a part of OSF Innovation, are using 3D immersive virtual reality technology to help solve this problem. They have created an intuitive model generated from patient-specific MRIs using stereoscopic 3D head-mounted displays.

Spring 2016

Safety and Reliability of Surgical Robots via Simulation

Led by: Ravishankar Iyer from U of I and David Crawford, MD from OSF HealthCare and UICOMP

In 2015, researchers at Illinois, MIT and Rush University Medical Center reported that surgical robots had been linked to 144 deaths over 14 years. Now, computer engineers at Illinois and surgeons at OSF Saint Francis Medical Center in Peoria are collaborating on new research to improve the reliability and safety of minimally invasive robotic surgery.

This research will create platforms for simulation of realistic safety-hazard scenarios in robotic surgery and develop tools and techniques for the design and evaluation of the next generation of resilient surgical robots. The work will help improve not only the safety of robotic surgical systems, but also simulation-based training of future surgeons.
 

Patient Discharge Process and Communications Simulation Training

Led by: Deborah Thurston from U of I and Richard Pearl, MD from OSF HealthCare and UICOMP

About 20-25% of patients discharged from hospitals are readmitted within 30 days, costing insurance providers roughly $42 billion per year, according to the Agency for Healthcare Research and Quality. These costs are now the responsibility of Accountable Care Organizations (ACOs) like OSF HealthCare.

In some cases, patients are discharged too soon or with inappropriate treatment. Or patients may not understand and/or comply with discharge instructions, such as how they are supposed to take their medications and what levels of activity they can safely do. There are a variety of proposed tools and techniques available to reduce readmissions, but there is no holistic system addressing the issue.

As part of ARCHES-funded research, a framework is being developed that will help define the complexity of the total patient discharge system and allow hospitals to evaluate new technology, policy and communication systems in the context of training simulation strategies.
 

Simulation training for mechanical circulatory support using extra-corporeal membrane oxygenation (ECMO) in adult patients

Led by: Pramod Chembrammel from U of I and Matthew Bramlet, MD from OSF HealthCare and UICOMP

This team is developing a simulator to train surgeons in using extra-corporeal membrane oxygenation (ECMO) to provide artificial oxygenation to blood cells. This skill, which is difficult to perfect without practicing on real patients, helps save failing hearts and lungs during surgery. The researchers are modifying the DR Doppler™ blood flow simulator, which simulates blood flow in the vasculature, to develop a working prototype in which the blood flow changes color based on oxygenation.


Simulation Training to Identify Fall Risk in the Home Environment

Led by: Rama Ratnam from U of I and Julia Biernot, MD from OSF HealthCare

Falls are a leading cause of serious injury and death in the elderly. There is a need to find a cost-efficient and easy means of evaluating fall risks, identifying muscle weaknesses, and establishing the potential for loss of balance in the home. Further, there is an equal need to train clinicians to evaluate elderly patients at risk for falling, and to better identify fall risk from postural and movement analysis.

Engineers with the U of I have developed a home-based tele-rehabilitation system that is inexpensive and capable of accurately recording and analyzing posture and balance during movement transitions. Researchers will test the validity of this system against a standardized method of determining fall risk.

The goal is for the system to allow for targeted intervention in an individual’s home and to better train clinicians in fall risk assessment, offering unparalleled opportunities to examine body dynamics in great detail and better understand postural control.

Fall 2015

Abnormal Muscle Tone Behavior Diagnostic Device - Year 2

Led by: Elizabeth Hsiao-Wecksler from U of I, Steven Tippett from Bradley University and UICOMP, Randy Ewoldt from U of I and Dyveke Pratt, MD from OSF HealthCare

This project will create a novel robotic training simulator that will help learners differentiate between abnormal muscle tone behaviors, which can aid in diagnosing neurological conditions such as stroke, Parkinson’s disease, cerebral palsy or multiple sclerosis.
 

Personalized Avatars In Patient Portals

Led by: Thomas Huang from U of I and Ann Willemsen-Dunlap from OSF HealthCare

This project will create a 3D audio-visual avatar, controlled by health care providers and capable of showing appropriate emotions, for use in online patient portals to help patients understand their specific medical information, such as test results and medical guidance.
