Tag Archives: augmented reality

Closing in on Anatomic Replication

Written on January 5, 2017 at 8:00 am

The purpose of medical imaging from the very beginning was to find ways to look inside the body and learn what’s going on structurally and physiologically. To that end, physicians used x-rays or performed exploratory surgeries for decades to identify disease or injury. Then came ultrasound in the 1960s, which gave clinicians real-time images of internal body structures using sound waves. Imaging techniques progressed even further in the 1970s with the advent of CT scans and MRI, both of which are commonly used today.

Dr. Matthew Bramlet discusses the importance of anatomic replication at the AHA Heart Innovation Forum.

It’s my belief that 3D modeling will be the next critical tool used by physicians not only to diagnose, but also to improve surgical planning, patient outcomes, and the education of future clinicians. It has the power to produce essentially exact replications of soft tissue structures, improving understanding among doctors and patients alike. But first, it will take collaboration across the U.S. to make this a reality.

I recently spoke at the American Heart Association-Midwest Affiliate’s Heart Innovation Forum to advocate for imaging techniques that lead to anatomic replication. The Advanced Imaging and Modeling (AIM) team at Jump Simulation has come up with a semi-automated process to convert CT and MRI scans into 3D digital images that can be printed or integrated into virtual environments like augmented and virtual realities (AR and VR). What we’ve learned is that these nearly perfect 3D surrogates of anatomy can’t happen without working to create quality images from the start.
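
For readers curious about what that conversion step can look like in practice, here is a minimal, hypothetical sketch (not the AIM team’s actual pipeline): it assumes a segmented NIfTI mask stored in a placeholder file, heart_segmentation.nii.gz, and uses open-source tools (nibabel, scikit-image, trimesh) to extract a surface mesh suitable for 3D printing or for loading into a virtual environment.

```python
# Minimal sketch (not the AIM team's actual pipeline): convert a segmented
# CT/MRI volume into a surface mesh for 3D printing or a VR scene.
# "heart_segmentation.nii.gz" is a placeholder file name.
import nibabel as nib        # reads NIfTI medical image volumes
import numpy as np
from skimage import measure  # marching cubes surface extraction
import trimesh               # mesh handling, smoothing, and STL export

# Load a binary segmentation of the structure of interest (e.g., a heart).
mask_img = nib.load("heart_segmentation.nii.gz")
mask = (mask_img.get_fdata() > 0.5).astype(np.uint8)   # binarize the labels
spacing = mask_img.header.get_zooms()[:3]              # voxel size in mm

# Extract the tissue boundary, honoring voxel spacing so the model keeps
# real-world dimensions when printed.
verts, faces, _, _ = measure.marching_cubes(mask, level=0.5, spacing=spacing)

# Build a mesh, apply light smoothing, and export STL for the printer
# (swap the extension for .obj or .glb to target a VR engine instead).
mesh = trimesh.Trimesh(vertices=verts, faces=faces)
trimesh.smoothing.filter_laplacian(mesh, iterations=5)
mesh.export("heart_model.stl")
```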

Garbage In, Garbage Out

The old adage “garbage in, garbage out” applies directly to 3D modeling. For the last ten years, the standard across the nation has been to quickly produce images that may not be of the best quality but that lead to a diagnosis efficiently and productively. Printing or viewing these images in three dimensions, though, requires a little more time and effort and leads to discoveries we have never been able to make before.

There is a quality standard that must be met each step along the continuum of 3D modeling translation. If the image is poor – fail. If the segmentation is poor – fail. If the print is poor – fail. If the VR translation is poor – fail. The focus of our cardiovascular imaging efforts at OSF HealthCare is to generate the highest quality images we can attain.
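
To make that fail-at-any-step idea concrete, here is an illustrative sketch that expresses the translation continuum as a chain of quality gates; the stage names, processing functions, and checks are hypothetical placeholders, not our actual criteria.

```python
# Illustrative only: the "poor image - fail, poor segmentation - fail" rule
# expressed as fail-fast quality gates. Stage names, processing functions,
# and checks are hypothetical placeholders.
from typing import Any, Callable, List, Tuple

class QualityGateError(Exception):
    """Raised when a stage's output does not meet the quality standard."""

Stage = Tuple[str, Callable[[Any], Any], Callable[[Any], bool]]

def run_with_quality_gates(source_image: Any, stages: List[Stage]) -> Any:
    """Run each stage in order; stop immediately if any quality check fails."""
    artifact = source_image
    for name, process, meets_standard in stages:
        artifact = process(artifact)
        if not meets_standard(artifact):
            raise QualityGateError(f"{name} did not meet the quality standard")
    return artifact

# Example wiring with placeholder functions:
# model = run_with_quality_gates(scan, [
#     ("imaging", acquire_images, image_is_sharp),
#     ("segmentation", segment_anatomy, segmentation_is_accurate),
#     ("3D print / VR translation", export_model, output_is_usable),
# ])
```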

Most recently, we sent a quality-focused 3D heart digital file to the incredible engineers at Caterpillar’s additive manufacturing lab. They have a printer that can produce a heart in a material soft enough to be cut with a scalpel, allowing surgeons to effectively practice on a replica of a patient’s heart before surgery. The result was incredible. Not only were we able to rehearse the operation in advance, but we also saw anatomic detail like never before, prompting an entirely new set of possibilities for how 3D printing could improve patient care.

Making a Case for High-Quality Imaging Standards

There are many physicians around the U.S. who understand the impact 3D modeling can have on surgical planning, patient outcomes, and the education of future clinicians. In fact, a group of us are working with the National Institutes of Health and the American Heart Association to create accuracy and quality standards for the Jump Simulation-curated 3D Heart Library, an open-source digital repository of hearts with congenital defects on the NIH 3D Print Exchange. However, I recognize there are still some skeptics out there who don’t understand the value of this technology.

My experience with these models has been that they give surgeons a point of reference they haven’t had before, enabling them to make informed decisions before operating on patients. They make viewing anatomical images intuitive across all medical specialties. 3D models give patients and their families a better understanding of procedures they may have to undergo. They also allow educators to easily show physicians who are learning to diagnose or operate what the different types of congenital heart disease look like.

Physicians are busy, and it’s difficult to put the time and effort into higher-quality imaging. However, doing so leads to exact anatomic replications and, in my opinion, is the next big jump in medical imaging surrogacy. It will take clinicians who make medical decisions or plan surgeries seeing that impact firsthand for advocacy to spread through the clinical community.

Advancing the Visualization of Human Anatomy

Written on July 22, 2016 at 7:05 am

What if medical imaging and visualizing human anatomy were as easy, immersive, and intuitive as playing a video game? The latest project taken on by the Advanced Imaging and Modeling (AIM) program at Jump is working to achieve just that. AIM has nearly perfected the refined, manual process of converting traditional medical images (MRI and CT scans) into digital formats, allowing doctors and surgeons to interact with physical three-dimensional anatomic models for medical decision making, pre-surgical planning, and patient education.

In fact, hospitals from coast to coast are sending us medical images of congenital heart defects and other abnormal conditions for 3D printing. When medical decision making is involved, we provide this service for free in hopes of giving physicians and surgeons improved understanding of each patient’s clinical scenario. The ultimate goal is to utilize intuitive 3D tools to improve the clinical outcomes and standardize interpretations of anatomical images across all clinical specialties.

Now, AIM is taking digital formats of medical scans and plugging them into evolving visual technologies, eliminating the need for a physical model and shrinking the time it takes to view a complete 3D image. We believe virtual and augmented reality will revolutionize how radiologists and clinicians look at anatomy.

Physical 3D Models Vs. Digital Models

Just as there was no existing process for converting medical images into a digital format, there was also no software to translate imaging data into formats compatible with virtual environments like augmented and virtual reality. Our engineers at Jump have created a prototype program for viewing 3D images of hearts and other parts of the human anatomy on the HTC Vive, a virtual reality headset.
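
As a rough illustration of what such a translation step can involve (this is a generic sketch, not Jump’s prototype code), the snippet below simplifies a surface mesh and re-exports it as glTF/GLB, a format that game and VR engines commonly import; the file names and target face count are placeholders.

```python
# Generic sketch, not Jump's prototype: prepare a segmented 3D model for a
# VR engine (e.g., one driving an HTC Vive). File names and the face count
# are placeholders. Mesh decimation needs trimesh's optional extras.
import trimesh

mesh = trimesh.load("heart_model.stl")

# VR headsets render every frame twice (once per eye) at high frame rates,
# so heavy diagnostic meshes are usually simplified before import.
simplified = mesh.simplify_quadric_decimation(face_count=50_000)

# glTF/GLB is a common interchange format that game/VR engines load directly.
simplified.export("heart_model.glb")
```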

What we’ve found is that physical 3D representations of anatomy (3D printed content) can only be viewed in so many ways. With augmented and virtual reality, there’s more flexibility to immerse oneself in the entire image and expand viewing capabilities. It’s also more scalable, because I can send a video file much more quickly than I can ship a physical 3D model.

The AIM team recently ran test case scenarios with several different surgeons to collect feedback on their experiences and whether they found viewing anatomy with the Vive beneficial.

“As one who does not have a background in video games and this type of technology, the HTC was fairly easy to grasp,” said Dr. Karl Welke, a pediatric congenital heart surgeon with Children’s Hospital of Illinois. “It puts you into a more intimate relationship with the heart and you can manipulate it in ways that you can’t do with a typical medical scan. As a surgeon, I can understand what I’m going to see in the operating room in much greater detail than I could before.”

Other surgeons found there to be educational value in using the Vive to view human anatomy. “I could have surgical residents view virtual models of cancer and ask them to practice removing tumors before going into surgery,” said Dr. Richard Anderson, thoracic and cardiac surgeon. “This would give residents more experience before operating on a real patient and prevent errors in the future.”

The Future of Medical Imaging

We see a future state where clinicians are truly immersed inside this environment and can interact with it in ways they could not before. We also see the potential for bringing people together inside virtual and augmented realities for group learning and comprehension.

I’ve been doing pediatric congenital cardiac MRI for nearly nine years, and I strongly believe that viewing medical images with immersive, visual technologies is not a fad. It’s the direction of the future. We are putting tools in clinicians’ hands that they didn’t realize they needed and helping them fulfill their potential in medical decision making with higher efficiency and quality so that we can improve health care.

Mobile Technology in Healthcare Education

Written on June 30, 2016 at 7:33 am

Fig 1- Screenshot of iMed concept between Steve Jobs and Scott Barrows. (2003)

Steve Jobs never stopped searching for new ideas. He always wondered how technology could seamlessly interface with the “human experience.”

In 2002, I had an opportunity to collaborate and trade thoughts with Steve on this concept as it relates to the human body. We had a working title of “iMed” (Figure 1), and much of that early vision is now captured by mobile devices like the iPad Pro, wearable devices like the Apple Watch and Fitbit, and portable 3D virtual glasses and goggles (from high-end products like the HTC Vive to the inexpensive Google Cardboard and variations that use a smartphone).

Although medical apps (iOS) make up only 1.99% of total App Store downloads, that number expands to 13.67% of all iOS downloads when lifestyle and health/fitness apps are included. Interactivity, personalized health data availability, game-like engagement, and a focus on “edutainment” make the vision of technological integration more dynamic than ever.

New Learning Styles & Apps May Help Change Unhealthy Behaviors

Fig 2- Variety of mobile technology used in learning and interactivity

Mobile, interactive, and wearable technologies have altered how we learn, and the impact in healthcare is enormous. Social media, game-play platforms, collaborative e-community based learning, immersive programs, simulation apps, built-in biosensors, virtual reality, augmented reality, artificial intelligence, and peer-to-peer learning all have promise to help better educate patients and health professionals. (Figure 2)

Some of the most difficult challenges in healthcare are unhealthy behaviors: poor food choices, overeating, smoking, lack of exercise, and a lack of motivation to change. Centers for Disease Control and Prevention statistics from 2014 indicate that up to 40% of annual deaths from each of the five leading causes of death in the U.S. are attributable to “modifiable risk factors.” Habit formation develops in the basal ganglia of the brain, and once a habit is established, it may take weeks or months to change. So far, most programs and apps have failed to make much of an impact on changing bad habits.

The good news is that more effective, proven methods such as Prochaska’s stages of behavior change (pre-contemplative, contemplative, preparation, action, and maintenance) are now being integrated into personal mobile apps that merge personal lab results with up-to-the-minute lifestyle data in a dynamic network. Additionally, some apps include live contact with health counselors who assist in answering questions and providing personalized support. (Figure 3)

Fig 3- Wellness app (screenshot) we created for behavior change; includes patients’ biomarkers that can predict future disease.
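
As a purely hypothetical illustration of how an app might encode those stages alongside incoming lifestyle data, the sketch below defines the stages and a toy rule for advancing between them; it is not any particular product’s implementation.

```python
# Hypothetical sketch: one way an app could encode Prochaska's stages of
# change and combine them with logged lifestyle data. The advancement rule
# is a toy example, not any product's actual logic.
from dataclasses import dataclass
from enum import Enum, auto

class Stage(Enum):
    PRE_CONTEMPLATIVE = auto()
    CONTEMPLATIVE = auto()
    PREPARATION = auto()
    ACTION = auto()
    MAINTENANCE = auto()

@dataclass
class Participant:
    stage: Stage
    days_active: int  # e.g., consecutive days of logged exercise

    def advance_if_ready(self) -> None:
        # Toy rule: roughly six months of sustained action moves a person
        # to maintenance; real apps would weigh lab results, biomarkers,
        # and counselor input as well.
        if self.stage is Stage.ACTION and self.days_active >= 180:
            self.stage = Stage.MAINTENANCE
```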

The challenges are steep: while 10% of Americans report owning an activity tracker, 50% say they do not know how to use it, and 33% stop using it after six months. Better technology is warranted. As biosensors improve, wearable technology will provide not only valuable data but also real-time feedback to help patients better manage their health.

New Mobile Technology & Platforms in Healthcare Education

Jump Simulation is at the forefront of integrating many of these new technologies into medical education and innovation. Next year, the campuses of the University of Illinois College of Medicine will launch a new curriculum in human anatomy and construct a state-of-the-art anatomy lab in Peoria at UICOMP. The vision is to blend the best of traditional dissection with new technological resources.

One excellent resource may be Complete Anatomy, which recently won the coveted 2016 Apple Design Award in healthcare. This highly interactive mobile app combines much of the information and visualization contained in books like Gray’s Anatomy with the interactivity and endlessly editable views of an iPad. Complete Anatomy is designed for the iPad Pro and features numerous editing functions using the Apple Pencil that provide endless cross-section views, selectable anatomical structures and systems, collaborative interaction, and recording functions. It is one of the finest healthcare mobile apps on the market. (Figure 4)

Fig 4- Screenshot from Complete Anatomy iPad App

Another outstanding interactive medical app is Touch Surgery, which provides a guided, interactive experience that simulates specific surgical procedures step by step. Its ever-expanding library of surgical procedures across many specialty areas is an excellent approach that combines visual learning with experiential activity. (Figure 5)

Fig 5- Screenshot from Touch Surgery app (mobile device surgical simulator)

With the explosion of virtual reality (VR) and augmented reality (AR) capabilities, 3D interaction in healthcare provides immersive explorations inside the human body. When combined with patient 3D scans from CT and high-resolution MRI, clinicians can examine specific structures in greater depth and from virtually any angle. High-end goggles like the Oculus Rift and HTC Vive offer exquisite high-definition interaction, and even low-end devices like Google Cardboard use smartphone technology to deliver a “3D” experience.

Other interactive mobile programs and technologies are also used in health professions education, like Health Scholars, a Peoria-based educational series developed by nurses at OSF HealthCare in conjunction with SIMnext and CSE Software Inc. This interactive, self-contained learning experience for nurses is built on a mobile, tablet-based platform created to help standardize nurse education. It is a series of self-directed learning modules that take education from the classroom to mobile sites, including the hospital floor.

Mobile and app-based learning will continue to evolve, and researchers, faculty, and staff at Jump are actively involved in the exploration. These new technologies will impact education, improve patient care and safety, and reduce costs. Steve Jobs would approve, but he would also remind us that much more needs to be done. Fortunately, much of that work is in development at Jump.