Using Technology to Aid Decision-Making?

Watch Out for Hidden Bias.

In healthcare, we frequently create and implement new tools, technologies, and processes to make patient care safer and more efficient and effective. But the creation of such improvements is only half the equation for success. A piece of equipment or software may be perfectly designed to overcome an existing challenge, but its success depends on how people interact with it.

James Schreiner, a PhD graduate of the University of Illinois College of Engineering at Urbana-Champaign and recipient of Jump ARCHES funding, examined the human choices and judgment involved in using a patient re-admittance risk assessment tool that OSF HealthCare developed for its patient case managers.

Case managers create individual patient discharge plans to maximize the patient’s recovery and prevent hospital readmission. Some patients require detailed communication plans, technology support, and follow-up from medical staff. Others can go home with minimal or no instructions or follow-up, depending on the factors affecting the complexity of their case. The patient re-admittance tool used by OSF aggregates factors such as the patient’s age, history, home support structure, and reason for hospital stay, and presents them to the case manager in an organized and prioritized manner.
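To make the idea of aggregating and prioritizing risk factors concrete, here is a minimal sketch in Python. The factor names and weights are purely illustrative assumptions for this example; they are not OSF’s actual model or scoring method.

```python
# Hypothetical sketch of how a readmission-risk tool might aggregate
# patient factors and present them in priority order to a case manager.
# Factor names and weights below are illustrative assumptions only.

def prioritize_factors(patient):
    """Score each factor present for a patient and return the factors
    ordered from highest to lowest assumed readmission risk."""
    # Assumed weights; a real tool would derive these from outcome data.
    weights = {
        "age_over_65": 2.0,
        "prior_readmission": 3.0,
        "lives_alone": 1.5,
        "chronic_condition": 2.5,
    }
    # Keep only the factors that apply to this patient.
    scored = [(factor, weights[factor])
              for factor, present in patient.items()
              if present and factor in weights]
    # Highest-weight factors first, so the riskiest items surface on top.
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

patient = {
    "age_over_65": True,
    "prior_readmission": True,
    "lives_alone": False,
    "chronic_condition": True,
}
print(prioritize_factors(patient))
```

The key design point is that the tool does not decide anything; it only orders information so the case manager can weigh the most significant factors first.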

Creating Unintended Bias

The patient discharge planning tool was designed to complement, rather than replace, the case manager’s judgment. However, Schreiner’s research found a correlation between case managers’ level of experience and their likelihood of accepting the tool’s discharge plan recommendations. Inexperienced case managers relied heavily on the tool, while more senior case managers used it as a supplement to their experience and judgment. In other words, inexperienced managers were biasing their plans toward the tool’s recommendations and were considerably less likely to deviate from them, regardless of the specifics of the case. This unintended bias could result in a less-than-optimal design of a patient’s discharge plan.

Countering the Bias

To counter this bias, Schreiner suggests applying established methods from cognitive psychology and organizational dynamics to help de-bias the use of these tools. For example, simply informing case managers that the tool could, and probably would, subconsciously bias them begins to counteract that bias.

OSF is currently assessing the use of simulation to address the issues Schreiner identified. Simulation is a proven and useful education platform for training case managers. It would allow learners to experience the technology in a safe environment where the situations case managers encounter can be varied, and where they receive feedback on their decision-making, helping them gain experience and develop judgment.

Simulation also allows for group learning, which utilizes the collective knowledge and experiences of the learners and facilitators to enhance the overall understanding and best practices of technology utilization.

Ultimately, all decision-support technologies we employ can unintentionally bias the user. Training, especially in realistic simulations, can help provide the mental model needed to employ better judgment for using, rather than relying on, new technologies.

Categories: Applied Research, Clinical Simulation, Culture of Safety, Engineering, Health Care Engineering Systems Center (HCESC), Jump ARCHES, Research, University of Illinois (U of I), University of Illinois College of Medicine at Peoria (UICOMP)