Researchers explore how and where clinical voice assistants can provide useful services for the medical field.

Imagine the convenience of having the medical equivalent of Amazon’s Alexa to enable clinicians to issue voice commands, such as “Refer this patient for physical therapy” or “Send that prescription to the pharmacy.”

Yaa Kumah-Crystal, M.D., an assistant professor of biomedical informatics at Vanderbilt University Medical Center, is working to make this advance in natural language processing possible by teaming with Epic to create a voice-activated assistant for medicine called Hey Epic!

“Our goal in this partnership with Epic is to see if verbal commands to the virtual assistant can help to optimize workflows,” Kumah-Crystal said. “The idea is for a physician to say, ‘order a cholesterol test’ as if they were speaking to a colleague and have the voice assistant understand them and do it.”

More than 150 other academic medical centers in the United States are participating in development and testing of the Hey Epic! system. Kumah-Crystal anticipates that making the voice assistant part of the daily workflow will help reduce the time providers spend in EHRs and increase their engagement with patients.

Eyes on the Patient

Employing natural language processing means that a physician’s eyes don’t have to be glued to a computer screen as they manipulate drop-down menus and click boxes to make their selections. The doctor’s interactions with the voice assistant can be fluid and natural and allow the physician to stay focused on the patient, according to Kumah-Crystal.

She recalled one patient who struggled with her weight.

“I suggested she see a nutritionist, and then told the computer, ‘Order a nutrition referral,’ without turning away from the patient or touching my keyboard,” Kumah-Crystal said.

“My patient realized: ‘The doctor heard me and is taking action to help.’ And she thanked me.”

Adaptability Is Key

Medical voice assistants also have the potential to serve patients, nurses, and other healthcare providers in various ways.

“An inpatient could ask for dimmer lights or for white noise or even get additional information about their diagnosis or their medication,” Kumah-Crystal said. “This could take some of the burden off nurses.”

The first tests of the Hey Epic! voice assistant are taking place in outpatient settings, which have fewer emergency situations than the hospital floor.

“It will be a huge win if we can get the system to be useful to clinicians when they’re seeing inpatients or are on-call,” Kumah-Crystal said.

Addressing Error Rates

Kumah-Crystal and her voice research team are examining success and error rates for interactions with the Hey Epic! assistant, including whether voice instructions are transmitted accurately and how well clinicians tolerate repeating instructions when clarification is needed.

“People have more patience with other people. If someone gets something wrong, you give them grace,” Kumah-Crystal said. “Right now, the machine doesn’t learn quickly enough from its mistakes, so it might repeatedly misinterpret a command and frustrate a user. Future functionality should allow users to train the system so it can get smarter with every interaction.”

Note-taking Simplified

Kumah-Crystal is also launching a pilot study to explore natural language processing for clinical notes. Twenty medical professionals at Vanderbilt from 13 specialties are participating. The study involves training the natural language processing system to accurately record different users’ voices and adapt to their preferred note-taking style, for instance, bulleted lists versus prose.

“Providers are particular about their notes. Notes are their work product and how they present themselves,” Kumah-Crystal said. “We’re trying to understand what they need and how to provide it.”

In some subspecialties, such as orthopedics, conversations tend to follow similar, predictable lines, she explained. In others, including endocrinology, conversations exhibit greater heterogeneity.

Will Doctors Use It?

“Provider acceptability will be the biggest issue. Will providers want to put their names on the notes the system creates?” she said.

Developing the note-taking system also raises issues of equity, she explained. “We have to ensure we can leverage this for all patients and all providers, regardless of accents, for example.”

Although they are growing in popularity, not everyone has embraced virtual digital assistants such as Amazon’s Alexa and Apple’s Siri in their daily lives, so it remains an open question how rapidly they will become routine in medical settings.

“When will we decide that these voice assistants are an essential technology for healthcare, one that everyone has to use?” Kumah-Crystal said. “If we get them working right, they should be super easy, even for people who don’t like technology. Voice assistants should be invisible, and people should be able to trust that using them will pay off.”

About the Expert

Yaa Aboagyewa Kumah-Crystal, M.D., M.P.H.

Yaa Aboagyewa Kumah-Crystal, M.D., M.P.H., is an assistant professor of biomedical informatics and pediatric endocrinology at Vanderbilt University Medical Center. She is a leader in developing artificial intelligence to enhance ways in which providers engage with electronic health records.