Monday, March 13, 2017

UCLA Medical Center leverages artificial intelligence to develop virtual radiology advisor

Interventional radiologists at the UCLA Medical Center are using artificial intelligence to develop a “chatbot” that automatically interacts with referring clinicians, providing them with evidence-based answers to frequently asked questions.

Currently, the AI-powered prototype is being tested by a small group of hospitalists, radiation oncologists and interventional radiologists at UCLA Medical Center. The machine learning application, which acts like a virtual radiology advisor, enables clinicians to quickly access useful information while freeing them to perform other duties and concentrate on patient care.

The information is delivered in several formats, including relevant websites, infographics, and subprograms within the application. And if the application determines that a question requires a human response, contact information for an actual interventional radiologist is provided. As clinicians use the application, which is focused on diagnostic and interventional radiology, it learns from each encounter, becoming smarter through deep learning techniques as it delivers evidence-based answers.

“The more it is used, the smarter it gets,” says Kevin Seals, MD, resident physician in radiology at UCLA Medical Center and the programmer of the application, who notes that the application’s user interface presents text boxes arranged to simulate a conversation over traditional SMS text messaging.

“It feels like you are texting with a human, but you are texting with artificial intelligence, so the responses are coming from a computer,” explains Seals, who has a background in engineering. “For clinicians in the hospital who are not radiologists, it is a way to communicate with a simulated radiologist.”

To build the application’s knowledge base, Seals says the researchers fed it more than 2,000 example data points simulating common questions that interventional radiologists receive during a consultation. He adds that natural language processing was implemented using the Watson Natural Language Classifier application programming interface, so that user inputs are understood and matched to data categories relevant to clinicians.

For instance, if a clinician asks whether placement of an inferior vena cava (IVC) filter—a medical device implanted by interventional radiologists—is appropriate for a specific patient, the question is matched to an IVC filter category and relevant information is returned.
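The classification step described above can be illustrated with a minimal, self-contained sketch. The actual system uses the Watson Natural Language Classifier service trained on the team’s 2,000+ examples; the stand-in below uses simple bag-of-words cosine similarity instead, and the training examples, category names (such as `ivc_filter`), and the escalate-to-a-human fallback are all hypothetical illustrations, not details from the UCLA project.

```python
import re
from collections import Counter
from math import sqrt

# Hypothetical (question, category) training pairs, standing in for the
# ~2,000 example data points the researchers used to train the classifier.
TRAINING_EXAMPLES = [
    ("Is an IVC filter appropriate for this patient?", "ivc_filter"),
    ("When should an inferior vena cava filter be retrieved?", "ivc_filter"),
    ("What are the indications for IVC filter placement?", "ivc_filter"),
    ("How should I prep a patient for a liver biopsy?", "biopsy_prep"),
    ("Is the patient NPO before a biopsy?", "biopsy_prep"),
]

def _vectorize(text):
    """Lowercased bag-of-words term counts."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def _cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(question, threshold=0.2):
    """Match a clinician's question to the closest known category.

    Falls back to "human_escalation" when no training example is similar
    enough -- mirroring how the application hands off to a real
    interventional radiologist when it cannot answer confidently.
    """
    vec = _vectorize(question)
    best_category, best_score = "human_escalation", 0.0
    for example, category in TRAINING_EXAMPLES:
        score = _cosine(vec, _vectorize(example))
        if score > best_score:
            best_category, best_score = category, score
    return best_category if best_score >= threshold else "human_escalation"
```

A question like “Should we place an IVC filter for this patient?” would land in the `ivc_filter` category, while an unrecognized question would fall through to the human-escalation path described above.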

Last week, Seals presented research on the application at the Society of Interventional Radiology’s 2017 Annual Scientific Meeting in Washington, D.C. While the machine learning application is focused on diagnostic and interventional radiology, he notes that it could eventually be applied to other medical specialties.

“It is getting really close to the point where we’d like to have a wider release across the UCLA Medical Center,” says Seals. “It works very well now. Roughly 90 percent of the time, it gets the answer right. The difference between it working really well and it working essentially perfectly is simply entering more information so that it becomes smarter.”
