Clinical OMICS

SEP-OCT 2018


Watson Recommends Incorrect Cancer Treatments, System Training Questioned

IBM's Watson for Oncology is often paraded as a supercomputer that leverages artificial intelligence (AI) algorithms to spit out cancer treatment recommendations for busy clinicians. Now, however, internal documents unearthed by STAT show instances of "unsafe" or "incorrect" recommendations provided by Watson. For example, presentation slides showed Watson recommending a regimen of chemotherapy plus bevacizumab for a 65-year-old patient with lung cancer and signs of severe bleeding. The problem is bevacizumab can cause severe bleeding, making the drug contraindicated for this patient.

The documents largely pin the questionable recommendations on the way Watson was trained by doctors at Memorial Sloan Kettering Cancer Center and IBM engineers. Instead of being fed heaps of real patient data, Watson was served small quantities of synthetic cases.

"What we typically think of when we're thinking of AI applied to healthcare is wanting to translate Big Data into improved health outcomes of our patients," said Constance Lehman, M.D., Ph.D., professor of radiology at Harvard Medical School and chief of breast imaging at Massachusetts General Hospital. Big Data means having high-quality and high-quantity data, and "that's really not what has happened to date in the IBM Watson project."

Slides from a July 2017 presentation revealed that Watson had been trained with only hundreds of synthetic cases, ranging from 635 cases for lung cancer to 106 cases for ovarian cancer. Further, a document from June 27 showed that only one or two doctors trained the supercomputer with synthetic cases, generating concern that Watson was trained to recommend, or think, like those few doctors. AI-powered computers could be trained to mimic expert panels, evidence-based guidelines, or even the consensus at a single institution, all depending on the data fed in.

"There is no one formula that's best for evaluating and developing an AI product in healthcare. There are many different avenues that people are taking and they all have strengths and weaknesses and they are all important," said Lehman.

"Anyone that's ever participated in a multidisciplinary conference or a discussion of the best treatment for a patient understands these decisions are very complex. They pull in multiple different pieces of information about the patient," explained Lehman. "To try to have such a complex decision-making process managed by a computer—unless you have very big databases and you're very clear on what you're trying to predict—it becomes messy, and I think that is what has happened here."

IBM Watson Health did not respond to a request for comment on the internal documents. —Christina Bennett

[Photo caption: Abraham Schwarzberg, M.D., chief of oncology at Jupiter Medical Center in Palm Beach County, FL, reviews recommendations generated by IBM Watson for Oncology.]

[Photo caption: Constance Lehman, M.D., Ph.D., professor of radiology, Harvard Medical School.]

…not where the AI is making the clinical decisions. The AI is freeing up the clinician to spend more time thinking about their case." The field of cancer treatment changes so rapidly that in order to build an AI-based model and then prospectively test its ability to make clinical decisions, one would almost have to freeze the available cancer treatments in order to validate the model, Culot said.
"When some people talk about AI related to precision medicine, I think they might envision a world in which the computer is making all of the decisions for the clinician," said Freimuth. "I don't see that happening any time soon because while medicine is a very scientific-based practice, it is also a social practice. The introduction of any new technology into clinical care requires not only scientific validity but also socialization and acceptance by the clini- cal community. "For AI to become a more active part of clinical practice, clinicians need to have a better understanding of what AI can provide, both in terms of its strengths, as well as in terms of its limitations. And they would need to develop a trust for what that computer system is telling them."
