IBM Sends Jeopardy Supercomputer to Medical School
IBM’s Watson may have trounced former champion Ken Jennings in Jeopardy, but now it’s facing an even bigger challenge: proving that it can make money for its creators.
It’s well on the way. Last week, IBM said that it was working with Citi to “explore how the Watson technology could help improve and simplify the banking experience,” but for the past six months, Big Blue has also teamed up with health insurer WellPoint to turn Watson into a machine that can support the doctors of the world.
IBM isn’t saying too much about what Watson will be doing at Citi. The two companies plan to build “the first consumer banking applications” for the supercomputer. WellPoint is a bit more forthcoming. In December, the health insurer said that it was working with Cedars-Sinai Hospital’s Samuel Oschin Comprehensive Cancer Institute to help physicians treat cancer patients.
Does Watson Choke?
There was one other thing we had to ask IBM scientist David Gondek: What were the chances that Watson would choke on-camera in the midst of Jeopardy?
That was a possibility because Watson wasn’t connected to the internet. If Jeopardy had picked a category in an area the quiz show had never traditionally covered, the IBM team could have struck out, even though they’d crammed Watson with things like Shakespeare, the Bible, song lyrics, and scientific textbooks. Watson is smart enough not to hit the buzzer unless it thinks it’s got a good shot at the correct answer, but it could have gone silent. “Our real fear was that we’d just have one of these boards where it was just a bunch of topics that Watson didn’t have in its corpus,” he says, “so we just wouldn’t end up answering any questions.”
That never happened and Watson clobbered the two former champions it was pitted against. But could things have gone otherwise? Has Gondek seen a Watson-killer category since Watson appeared on Jeopardy?
He doesn’t really know. “I’ve probably seen about 200 games’ worth of Jeopardy questions and studied them,” he says. “So after the game I needed to take a break.”
It turns out that training Watson to help doctors and financial services customers has a lot in common with cramming for Jeopardy. In both cases, the computer has to do two things that machines have traditionally flubbed. First off, there’s natural language processing. That means figuring out what the question actually is. Then there’s machine learning: understanding what facts are important for which question. In other words, you give Watson questions, it gives you answers.
“If we can parse the clue and understand what the question is asking about, and we can parse this text in our documents and understand what our text is talking about then we can try to match,” says David Gondek, a scientist with IBM who has worked on Watson for the past five years.
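The parse-and-match step Gondek describes can be illustrated with a toy sketch: reduce the question and each candidate passage to a bag of content words, then rank passages by how many words they share. (Watson’s real pipeline is vastly richer; the names, corpus, and stopword list below are invented for illustration.)

```python
# Toy sketch of parse-then-match question answering. This is NOT
# IBM's pipeline -- it only illustrates the idea of parsing both the
# clue and the corpus text, then matching them.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "was", "of", "in", "this", "by"}

def parse(text):
    """Lowercase the text, split it into words, and drop stopwords."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS)

def match_score(question, passage):
    """Count the content words the question and passage share."""
    q, p = parse(question), parse(passage)
    return sum((q & p).values())  # Counter intersection: shared words

# A tiny stand-in "corpus" of documents.
corpus = {
    "Hamlet": "Hamlet is a tragedy by William Shakespeare about the Prince of Denmark.",
    "Genesis": "Genesis is the first book of the Bible.",
}

question = "This Shakespeare tragedy features a prince of Denmark."
best = max(corpus, key=lambda title: match_score(question, corpus[title]))
# best is "Hamlet" -- it shares the most content words with the clue
```

Real systems replace the word-overlap score with hundreds of weighted evidence features, but the shape of the problem is the same.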
But there are differences too. When Watson helps out Cedars-Sinai doctors, it’s not engaging in a vicious death-match for Trivia-God bragging rights. It’s a collaboration.
“It’s a very different situation. Because in Jeopardy we were kind of constrained in that you get a question, you get an answer, and that’s it,” says Gondek. “In the medical case, we think more about interacting with a medical professional…. that means that it’s not just a question and answer.”
The processing is different, too. The Jeopardy system was trained to answer quiz show questions, where the answers are pretty much black and white. Feed Watson a copy of the Bible, and it’s pretty much good to go on Bible trivia questions. In business and medicine, there are a lot of different sources, and some of them are considered more important than others.
So IBM is working with doctors to ensure that it has the right data sources and that the different sources it’s using — medical journals, papers and textbooks — are each given the proper weight.
“We don’t want to come back with a single answer,” says Gondek. “We want to come back with a set of answers and our justification behind them.
“Watson will give you the confidence and say: ‘I’m 90 percent sure of this, or I’m only 10 percent sure of this.’ You can immediately see how that is useful in medicine or in finance. The other thing we can do is we can tell the user why is this answer here. What kinds of evidence do we use, what facts did we use, what were we sure about, and what were we not sure about? And were the documents we used from very reliable sources or from less-reliable sources?”
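The output shape Gondek is describing — a ranked set of answers, each carrying a confidence and the evidence behind it — might look something like the following sketch. The scores, diagnoses, and source weights here are entirely made up for illustration.

```python
# Sketch of confidence-ranked answers with supporting evidence,
# rather than a single verdict. Raw scores and evidence labels are
# invented; a real system would derive them from weighted sources.

def rank_answers(candidates):
    """Turn raw evidence scores into normalized confidences and sort.

    candidates: list of (answer, raw_score, evidence_list) tuples.
    Returns a list of dicts, highest confidence first.
    """
    total = sum(score for _, score, _ in candidates) or 1.0
    ranked = [
        {"answer": ans,
         "confidence": round(score / total, 2),
         "evidence": evidence}
        for ans, score, evidence in candidates
    ]
    return sorted(ranked, key=lambda r: r["confidence"], reverse=True)

results = rank_answers([
    ("Diagnosis A", 9.0, ["peer-reviewed journal (high reliability)"]),
    ("Diagnosis B", 1.0, ["web forum post (low reliability)"]),
])
# results[0]: Diagnosis A with confidence 0.9, plus its evidence list
```

Surfacing the evidence alongside each confidence is what lets a doctor (or banker) check the machine’s reasoning instead of taking a single answer on faith.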
That kind of context is hard to get nowadays using internet sources. Last summer, when Gondek was exploring the idea of using Watson for medical diagnoses, he ran into a San Diego doctor at a health conference in California. “He said nowadays when he is dealing with patients, he spends the first 10 minutes talking to them about all the diagnoses they found by doing searches,” he says. “It’s just human nature. They will focus in on the most severe or life-threatening one. They will have a few mild symptoms and will decide that they have some horrible cancer. And so he has to talk them down.”
Gondek’s dream is that Watson could somehow help doctors and patients get a better context on their healthcare — and help financial service customers get the same kind of weighted context on their investments. “What if something like Watson could get you more involved with your health?”
Emergency Medicine Resident Physician Iltifat Husain believes that Watson could never replace a doctor, but he says that it could be turned into a useful medical triage system, where patients tell Watson their symptoms and it figures out whether they need to come into the hospital for treatment.
There’s a lot that Watson isn’t going to be able to do, no matter how hard it crams, says Husain, who works at the Wake Forest University School of Medicine in Winston-Salem, North Carolina. He’s talking about the very human act of getting a read on a patient: How does their voice change when describing symptoms? How do they sit? What do their eyes say? “One of the first things I learned about medicine once I actually started practicing as a physician was medical texts only provide you with 50 percent of the education you need,” he says. “The rest you learn on the job.”
WellPoint says its medical learning application is still a year away. In the meantime, Gondek and the IBM researchers are putting Watson through its own machine learning bootcamp.
“It’s a little like sending Watson to medical school. We don’t just push a button and instantly Watson can offer medical advice,” Gondek says. “We need experts to show us what’s important in the domain. We need experts to come up with test scenarios that Watson can learn from.”