MD Anderson Cancer Center is sitting on 23 petabytes of data, including more than 2 billion diagnostic radiology images, generated by its massive IT infrastructure. But Chris Belmont, vice president and CIO, isn't intimidated by the sheer volume of data; he's just wary of staring at it for too long.
“Our biggest fear when we decided to move into Big Data was that, like many healthcare organizations, we would have a two-year data ‘ingestion’ process where we’d keep thinking about that massive set of data, connect all our systems big and small, go get even more data from external sources, and then eventually offer our users an add-on tool and tell them to go at it,” Belmont says. “By the time we’d be done ingesting all that data, the time to change the game in terms of costs or population health would have already passed.”
MD Anderson, the Houston-based health system devoted to cancer care, is not the kind of organization to let time slip by.
The center has embarked on a large-scale analytics effort to better understand the myriad forms of cancer it treats and to establish the therapies and medication regimens to combat them. In the past six months, it has also put together a Big Data infrastructure focused on pulling insights from every nook and cranny of the enterprise to make it more efficient, whether procurement data, cost data, enterprise resource planning or any other data set out there. Central to that effort is speed: the center needs to be able to take ideas for data tools, assess and quantify their value, and then build them within a matter of weeks.