Software Forecast Bright with the Cloud
A few years ago, Dr. Michael Hansen and his NHLBI research group developed a new magnetic resonance imaging (MRI) framework called the Gadgetron. The software takes raw MRI signal data and quickly reconstructs it into images that clinicians can review to diagnose disease.
MRI is a relatively slow and motion-sensitive technique. Patients have to hold their breath, for example, and there is a lot of waiting to see whether the clinician captured a clear image or has to repeat the procedure.
The Gadgetron software, however, can shorten the wait considerably, and patients, oftentimes children, no longer have to hold their breath during the MRI scan.
Researchers at NIH, Children’s National Medical Center in Washington, D.C., and around the world routinely use Hansen’s framework, which is available as open-source software and runs on a wide variety of platforms.
Putting the Gadgetron to work, however, consumes a lot of IT resources, including up to 50 high-powered computers, Hansen said. The on-demand power offered through remote computing is what attracts the users of the framework.
“From a research perspective, where regular [usage rates] may change, the agility of the cloud is really valuable,” said Hansen, chief of the institute’s Laboratory of Imaging Technology and leader of a 5-person team of software engineers.
He recalled two “Aha!” moments in considering the cloud: one, when his team realized the extreme amount of computational power needed to run the Gadgetron, and two, when he weighed the feasibility, cost and time of building his own data center on campus.
“The ability to change directions quickly has real scientific value,” noted Hansen, who latched onto the cloud idea early.
“The strength of the cloud is not in our ability to transfer lots of data there,” he explained, “it is the amount of computing power available. In the case of MRI, a typical clinical scan may be just a few gigabytes, but the processing time is long and that is where cloud computing is useful.”
Applications like the Gadgetron could potentially be deployed in a more traditional cluster-computing facility, but users would have to reserve time and wait for their reconstruction jobs to complete. In a clinical environment, such a deployment strategy is not feasible.
“The bottom line is this,” said Hansen, “when you have the patient on the scanner you cannot wait for a batch job to complete. You need dedicated resources to process the data as the scan is taking place. That is costly since you would need a large amount of computing power per scanner. A cloud deployment allows us to both scale flexibly when there is demand—pay for what we need—and share resources easily with multiple scanners/sites.”
In the context of the total length of a patient study, he explained, the costs associated with clinical staff, nurses and anesthesiologists become an important part of the equation.
“Say a given scan sequence based on breath-holding takes 10 minutes,” Hansen pointed out. “With free-breathing scanning and Gadgetron reconstruction, we might be able to turn that into a 5-minute sequence. The cloud computing cost is say $10 per hour while we do the scanning, but the cost of just 5 minutes of extra scan time is much more than that. So one can make an argument that cloud computing is also cost-effective.”
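Hansen’s back-of-the-envelope argument can be sketched as simple arithmetic. The $10-per-hour cloud rate and the 10-minute versus 5-minute scan times come from his example; the per-minute scanner-room cost (staff, anesthesiology, facility) is a hypothetical figure used only for illustration, since the article gives no specific number for it:

```python
# Back-of-the-envelope cost comparison based on Hansen's example.
# CLOUD_RATE_PER_HOUR and the scan durations are from the quote;
# SCAN_ROOM_COST_PER_MIN is a made-up, illustrative value.

CLOUD_RATE_PER_HOUR = 10.0      # from Hansen's example
BREATH_HOLD_SCAN_MIN = 10       # conventional breath-hold sequence
FREE_BREATHING_SCAN_MIN = 5     # with Gadgetron reconstruction
SCAN_ROOM_COST_PER_MIN = 20.0   # hypothetical all-in clinical cost per minute

def total_cost(scan_minutes: float, use_cloud: bool) -> float:
    """Scan-room cost for the sequence, plus cloud compute if used."""
    cost = scan_minutes * SCAN_ROOM_COST_PER_MIN
    if use_cloud:
        cost += CLOUD_RATE_PER_HOUR * scan_minutes / 60.0
    return cost

conventional = total_cost(BREATH_HOLD_SCAN_MIN, use_cloud=False)      # 200.00
with_gadgetron = total_cost(FREE_BREATHING_SCAN_MIN, use_cloud=True)  # 100.83
print(f"conventional: ${conventional:.2f}, cloud-assisted: ${with_gadgetron:.2f}")
```

Under these assumed numbers, the cloud fee adds well under a dollar to the shorter scan, while halving the scan time cuts the far larger clinical cost, which is the crux of Hansen’s argument.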
Combining scientific innovation with IT cost efficiency is what NHLBI CIO Alastair Thomson and his fellow CIOs want to provide for all NIH researchers.
“That’s very much how I see my role here,” Thomson said. “I try to give the PIs the tools and environment they need and get out of their way.”
Satisfied that Gadgetron users won’t realize the work is being done on the cloud, Hansen’s group now is fine-tuning the application and looking toward expanding availability.
“We hope to scale up and support the project to a stage where vendors pick it up,” he said.