NIH Record
Vol. LXVII, No. 11
May 22, 2015
NIH Invests in Technology Upgrade to Support Research
(This is the first of a multi-part series exploring how technology enables NIH’s mission.)

With more than 1,600 laboratories conducting basic and clinical research and more than 10,000 new patients in 2014, the Clinical Center supports a high volume of traffic, with researchers, health care providers, patients and visitors occupying almost 2.6 million square feet. Whether a person is visiting NIH for a day or a week, or is a permanent staff member, the ability to communicate medical information or complex research data is imperative.

As of February 2015, visitors and staff can share information faster and more reliably. The Clinical Center’s NIH network connection is now at 10 times its former capacity.

According to Andrea Norris, CIT director and NIH chief information officer, NIH had not been able to keep pace in maintaining the technology infrastructure that transports data across the NIH network and to the rest of the world.

“In numerous conversations with the scientific community, I learned more about their IT-related challenges,” said Norris. “Slow network speeds topped the list.”

Similar to the challenge facing the U.S. transportation infrastructure, funding for IT-related initiatives had not kept pace with rapid changes in biomedical and big data fields. NIH’s data highway system (or NIH network) was in many cases overcrowded, unreliable and aging.


“I asked a [principal investigator] how he sent large data files to his colleagues,” she said. “He pointed to his shoes and replied ‘Sneaker-net.’ Modernizing the NIH network quickly became a high priority.”

NIH leaders recognized the effort required critical investment.

“Complex multi-year initiatives such as modernizing the network and adding to our high-performance computing capabilities require strong leadership commitment and support,” Norris said. “We are so fortunate NIH leadership appreciates the importance of a technology infrastructure that supports our scientific mission.”

Working with the scientific community to discuss current expectations and future technology needs, NIH developed a plan to improve the network's security, reliability and transfer capabilities, followed by an expansion of computational and storage capacity to meet NIH's high-performance computing (HPC) demands.

In 2014, the first key piece of the infrastructure, the NIH network core, was upgraded to 10 times its former capacity, to more than 100 Gbps. The modernized NIH network provides a "science DMZ" designed for high-performance applications and big data, along with a faster, 100 Gbps-capable network connection from NIH to Internet2 (a community of U.S. and international researchers, academics and scientists who collaborate via networking technologies).

As of May 2015, 17 facilities have been upgraded, including major data throughput centers such as Bldgs. 30, 35, 49 and NCI's 9609 Medical Center Drive. The effort included improvements to the Clinical Center's main network uplink (from 10 to 100 Gbps) and to the building's local area network (from 1 to 10 Gbps).
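A little back-of-the-envelope arithmetic puts those figures in perspective. The sketch below estimates ideal transfer times for a large dataset over the old and new uplinks; the 1 TB file size and the 80 percent link-efficiency factor are illustrative assumptions, not NIH figures:

```python
def transfer_time_seconds(size_bytes, link_gbps, efficiency=0.8):
    """Estimate ideal transfer time as size / effective throughput.

    efficiency is an assumed fraction of line rate actually achieved
    (protocol overhead, congestion); real-world values vary widely.
    """
    bits = size_bytes * 8
    effective_bps = link_gbps * 1e9 * efficiency
    return bits / effective_bps

# Hypothetical 1 TB dataset (e.g., raw sequencing output)
size = 1e12  # bytes

t_old = transfer_time_seconds(size, 10)   # former 10 Gbps uplink
t_new = transfer_time_seconds(size, 100)  # upgraded 100 Gbps uplink

print(f"10 Gbps:  {t_old / 60:.1f} minutes")   # roughly 16.7 minutes
print(f"100 Gbps: {t_new / 60:.1f} minutes")   # roughly 1.7 minutes
```

Under these assumptions a transfer that once took most of a coffee break finishes in under two minutes, which is the practical difference a 10x uplink makes for researchers moving big data.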

To meet biomedical demands, a concerted effort began in February 2014 to augment NIH's HPC infrastructure. To date, NIH's HPC capabilities have increased by 45 percent in computing capacity and 25 percent in data storage capacity, and the path between the HPC infrastructure and the NIH network has been upgraded to 100 Gbps.

“The impact has already been realized,” said Norris. “Early reports are that data transfer times are dropping by a factor of 4, which allows our research community to focus on their mission-critical work instead of workarounds related to the network.”
