Today’s Scientists, Tomorrow’s Leaders

Photo: Ernie Branson
What, exactly, goes on in NIH’s 200 intramural research labs? How are they different from university labs and why should a young investigator start a career here?
“NIH is an amazing opportunity,” NIH director Dr. Francis Collins told 29 early career researchers visiting NIH for the 2016 Future Research Leaders Conference (FRLC).
The event is sponsored by NIH’s inaugural chief officer for scientific workforce diversity, Dr. Hannah Valantine. Her office invited the 29 scientists from across the country to Bethesda for two important reasons: to introduce them to the unique work environment at NIH and to connect NIH intramural scientists with great talent.
In opening remarks, Valantine emphasized the human side of science. “Connecting people with common scientific interests is as important as the science itself,” she said.
And that is what the FRLC did. The intense 3-day experience featured talks from top leadership, including Collins, Valantine, IRP head Dr. Michael Gottesman and NHLBI director Dr. Gary Gibbons. Participants gave research talks, presented posters and met one-on-one with NIH scientific directors and branch chiefs—about 100 meetings were held. A half-day seminar on the NIH grant process and meetings with NIH program and review staff rounded out the schedule.
Most conference participants are from groups underrepresented in the sciences. Most are past recipients of NIH-funded diversity supplements, which enable NIH-funded researchers at universities to bring diverse students and postdocs into their labs for research experiences and mentoring.
The FRLC is intentionally embedded within the NIH Research Festival, giving attendees the opportunity to fully engage with this annual celebration of NIH science.
Said attendee Esther Obeng, from Harvard Medical School, “I’ve met a lot of wonderful people…People that had been here for years and told me why they haven’t left…what they loved about NIH.”—Alison Davis
Plaque Commemorates Research Animals

Photo: Ernie Branson
The NIH intramural animal research advisory committee and the IC animal program directors dedicated a plaque that commemorates “research animals and the NIH animal care and use community that have contributed to our exceptional biomedical research advances” at a Sept. 16 ceremony on the Clinical Center’s south lawn. NIH director Dr. Francis Collins presented remarks at the program, part of Research Festival.
“I hope, going forward, when the next breakthrough happens here on the NIH campus—some big development that has big promise for human health—that people will walk by this plaque and recognize how we got there,” Collins said. “The animals that we depended on are also part of that celebration...we can look back on their contribution and their sacrifice and be truly grateful.”
Collins understands how important animals are to research. His laboratory studies Hutchinson-Gilford progeria syndrome, an exceedingly rare progressive disorder that causes children to age rapidly. Progeria affects roughly 250 children worldwide.
After discovering the cause of the syndrome, he began studying potential treatments. With help from the “wonderful” animal care staff at NHGRI, including its retired animal program director Dr. Shelley Hoogstraten-Miller, Collins’ lab developed mouse models of progeria. One therapeutic showed enough promise in mouse models to be tested in a clinical trial. Results from the trial suggest the drug extends the lives of patients by 4 or 5 years. A second complementary therapy is currently being tested.
“All of this is dependent on that mouse model and all of those animals that have been involved in this research,” Collins said.

Photo: Ernie Branson
He thanked staff involved in animal care at NIH. “I’m impressed and touched every day by what I see,” he concluded. “A lot of people don’t know about it, but I know about it. Thank you to all of you.”
The ceremony also featured remarks by NIH deputy director for intramural research Dr. Michael Gottesman, NEI senior investigator Dr. Rachel Caspi and Hoogstraten-Miller. Dr. Terri Clark, director of the NIH Office of Animal Care and Use, hosted the ceremony.
Clark credited Hoogstraten-Miller for coming up with the idea to permanently recognize the contributions of research animals. The plaque, which features the NIH Office of Animal Care and Use logo along with the words, “With recognition and gratitude to the research animals and the NIH animal care and use community that have contributed to our biomedical research advances,” is affixed atop a tree-shaded stone near the CC’s south entry. Two benches surround the mulched area where the memorial sits.—Eric Bock
Symposium on Open Data, Prize
The NIH Big Data to Knowledge (BD2K) Initiative and the Office of the Associate Director for Data Science will hold an Open Data Science Symposium on Dec. 1. Big data is an underutilized resource for innovation and discovery in biomedical research, and NIH is committed to unleashing its full potential by making it an open and easily accessible resource. The symposium will feature discussions with leaders in big data, open science and biomedical research while also showcasing the finalists of the Open Science Prize, a worldwide competition to harness the innovative power of open data.
Speakers include: NIH director Dr. Francis Collins; former NIH and NCI director Dr. Harold Varmus; John Wilbanks, Sage Bionetworks chief commons officer; Peter Goodhand, Global Alliance for Genomics and Health executive director; Niklas Blomberg, founding director of Elixir; and Robert Kiley, head of digital services at the Wellcome Library.
The six Open Science Prize finalists will demonstrate prototypes of big data tools currently under development that utilize publicly available datasets in environmental, epidemiological and health sciences to improve the health of individuals worldwide.
The event will be held from 8:30 a.m. to 4 p.m. at the Bethesda North Marriott Conference Center, 5701 Marinelli Rd., North Bethesda. For registration and agenda, visit http://event.capconcorp.com/wp/bd2k-odss/.
NIH Body Weight Planner Helps Users Set Weight Goals

For those who would like to lose weight, but don’t know where to start, the NIH Body Weight Planner might be a good place to begin.
Developed by NIDDK’s Dr. Kevin Hall, the Body Weight Planner forecasts how body weight changes when people alter their diet and exercise habits. It accounts for how people of varying weights, diets and activity levels respond when they try to reduce their weight.
The planner asks for a person’s height, weight and age. It also asks for his or her current physical activity level, goal weight and the time frame for reaching it. The planner then calculates how many calories a person must eat to maintain current weight, and to reach and then maintain the goal weight.
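The planner itself is built on Hall’s validated dynamic model of human metabolism, but a much simpler static sketch conveys the flavor of those inputs and outputs. The Mifflin-St Jeor equation and the roughly-7,700-kcal-per-kilogram rule of thumb below are generic stand-ins for illustration, not the planner’s actual math:

```python
# Rough, static illustration only: the NIH Body Weight Planner uses
# Dr. Hall's dynamic model of metabolic adaptation. The Mifflin-St Jeor
# equation and the ~7,700-kcal/kg rule of thumb here are generic
# stand-ins, not the planner's actual calculations.

def bmr_mifflin_st_jeor(weight_kg, height_cm, age_yr, male=True):
    """Resting energy expenditure in kcal/day (Mifflin-St Jeor)."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + (5 if male else -161)

def goal_calories(weight_kg, height_cm, age_yr, male, activity_factor,
                  goal_weight_kg, days):
    """Daily intake to reach goal_weight_kg in `days` (static estimate)."""
    maintenance = activity_factor * bmr_mifflin_st_jeor(
        weight_kg, height_cm, age_yr, male)
    daily_deficit = (weight_kg - goal_weight_kg) * 7700 / days
    return maintenance - daily_deficit

# A 90-kg, 180-cm, 40-year-old lightly active man (factor 1.375)
# aiming for 85 kg in 180 days:
print(round(goal_calories(90, 180, 40, True, 1.375, 85, 180)))  # → 2302
```

A static estimate like this ignores the metabolic slowdown that accompanies weight loss, which is exactly the effect Hall’s dynamic model was built to capture.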
For information, see https://www.supertracker.usda.gov/bwp/index.html.
Confessions of a Heavy User

Photo: Carla Garnett
Dr. Maria Mills, a postdoc in NHLBI’s Laboratory of Single Molecule Biophysics, and her colleagues in senior investigator Dr. Keir Neuman’s group, are looking at how a protein interacts with a small molecule.
“We wanted to see whether this molecule changes the structure and dynamics of the protein,” she explained. “We do an atomistic complex simulation where every part of the protein and every part of the molecule—the physical equation of them—are all explicitly calculated…It’s very computationally expensive. It’s a massive amount of information that’s being calculated very quickly over and over again.”
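The expense Mills describes comes from evaluating every pairwise atomic interaction at every timestep, so the cost grows roughly with the square of the atom count. This bare-bones Lennard-Jones sketch illustrates that scaling; it is a toy, not the group’s production simulation code:

```python
# Toy illustration of why all-atom simulation is computationally
# expensive: each timestep requires an O(N^2) pairwise force sum.
# Illustrative only, not the production MD code used by the lab.
import numpy as np

def lj_forces(pos, epsilon=1.0, sigma=1.0):
    """Lennard-Jones forces on N atoms via an O(N^2) pairwise sum."""
    forces = np.zeros_like(pos)
    n = len(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = r @ r
            s6 = (sigma * sigma / d2) ** 3
            mag = 24 * epsilon * (2 * s6 * s6 - s6) / d2
            forces[i] += mag * r   # Newton's third law:
            forces[j] -= mag * r   # equal and opposite
    return forces

rng = np.random.default_rng(0)
atoms = rng.uniform(0, 10, size=(100, 3))  # 100 atoms; solvated proteins: 100,000+
f = lj_forces(atoms)
print(f.shape)  # one 3-D force vector per atom
```

At 100 atoms this loop runs about 5,000 pair evaluations per step; at 100,000 atoms it would be roughly 5 billion, repeated millions of timesteps, which is why such jobs consume hundreds of processors for days.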
Earlier this year, CIO Alastair Thomson, one of the folks who manage IT resources at NHLBI, approached Mills with an idea for a 2-month pilot project using Biowulf, a 20,000+ processor Linux cluster in NIH’s high-performance computing facility.
“They asked me to conduct a head-to-head comparison of my computational work on Biowulf and on the cloud to see if there were any advantages in using the cloud,” she said. “In terms of computational time, the performance was similar, but the cloud did save me time.”
With the cloud, she said, “you design a computer system that you want at the time and it’s immediately available to you. With Biowulf, we have a huge group of people sharing processors on a certain architecture and you have to wait until the processors that you need are free. Some days it’s quick. Some days you might be waiting a day or 2 days before your job’s run. You just have to wait it out. And since I tend to use a lot of processors for what I do—256 to 512 processors—sometimes I wait quite a while. And honestly, I sometimes feel like I’m hogging [the processors] a bit, because my jobs can take 60 hours.”
The work Mills is doing is considered basic research without a direct translational application. However, the specific protein system she’s analyzing is “involved in repairing problems that happen when DNA becomes too tangled, leading in humans to severe genetic defects, premature aging and susceptibility to cancer,” she said. “We’re not specifically looking for cures or treatment. We’re just trying to understand the system and maybe down the line people can use the information to help people who have these disorders.”
The lab uses what’s called a “magnetic tweezers manipulation technique,” Mills said. “We take a small piece of DNA, attach a magnetic bead to it and we can use the tweezers to manipulate the DNA. The proteins [we’re studying] reorganize DNA, unfold it, cut it and wrap it around itself. We can use these tweezers to measure what’s happening to DNA in the presence of these proteins. We can’t really see what’s actually happening to the proteins. That’s where the simulation comes in. It gives us a way to visualize the protein molecules since we can’t watch that directly.
“The magnetic tweezers experiments we’ve done indicate that the molecule we are looking at in the simulations stimulates the protein’s activity,” she said. “Hence our desire to understand how it affects the protein dynamics.”
Mills’ part of the process makes her a heavy processor user, which in turn made her an ideal candidate for testing a third-party IT provider. For the pilot, she worked closely with cloud provider technicians to design the best computer structure and write scripts to make things move faster.
“At first it took some optimizing, but that was just sort of the learning curve,” she concluded. “The cloud was really nice, because it was resources temporarily dedicated to me, so I didn’t have to worry about waiting for them and I didn’t have to worry about other people not having access to resources they needed. It was a virtual computer cluster that I built and I’m using, and when I’m done with it I just shut it off.”
Mills hopes to publish results from her pilot soon.
Software Forecast Bright with the Cloud

Photo: Carla Garnett
A few years ago, Dr. Michael Hansen and his NHLBI research group developed a new magnetic resonance imaging (MRI) framework called the Gadgetron. The software takes an image’s raw signal data and quickly reconstructs it so clinicians can review images and diagnose disease.
MRI is a relatively slow and motion-sensitive technique. Patients have to hold their breath, for example, and there’s a lot of waiting to see whether the clinician got a clear image or has to repeat the procedure.
The Gadgetron software, however, can be used to shorten the wait time considerably and patients—oftentimes children—no longer have to hold their breath during the MRI scan.
Researchers at NIH, Children’s National Medical Center in Washington, D.C., and around the world routinely use Hansen’s software framework, which is available as open source software, capable of running on a wide variety of platforms.
Putting the Gadgetron to work, however, consumes a lot of IT resources, including up to 50 high-powered computers, Hansen said. The on-demand power offered through remote computing is what attracts the users of the framework.
“From a research perspective, where regular [usage rates] may change, the agility of the cloud is really valuable,” said Hansen, chief of the institute’s Laboratory of Imaging Technology and leader of a 5-person team of software engineers.
He recalled two “Aha!” moments in considering the cloud: One, when his team realized the extreme amounts of computational power they would need to run the Gadgetron, and two, when he figured the feasibility, cost and time of assembling his own data center on campus.
“The ability to change directions quickly has real scientific value,” noted Hansen, who latched onto the cloud idea early.
“The strength of the cloud is not in our ability to transfer lots of data there,” he explained, “it is the amount of computing power available. In the case of MRI, a typical clinical scan may be just a few gigabytes, but the processing time is long and that is where cloud computing is useful.”
Applications like the Gadgetron could potentially be deployed in a more traditional cluster-computing facility, but users would have to reserve time and wait for their reconstruction jobs to complete. In a clinical environment, such a deployment strategy is not feasible.
“The bottom line is this,” said Hansen, “when you have the patient on the scanner you cannot wait for a batch job to complete. You need dedicated resources to process the data as the scan is taking place. That is costly since you would need a large amount of computing power per scanner. A cloud deployment allows us to both scale flexibly when there is demand—pay for what we need—and share resources easily with multiple scanners/sites.”
In the context of the total length of a patient study, he explained, the costs associated with clinical staff, nurses and anesthesiologists become an important part of the equation.
“Say a given scan sequence based on breath-holding takes 10 minutes,” Hansen pointed out. “With free-breathing scanning and Gadgetron reconstruction, we might be able to turn that into a 5-minute sequence. The cloud computing cost is, say, $10 per hour while we do the scanning, but the cost of just 5 minutes of extra scan time is much more than that. So one can make an argument that cloud computing is also cost-effective.”
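Hansen’s argument reduces to simple arithmetic. The $10/hour cloud rate is his own example figure; the per-minute cost of scanner time plus clinical staff below is an assumed illustrative number, not one from the article:

```python
# Back-of-envelope version of Hansen's cost argument. The $10/hour cloud
# rate is his example; the per-minute scanner/staff cost is an assumed
# illustrative figure, not a number from the article.
CLOUD_RATE_PER_HOUR = 10.0     # Hansen's example cloud compute rate
SCAN_COST_PER_MINUTE = 15.0    # assumption: MRI suite, nurses, anesthesia

minutes_saved = 10 - 5         # breath-hold sequence vs. free-breathing
cloud_cost = CLOUD_RATE_PER_HOUR               # even a full billed hour
time_savings = minutes_saved * SCAN_COST_PER_MINUTE

print(f"cloud: ${cloud_cost:.2f} vs. scan time saved: ${time_savings:.2f}")
```

Even under conservative assumptions, an hour of cloud compute costs an order of magnitude less than the scanner and staff time the faster sequence frees up.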
Combining scientific innovation with IT cost efficiency is what NHLBI CIO Alastair Thomson and his fellow CIOs want to provide for all NIH researchers.
“That’s very much how I see my role here,” Thomson said. “I try to give the PIs the tools and environment they need and get out of their way.”
Satisfied that Gadgetron users won’t realize the work is being done on the cloud, Hansen’s group now is fine-tuning the application and looking toward expanding availability.
“We hope to scale up and support the project to a stage where vendors pick it up,” he said.
Freeze Frame
How the Method Works

The contrast is striking. The cryo-EM microscope that the Subramaniam team uses for high-resolution structural work, a Titan Krios manufactured by FEI Co., stands more than 13 feet high and weighs in excess of 2,000 pounds. It’s designed to study individual molecules and even visualize single atoms—material far too tiny to be seen without high-tech help.
The first step in generating a structure, says Subramaniam, is suspending the chosen molecules in a solution and placing them on a fine mesh screen called a “grid.” Next, the grid is plunged into liquid ethane to form a thin film of ice with the proteins in it. The film is loaded into a cassette, which goes into a cryo-capsule device that is then inserted into the Titan. Finally, the microscope takes thousands of pictures, delivered every 60 seconds or so.
Because of the way electrons can interact with biological molecules like proteins, it can be complicated to get a very clear signal in any single image. In addition, all those images are in 2D. So to get a 3D picture of the protein at atomic resolution, “we [computationally] assemble all of those pictures together,” Subramaniam explains. When there’s enough information and everything is done correctly, “out comes the structure of the protein” at atomic resolution.
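The statistical idea behind combining thousands of images is that random noise averages away while the signal accumulates, so signal-to-noise improves roughly as the square root of the image count. Real single-particle reconstruction also aligns particles, corrects the microscope’s contrast transfer function and back-projects into 3-D; this toy sketch shows only the averaging principle:

```python
# Why thousands of images beat one: averaging N noisy copies shrinks the
# noise by ~sqrt(N) while the signal stays put. A toy sketch of the
# averaging principle only -- real cryo-EM reconstruction also handles
# particle alignment, CTF correction and 3-D back-projection.
import numpy as np

rng = np.random.default_rng(42)
signal = np.zeros((32, 32))
signal[12:20, 12:20] = 1.0                  # a toy "particle"

def noisy_image(sigma=3.0):
    """One simulated micrograph: signal buried in heavy Gaussian noise."""
    return signal + rng.normal(0.0, sigma, signal.shape)

def rms_error(image):
    return np.sqrt(np.mean((image - signal) ** 2))

single = rms_error(noisy_image())
averaged = rms_error(np.mean([noisy_image() for _ in range(1000)], axis=0))
print(f"RMS noise, 1 image: {single:.2f}; 1,000-image average: {averaged:.3f}")
```

With noise three times stronger than the signal, a single image is unreadable, yet the 1,000-image average recovers the particle cleanly, which is the same leverage the Titan’s thousands of micrographs provide.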
Cryo-EM is still mostly uncharted territory, however.

Photo: Veronica Falconieri
“We need to look at much larger complexes and be able to look at them at much higher resolution,” Subramaniam says. “The atomic resolution structures we have been posting are still of fairly well-defined complexes. But when you go to more complex systems, we have only achieved lower resolutions.”
The key for the future will be to capture all of the movement and dynamics of protein complexes, without losing resolution. The group will use methods they pioneered over the last decade to image whole cells at high resolution to better understand how these complexes function in the context of the whole cell.
So far, the NIH team is doing what it can to stay at the forefront of a very crowded and growing area of research.
“It’s just taken off,” Subramaniam enthuses. “There’s exponential growth in the field. Until 2013…it was really a niche field.” And now? “There’s a real buzz across many disciplines that this could be a very powerful addition to the biologist’s toolkit.”
PMI Stands Up, Takes First Steps
The NIH plan to implement the Precision Medicine Initiative Cohort Program drew wide support and applause at the ACD’s Dec. 10 meeting.
Ushering in a “new era of medicine through research, technology and policies,” the Precision Medicine Initiative aims to “empower patients, researchers and providers to work together toward development of individualized treatments.” The initiative has been on a fast track to fruition since President Obama announced it in his State of the Union address in January 2015.
NIH deputy director for science, outreach and policy Dr. Kathy Hudson, who served as cochair of the ACD’s working group on PMI, tag-teamed on a progress report with the cohort program’s interim director, NCCIH director Dr. Josephine Briggs.

Summarizing the working group’s recommendations from September, Hudson pointed out the importance of keeping NIH’s portion of PMI in context with the much broader scope of both the initiative and the global pursuit of precision medicine in general. PMI has two components at NIH—one at NCI and the cohort program. The latter was the focus of discussion at this ACD meeting.
PMI’s initial task, Hudson explained, is to assemble a generational unit, or “cohort,” of one million or more volunteers who reflect the diversity of the U.S., with a strong focus on underrepresented communities. The cohort will be longitudinal, with continuing interactions and opportunities built in for recontacting participants for secondary studies. Initial recruitment of a million participants is estimated to take 3 to 4 years using two methods: direct volunteer enrollment and via partnership with health care provider organizations.
Hudson said a driving issue that planners kept top of mind while devising the PMI cohort was how to maintain participant engagement and enthusiasm over such a long period of time.
“So what are the questions we want to be able to answer and how can we make sure that there is value for the research establishment, for health care providers, for the participants in the cohort and most important for the individual volunteers, early in the short term, in the medium term and in the long term?” Hudson asked, recalling deliberations to develop the cohort. “There’s a whole range of scientific opportunities that cover the waterfront—some that we can realize in the short term and some in the long term that we can’t even imagine today.”
She urged people who are interested in keeping up with cohort progress to use the dedicated web site www.nih.gov/precision-medicine-initiative-cohort-program and to follow #PMINetwork on Twitter.
Explaining the implementation steps already under way, Briggs described how the program will operate, its governance, enrollment targets and early budgetary expectations, announcement of the first funding opportunities and a proposed timeline.
“Translating the [myriad ideas and recommendations for the cohort] into a true, effective plan has been the work of many people,” she pointed out. “What has made the complex project manageable is the relatively high and meticulous level of detail from the working group report. Many key elements were clearly specified…building on the success of the BRAIN Initiative.”
ACD member Dr. Harlan Krumholz of Yale School of Medicine heartily endorsed PMI’s fledgling steps.
“This is an extraordinary accomplishment,” he said. “There are a couple of revolutionary—not merely evolutionary—things about this that go far beyond the idea of being able to accumulate a million [participants]. One is this commitment to secure data fluidity and access, really going all in on the idea of open science…You are extolling values that will have ripple effects. The example and the principles you’re setting forth from the very outset of this, I believe, are going to have a fundamental impact on the way we see science going forward.
“The second revolutionary thing,” he continued, “is that you call them ‘participants.’ They’re not ‘patients’ and they’re not ‘subjects.’ The notion of partnership with the people who are going to be involved in this…It took leadership from the top—Francis Collins, I commend you—and it took courage to take this leap” from the traditional structure of medical science wherein scientists conduct the research and consult only each other about the results to this “adaptive, agile model of PMI,” wherein the power and knowledge that stem from science are shared from the beginning with the people.
Gilman Named Clinical Center CEO

Dr. James K. Gilman, a retired major general in the U.S. Army, has been named inaugural chief executive officer of the Clinical Center. He is a cardiologist and highly decorated leader with experience in commanding operations of numerous hospital systems. As CEO, he will oversee day-to-day operations and management of the CC, focusing on setting a high bar for patient safety and quality of care, including the development of new hospital operations policies.
“His medical expertise and military leadership will serve the Clinical Center well as it continues to strive for world-class patient care and research excellence,” said NIH director Dr. Francis Collins, who made the appointment.
Gilman served 35 years in the Army, most recently as commanding general of the U.S. Army Medical Research and Materiel Command, Ft. Detrick, Md. He led several Army hospitals during his long career—Brooke Army Medical Center, Ft. Sam Houston, Tex.; Walter Reed Health Care System, Washington, D.C.; and Bassett Army Community Hospital, Ft. Wainwright, Alaska. He also served as director of health policy and services responsible for all aspects of professional activities and health care policy in the Office of the Surgeon General, U.S. Army Medical Command. Gilman has received numerous military awards and decorations, among them the Distinguished Service Medal, Legion of Merit and Meritorious Service Medal.
He holds a bachelor of science in biological engineering from Rose-Hulman Institute of Technology, Terre Haute, Ind., and received his M.D. from Indiana University School of Medicine. He completed a residency in internal medicine and a fellowship in cardiovascular diseases at Brooke Army Medical Center, where he later became chief of cardiology and was responsible for training cardiology fellows. He is board-certified in internal medicine with a subspecialty in cardiovascular disease. Following his retirement from the Army in 2013, Gilman served as executive director of Johns Hopkins Military & Veterans Institute in Baltimore until June 2016.
'Cool' Factor a Bonus
Explore with NIH's 3-D Print Exchange

Photo: Carla Garnett
In 2007, Dr. Darrell Hurt, head of the computational biology section in NIAID’s Bioinformatics and Computational Biosciences Branch (BCBB), began experimenting with 3-D printing for molecular visualization. He started by taking raw molecular structure data and putting it in a 3-D-printable form.
“There’s so much more you can learn when you have it in your hands, rather than looking at it on a 2-D screen or even with 3-D glasses,” explains Dr. Meghan McCarthy, program lead for BCBB’s 3-D Printing and Biovisualization Program.
BCBB provides data analysis consultation and custom software development services to NIAID researchers and wanted to make 3-D printing more accessible to the public. As a solution, the branch created the NIH 3-D Print Exchange (https://3dprint.nih.gov), an online resource where people can find models and share their own, along with web tools that lower the skill barrier.
“We developed automated tools that are free. Anyone around the globe can use them,” says McCarthy, who manages the exchange under BCBB’s larger program for 3-D printing and biovisualization.
The exchange is owned and maintained by NIAID, but was initially created in partnership with NICHD and the National Library of Medicine with support from the 2013 HHS Ignite and 2014 HHS Ventures initiatives. Team members were recognized with an HHS Innovates award in 2015.
BCBB has processed more than 100 scientific 3-D printing requests over the last 10 years, reports BCBB Scientific Visualization Specialist James Tyrwhitt-Drake. “However, each request may include multiple different models or copies of the same model,” he says.

“Color printing of scientific models requires a considerable amount of preparation and processing time, so we primarily limit the service to requests from NIAID staff that have scientific utility. Production is limited by the physical complexity of the model, consumption of materials and occasionally troubleshooting mechanical or software issues with the printer.”
Over the years, as technology advanced, BCBB’s capabilities and responsibilities expanded to include other 3-D technologies, including virtual and augmented reality.
Tyrwhitt-Drake, along with team member and structural biologist Dr. Phil Cruz, also produces the virtual reality models that allow investigators to walk inside a molecule and have a look around its structure.
Use of virtual reality has “really taken off, especially in the last year,” McCarthy points out. “Anyone we’ve ever put into the 3-D goggles—even people who have worked on a particular molecule for 10 or 15 years as part of their career—says they see something different, that they didn’t see before.”
Both 3-D printing and VR are useful applications for learning, sharing and education, as well as research, McCarthy notes. In addition to exploring their work from a unique perspective, researchers also seem to enjoy the “cool” factor.
“Most people think of gaming and entertainment when they think of VR,” McCarthy concludes, “but we really like that we’ve been able to bring this scientific value to it.”