[Photo: Harvard's Dr. Roger Pitman at the STEP forum]
The recent Staff Training in Extramural Programs (STEP) forum, "Emerging Ethical Issues in Neuroscience Research," gave the crowd in Lister Hill Auditorium a chance to join in, and hash out, neuroethics history in the making.
"We are all learning about the brain, but the mind remains an enigma," said moderator Dr. Ruth Fischbach, director and cofounder of the Center for Bioethics at Columbia University.
Neuroethics is a specialty in progress, an emerging field with critical implications for clinical research and treatment, medical education and human rights. The field is so new that scholars and scientists are still working out a definition, variously cited as a brain-based philosophy; a discipline that queries the social issues of disease; and a bioethical subspecialty confronting new technology.
Neuroethics, said Fischbach, encounters several burgeoning fields at once: psychopharmacology; brain imaging; regenerative neurology, including
stem cell therapy; brain/computer interfaces
such as robotics; neurogenetics; and deep brain stimulation (DBS), which uses electrodes to stimulate brain tissue.
Consider two patients, both treated with DBS. The first, with Parkinson's disease, showed remarkable improvement. The second, with depression, became suicidal until the electrodes were adjusted by a mere tenth of a centimeter. Luckily, it did the trick. But what if it hadn't?
Any treatment can be risky, with unintended consequences, and when the locus of therapy is the brain, said Fischbach, we must be circumspect:
"It's not what you can do; it's what you should do." She continued: "We may have to rein in the technological imperative," the notion that technology has a life of its own, and that whatever can be done therefore must be done. This is where neuroethics can, and should, break new ground and establish new guidelines, "especially to protect vulnerable patients," said Fischbach.
Panelists tackled four distinct, contentious topics. The first speaker, Dr. Laurie Zoloth of Northwestern University, posed the central questions that neuroscience shares with philosophy:
How are we human? Do we have free will? What is our responsibility to one another?
These questions, so intrinsic to our being, reflect the need for policy regulations, Zoloth said. Some interventions can alter the self, so for consent to be informed, we will need to ask, "Which 'self' is consenting?" We also need to ask about justice: could neurotechnologies
make certain injustices in the health care system worse by allowing enhancement for some and not for others? And what of religious issues? Neuroscience touches on "the nature of the soul" and on the question of moral choice and free will. Are moral gestures for good or evil choices just neuronal firings?
[Photo: Participants in the recent STEP forum on neuroethics included Dr. Laurie Zoloth (l) and Dr. Judy Illes.]
"Bioethics begins after the Holocaust, at the trials
in Nuremberg, when life-and-death decisions about human worth were based merely on biological
characteristics," explained Zoloth, "and we thus fear the idea that biology determines the worth of the self. Such an idea could be carried
wholesale into neuroscience." Could neurologically
applied technologies be used to undermine
our resistance to evil? Could they be used for war, by a techno-state? If the brain is reduced to a parts list, can we put it back together?
Stanford's Dr. Judy Illes, exploring the intersection
of neuroimaging with ethical behavior, called her approach "very pragmatic," and, given that the Neuroethics Society was formed only a year ago, urged an early and complete integration
of ethics into neuroscience.
"Are we scholars or reformers? Both," Illes said. She cautioned against nonclinical applications such as functional imaging (fMRI) used in lie detection, and warned of products hawked to parents fearful that their children aren't excelling
in school. Such commercial uses create tension
between the academy and industry: "We really are in a brave neuro world and we need to pay attention," she said. The culture of imaging has changed, since "there is meteoric growth. The press tracks the growth. And the press is not as cautious as we'd like in statements like 'the brain can't lie' and 'brain scans reveal how you think.'"
She described how, after the Virginia Tech rampage,
a reporter asked her: "Why didn't somebody
use fMRI to detect brain pathology in this killer?" Illes continued: "Clearly, this technology
is not ready for that type of application, but in the public image it is, and I got the sense from the reporter that we had failed her. It was important to defuse that, and I think that is one of our obligations as we move forward."
Harvard Medical School's Dr. Roger Pitman described his own brush with the media concerning
his work with post-traumatic stress disorder (PTSD) patients. He was surprised, he said, at how his research was characterized. "The science is preliminary," he stressed. "We aren't even sure it will be used. I wish we had 5 percent of the power the press says we have."
Pitman used the drug propranolol "as a secondary
[PTSD] preventive measure" on 41 subjects
who had experienced serious trauma. The question was whether the drug, given within 6 hours of the precipitating incidents, would prevent PTSD.
After 3 months, those who'd received it had significantly less physiological evidence of PTSD than the control subjects, who had gotten a placebo.
"Yes, it does open ethical issues," said Pitman, "but it is not going to wipe out memory so that soldiers can go out and kill civilians." It doesn't cause amnesia, he emphasized; rather, it blocks stress hormones from potentiating memories. He also described ongoing studies using propranolol
that showed the drug weakening the conditioned fear response, with a greater effect in females than in males.
During Q&A, Pitman said that many institutional
review boards had looked at whether
harm was done by making patients relive the trauma, since certain studies used scripts reprising the original events. "The great majority
of patients do better," he said, although about 1 percent do worse.
The last speaker, the University of Pennsylvania's
Dr. Jonathan Moreno, explored the interrelation
of national security with neuroscience. "I'm a DARPA booster because I'm a realist," said Moreno, who has written extensively about the Defense Advanced Research Projects Agency,
the main research and development arm of the Department of Defense. He outlined the CIA's post-WWII use of LSD in human subjects and other "PsyOps"; DARPA studies using honeybees
to detect explosives; as well as efforts to keep soldiers alert for more than 4 hours in combat situations. Citing NIMH studies on using drugs such as modafinil as amphetamine alternatives, he said, "Just saying no to DARPA funding is not the right answer."
Other questions tackled issues of interrogation, cognitive enhancement and how to prevent brain-mapping data from being used for self-incrimination.
Nobody, it seemed, wanted to wake up inside The Matrix.
Philosopher Zoloth brought it all home: "Decisions
that used to be medical matters are now matters for Congress to debate. This is a profound
change: now all science is political."