Neuroscience, Special Forces, and Ethics at Yale

Last month, a proposal to establish a U.S. Special Operations Command (SOCOM) Center of Excellence for Operational Neuroscience at Yale University died a not-so-quiet death. The broad goal of “operational neuroscience” is to use research on the human brain and nervous system to protect and give tactical advantage to U.S. warfighters in the field. Crucial questions remain unanswered about the proposed center’s mission and the unusual circumstances surrounding its demise. But just as importantly, this episode brings much needed attention to the morally fraught and murky terrain in which partnerships between university researchers and national security agencies take shape.

A Brief Chronology

Let’s start with what transpired, according to the news reports and official press releases. In late January, the Yale Herald reported that the Department of Defense had awarded $1.8 million to Yale University’s School of Medicine for the creation of the new center under the direction of Yale psychiatrist Charles Morgan III. Descriptions of the proposed center’s work revolved around the teaching of Morgan’s interviewing techniques to U.S. Special Forces in order to improve their intelligence gathering. To heighten the soldiers’ cross-cultural awareness and sensitivity, Morgan reportedly intended to draw volunteer interviewees from New Haven’s immigrant communities.

Such details typically become public only after a university center has been formally established and its funding officially secured. In this case, however, the early news reports — which included statements from director-to-be Morgan — quickly led to a widely circulated Yale Daily News op-ed, an online petition, a Facebook page, and protests by students and local groups outraged over reports of Yale’s support for the military center and plans to treat immigrants as “guinea pigs.” According to ABC News/Univision, Morgan responded that he had been approached by the Defense Department to help “promote better relations between U.S. troops and the people whose villages they work in and around” — by teaching soldiers “better communication skills” and “how to ask non-leading questions, how to listen to what people are saying, how to understand them.”

A public affairs officer for U.S. SOCOM initially confirmed that it was providing funding for the center. Shortly thereafter, Yale University representatives issued a conflicting statement. Characterizing the center as “an educational and research center with a goal of promoting humane and culturally respectful interview practices among a limited number of members of the armed forces, including medics,” they emphasized that no formal proposal had been submitted for academic and ethical review. Yale also noted that volunteer interviewees “selected from diverse ethnic groups” would be protected by university oversight, and that public reports about the center were in part “based on speculation and incomplete information.” Three days later, SOCOM’s spokesperson retracted his previous statement, explaining that the information provided had been incorrect and that no funds for the center would be forthcoming. Yale confirmed that the center would not be established at the university. Two days later, SOCOM declared that, in fact, it had decided a year earlier not to fund Morgan’s proposal.

Ethical Risks of Operational Neuroscience

The name of the proposed center — the U.S. SOCOM Center of Excellence for Operational Neuroscience — deserves more attention and scrutiny than it has received thus far. The burgeoning interdisciplinary field of operational neuroscience — supported by hundreds of millions of dollars from the Department of Defense — is indisputably much larger and much more worrisome from an ethical perspective than the mere teaching of interview techniques and people skills would suggest. What makes this particular domain of scientific work so controversial is not only its explicit purpose of advancing military goals. The methods by which these ends are pursued are equally disquieting because they raise the specter of “mind control” and threaten our deeply held convictions about personhood and personal autonomy.

In a presentation to the intelligence community five years ago, program manager Amy Kruse from the Defense Advanced Research Projects Agency (DARPA) identified operational neuroscience as DARPA’s latest significant accomplishment, preceded by milestone projects that included the Stealth Fighter, ARPANET, GPS, and the Predator drone. National security interests in operational neuroscience encompass non-invasive, non-contact approaches for interacting with a person’s central and peripheral nervous systems; the use of sophisticated narratives to influence the neural mechanisms responsible for generating and maintaining collective action; applications of biotechnology to degrade enemy performance and artificially overwhelm cognitive capabilities; remote control of brain activity using ultrasound; indicators of individual differences in adaptability and resilience in extreme environments; the effects of sleep deprivation on performance and circadian rhythms; and neurophysiologic methods for measuring stress during military survival training.

Anthropologist Hugh Gusterson, bioethicist Jonathan Moreno, and other outspoken scholars have offered strong warnings about potential perils associated with the “militarization of neuroscience” and the proliferation of “neuroweapons.” Comparing the circumstances facing neuroscientists today with those faced by nuclear scientists during World War II, Gusterson has written, “We’ve seen this story before: The Pentagon takes an interest in a rapidly changing area of scientific knowledge, and the world is forever changed. And not for the better.” Neuroscientist Curtis Bell has called for colleagues to pledge that they will refrain from any research that applies neuroscience in ways that violate international law or human rights; he cites aggressive war and coercive interrogation methods as two examples.

Research Misapplied: SERE and “Enhanced Interrogation Techniques”

Some may argue that these concerns are overblown, but the risks associated with “dual use” research are well recognized and well documented. Even though a particular project may be designed to pursue outcomes that society recognizes as beneficial and worthy, the technologies or discoveries may still be susceptible to distressing misuse. As a government request for public comment recently highlighted, certain types of research conducted for legitimate purposes “can be reasonably anticipated to provide knowledge, information, products, or technologies that could be directly misapplied to pose a significant threat with broad potential consequences to public health and safety.”

Yale’s Morgan must surely be aware that operational neuroscience research can be used for purposes contrary to its purported intent — as this appears to be what happened with some of his own work. Morgan’s biographical sketch on the School of Medicine website refers to his research on the “psycho-neurobiology of resilience in elite soldiers” and “human performance under conditions of high stress.” Both of these topics are related to his extensive study of the effects of the military’s physically and psychologically grueling Survival, Evasion, Resistance, and Escape (SERE) training program. In SERE training, soldiers are subjected to extreme conditions in order to inoculate them against enemy interrogation and torture should they be captured by forces that don’t observe international laws prohibiting prisoner abuse. The techniques applied during the trainee’s simulated incarceration and mock interrogations include isolation, stress positions, sleep and food deprivation, loud noises, sexual humiliation, extreme temperatures, confinement in small spaces, and in some cases waterboarding.

Along with colleagues, Morgan has published a series of research articles examining the psychological, physiological, and biological effects of the SERE program. In summarizing key findings of this research, Morgan and his co-authors highlighted the following: the stress induced by SERE is within the range of real-world stress; SERE students recover normally and do not show negative effects from the training; and the mock interrogations do not produce lasting adverse reactions as measured by physiological and biological indicators. However, after reviewing these same studies, the authors of a Physicians for Human Rights report reached a starkly different conclusion: “SERE techniques, even when used in limited and controlled settings, produce harmful health effects on consenting soldier-subjects exposed to them.” They also emphasized that during the training many students experienced dissociative reactions and hormone level changes comparable to those seen in major surgery or actual combat; that the post-training assessments were short-term and insufficient to evaluate soldiers for PTSD and related disorders; and that the soldiers benefited from knowing that they could end their participation whenever they chose to do so.

SERE research like that conducted by Morgan and his colleagues was subsequently misused by the Bush Administration after the 9/11 terrorist attacks to illegitimately authorize the abuse and torture of national security detainees held at Guantanamo Bay, Bagram Air Base, and CIA “black sites.” The infamous “enhanced interrogation techniques” (EITs) were developed by former SERE psychologists — working for the CIA — who “reverse-engineered” the SERE interrogation tactics. But even more importantly here, a crucial 2002 Office of Legal Counsel “torture memo” asserted that the EITs did not cause lasting psychological harm, and it cited as evidence consultation with interrogation experts and outside psychologists, as well as a review of the “relevant literature” — which plausibly would have included Morgan’s own extensive work in the area. In short, this appears to be a striking and tragic instance in which operational neuroscience research, undertaken in a different context, was subsequently appropriated and misapplied for unconscionable purposes. It is worth adding that these prisoners were subjected to indefinite detention without trial and were not free to discontinue their torturous interrogations at will. Their torture sessions were also substantially longer, and the techniques were applied more frequently and with greater intensity, than anything Morgan’s research subjects experienced.

Morgan’s Deception Detection Research

Another significant area of operational neuroscience research for Morgan has been deception detection — that is, figuring out when someone isn’t being truthful during an interview or interrogation. According to his online CV, he has received Department of Defense funding totaling nearly $2 million for this work over the past several years. Research on this same topic reportedly also became an important focus of attention for several intelligence agencies — including the CIA — immediately after the 9/11 attacks. Befitting his expertise and stature in the field, Morgan has been involved in a variety of high-level initiatives designed to bring together university researchers and personnel from the defense and intelligence sectors.

For example, Morgan is among the listed attendees at a July 2003 invitation-only workshop on “The Science of Deception: Integration of Theory and Practice.” The event was co-hosted by the American Psychological Association (APA) and the RAND Corporation, with generous funding from the CIA. The participants discussed various scenarios, including one focused on law enforcement interrogation and debriefing, and another on intelligence gathering. They also explored specific research questions, such as which pharmacological agents affect truth-telling, and whether it might be possible to overwhelm a person’s senses so as to reduce his capacity to engage in deception during an interrogation. Psychologist Jeffrey Kaye has noted that, in a very unusual step, the APA has scrubbed most of the information about this workshop from its website.

In June 2004 Morgan was a participant at another invitation-only workshop — co-sponsored by the Department of Justice, the FBI, and the APA — titled “The Nature and Influence of Intuition in Law Enforcement: Integration of Theory and Practice.” Among the topics examined were the extent to which police officers, intelligence analysts, interrogators, and others can effectively use “intuition” in their work — for instance, in order to detect deception — and how such capabilities might be applied to counterterrorism efforts. The proceedings from this event identify Morgan as “Senior Research Scientist, Behavioral Science Staff, Central Intelligence Agency” — a professional affiliation that does not appear on his online CV.

Morgan is credited with a similar affiliation in the 2006 report “Educing Information,” published by the National Defense Intelligence College. As a member of the Government Experts Committee, Morgan is listed as working for the “Intelligence Technology Innovation Center,” an administrative unit that falls under the CIA. The foreword to the report describes the volume as “a primer on the ‘science and art’ of both interrogation and intelligence gathering.” Included is a chapter on deception detection by Morgan’s close research colleague, psychologist Gary Hazlett. One of Hazlett’s recommendations in the report is that “the United States adopt an aggressive, focused plan to support research and development of enhanced capabilities to validate information and the veracity of sources.” He also notes that the most troubling limitation of deception research thus far is the lack of “various Asian, Middle Eastern, Central and South American, or African populations” as research participants.

Responding to Morgan’s reported plans for a new center at Yale, local advocacy group Junta for Progressive Action issued a statement of concern last month. It noted: “As a city that has worked to establish itself as a welcoming and inclusive city for immigrants, the idea of targeting immigrants specifically for the purpose of identifying the distinction of how they lie is offensive, disrespectful and out of line with the values of New Haven.” In a recent newspaper report, Morgan called rumors that the proposed center at Yale would teach new interrogation techniques mere “hype and fantasy,” explaining that he instead “suggested to the Army that perhaps some training in people skills — how to talk to and listen to people might be helpful and create better relations.” Even assuming that this reassuring account is true, it’s certainly not unreasonable to question whether deception detection research and training might have been part of the proposed center’s future operational neuroscience agenda.

Classified and Unclassified Research on Campus

There are broader questions beyond those focused specifically on the uncertain details and background surrounding the not-to-be Center of Excellence for Operational Neuroscience at Yale. The unusual sequence of events that unfolded in New Haven last month should ideally serve as a springboard for open discussion of the opportunities and pitfalls associated with research partnerships between universities and national security agencies. To its credit, Yale University has a clear policy that explicitly prohibits its faculty from conducting secret or classified research:

“The University does not conduct or permit its faculty to conduct secret or classified research. This policy arises from concern about the impact of such restrictions on two of the University’s essential purposes: to impart knowledge and to enlarge humanity’s store of knowledge. Both are clearly inhibited when open publication, free discussion, or access to research are limited.”

But not all academic institutions have such stringent rules, which are necessary to promote full transparency, informed critique by other scholars and researchers, and constructive engagement beyond the walls of higher education. At the same time, it should be noted that, even at Yale, faculty members with voluntary appointments — Morgan’s official status at the university — are not required to disclose research activities that are not conducted on behalf of Yale.

Some of the most challenging ethical issues remain even when classified research is not conducted on university campuses. As psychologist Stephen Soldz has highlighted, in cases of unclassified research funded by national security agencies, the academic researchers are not necessarily informed about the totality of the projects to which they are contributing. He offers the example of findings from seemingly uncontroversial deception detection studies, which may ultimately become the basis for the capture, indefinite detention, and torturous interrogation of prisoners in undisclosed locations — well beyond the university researchers’ awareness. Soldz also warns that researchers may never know if their campus work has become “part of a vast secret effort to unlock the mystery of mind control and develop techniques for coercive interrogations, as happened to hundreds of behavioral scientists and others in the decades of the CIA’s MKULTRA and other Cold War behavioral science initiatives.” These risks are further exacerbated for psychologists, psychiatrists, and other health professionals, for whom a “do no harm” ethic intrinsically conflicts with research projects aimed at identifying and destroying those who are considered adversaries.

Next Steps

There are applications of operational neuroscience — such as improved prosthetic limbs for injured veterans and more effective treatments for victims of brain injury — that are compelling in their apparent value and their promotion of human welfare. But other applications raise profound concerns, especially where the defining goals and priorities of a university and its medical researchers and scientists diverge from those of national security and intelligence operatives. Community health sciences professor Michael Siegel — a graduate of Yale’s School of Medicine — emphasized this point when he was interviewed on Democracy Now! last month. Siegel noted: “The practice of medicine was designed to improve people’s health, and the school of medicine should not be taking part in either training or research that is primarily designed to enhance military objectives.”

In this context it’s worthwhile to recall exactly whom Morgan envisioned as the trainees for his proposed “people skills” interview project at the medical school: U.S. Special Forces, the highly skilled soldiers often assigned the military’s most difficult and dangerous missions. These forces — over 60,000 strong including military personnel and civilians — are now covertly deployed around the globe. Journalists Dana Priest and William Arkin have described them as “America’s secret army.” Their counterterrorism operations include intelligence-gathering missions and lethal raids — not only in Afghanistan but also in countries where the United States is not at war. They’ve been authorized to keep “kill lists” of individuals who can be assassinated rather than captured, and some have conducted brutal interrogations at secret detention sites. The Army refers to its Special Forces as the “most specialized experts in unconventional warfare.”

At this point, signs clearly indicate that a U.S. SOCOM Center of Excellence for Operational Neuroscience will not be coming to Yale. But it would be a mistake to assume that this research — and the very considerable national security sector funding it attracts — will not find another home. This is why the current controversy should not be dismissed without fuller engagement and discussion, among all stakeholders, of the pressing practical and ethical considerations it raises — before a similar project appears on another campus or resurfaces in reconfigured form in New Haven. The prospect of all defense-related neuroscience research being conducted clandestinely by government or corporate entities — away from the public and expert oversight that universities can offer — is far from reassuring, so difficult issues like this must be tackled head-on.

One valuable next step would be an open forum at Yale. Dr. Morgan could have the opportunity to describe in greater detail the nature of his deception detection work and related projects — including his ongoing research in New Haven about which Yale recently claimed it was unaware. Other distinguished scientists, ethicists, and human rights experts could provide their commentaries. And community members, students, faculty, and administrators could offer their own perspectives and pose questions. Such an event would not likely produce consensus, but the sharing of information, the free expression of differing viewpoints, and informed debate are among the most vital functions of a university. Pending further developments, there are very good reasons to be concerned — and confused — about the recent twists and turns surrounding the proposed center at Yale. Many of the most critical questions still await answers.