NewsWatch

Premature Explanations

When it comes to 'Gulf War syndrome,' journalism pre-empts science

by Jeremy Torobin and Howard Fienberg
December 7, 1999

The flurry of media coverage of the latest ‘development’ in the quest to explain why up to 100,000 Gulf War veterans have felt chronically ill since their combat tours illustrates how differently the worlds of science and journalism operate.

Specifically, there is a conflict between scientists’ need to test theories and conduct experiments slowly, carefully and multiple times before drawing conclusions, and journalists’ need to satisfy editors and keep up with competitors by breaking important new developments that answer long-puzzling questions.

Last week, the authors of a study funded by the Pentagon and the Dallas-based Perot Foundation reported that some Gulf War veterans who have complained of illnesses ranging from fatigue and muscle pain to memory loss actually have signs of brain damage, most likely caused by exposure to toxic chemicals. "There’s hope, now that these guys have a disease," the lead researcher, Dr. James Fleckenstein, a professor of radiology at the University of Texas Southwestern Medical Center in Dallas, told Associated Press. "They can be believed – they’re not malingering, they’re not depressed, they’re not stressed. There’s a hope for treatment and there’s hope for being able to monitor the progress of the disease."

If this is indeed the case, the findings could prove very significant. To date, almost every notable scientific, epidemiological and medical study has failed to find evidence of an all-encompassing ‘Gulf War syndrome.’ In the latest study, Fleckenstein and his colleagues found that the 22 ill veterans they examined had levels of a brain chemical called NAA 10 to 25 percent lower than those of the 18 healthy veterans in the study, suggesting a loss of neurons in the brain stem and basal ganglia. The researchers then repeated their experiment with a sample of six more veterans.

But as most news organizations that covered this story were careful to point out, the original and follow-up samples were quite small, and the study has yet to be subjected to peer review in an academic journal – the traditional barometer of a study’s reliability. And though the authors speculate that a combination of exposure to chemical nerve gas, insect repellents and side effects from anti-nerve gas tablets could be to blame, they didn’t test for specific links in the study. Yet they claim that the work validates earlier research which found that some sick Gulf War veterans had a genetic predisposition to brain damage because they were born with low levels of the enzyme that breaks down chemical nerve gases like sarin.

Moreover, none of the major print and broadcast media outlets that covered the findings noted that three ‘Gulf War syndrome’ studies that Fleckenstein’s colleague and current study co-author Dr. Robert Haley had published in the prestigious Journal of the American Medical Association in 1997 were criticized for their scientific methods and conclusions.

But even though news organizations were generally clear about the potential flaws with the current study, it’s worth asking where the value is in flocking to cover findings that most scientists would agree have not been repeated often enough, nor with enough subjects, to come close to certainty. If science reporters and their editors hadn’t learned that the findings would be presented at the annual meeting of the Radiological Society of North America in Chicago, would they have bothered to go?

Perhaps journalists covering long-term science and health stories need to re-think their reflexive impulse to find closure, even when it isn’t there. Every time news organizations give credibility to studies that claim to ‘prove’ something, only to be ‘disproved’ by further study (the realization that there was "good" and "bad" cholesterol, for instance), they undermine public faith in both science and journalism.

Jeremy Torobin is associate editor of NewsWatch, and Howard Fienberg is a research analyst for the Statistical Assessment Service.
