STOP CASSINI EARTH FLYBY: Probability and Plutonium Don't Mix
"To us, probability is the very guide of life." --Joseph Butler, English bishop and theologian (1736)

It seems as if NASA has adopted Bishop Butler's statement as its motto, first for the space shuttle Challenger and now for Cassini, a space mission to Saturn that will carry 72 pounds of plutonium-238 in its batteries. Cassini, scheduled for launch Monday, is to fly by Earth in August 1999 at an altitude of 500 kilometers and use Earth's gravitational force to fling itself to Saturn. This is the most dangerous phase of the mission: if control is lost during the flyby, the craft could crash into the atmosphere (or Earth) and widely scatter its radioactive load.

Regardless of its questionable safety, and precisely because of the uniqueness of its mission, the Cassini venture begs the moral question: Why should thousands of innocent people who won't gain anything from it, such as those in southern Africa and Madagascar, be subjected to health risks just for the intellectual curiosity of a few zealous scientists or the miscalculation of a handful of "techies" in the West?

NASA has estimated the odds of a serious accident--an "Earth impact"--at eight in 10 million. However, catastrophic accidents have an eerie way of defying wishful expectations and unrealistic probabilities. Cassini's probability of failure is reminiscent of NASA's estimate of the failure rate of Challenger's solid-fuel rocket booster, which the agency also argued was very low. NASA has dismissed "spacecraft software errors, erroneous ground commands and navigation design errors" as "insignificant contributors" to an Earth impact, conveniently setting aside the potential roles of these and other factors in such a failure. The same mind-set guided NASA managers and decision-makers on their ill-fated path toward the Challenger launch. Challenger's explosion in January 1986 proved that these failure rates and NASA's other calculated reliability figures were questionable.
The late Nobel physicist Richard Feynman, in his seminal "minority report" appended to the presidential commission's report on the Challenger accident, analyzed the underlying causes of these miscalculations, concluding: "It would appear that, for whatever purpose, be it for internal or external consumption, the management of NASA exaggerates the reliability of its product, to the point of fantasy." He suggested that "[we] make recommendations to ensure that NASA officials deal in a world of reality in understanding technological weaknesses and imperfections well enough to be actively trying to eliminate them."

Unfortunately, the same mind-set is still at work, and NASA has not yet grasped the worrisome weakness of "probabilistic risk assessment" techniques when it comes to human and organizational errors. These techniques, despite their appeal to bureaucrats and their seeming political correctness, suffer from inherent theoretical imperfections. Furthermore, the research community around the world has yet to settle on a standard approach for quantifying the human, management and organizational factors that influence systems' safety.

Several organizational units--at NASA, at the Jet Propulsion Laboratory, at the launch pad at Cape Canaveral and at tracking stations around the world--have a direct or indirect impact on the control of Cassini while it is in space. Errors could arise from a lack of synchronization between these units, as well as from flaws in software, ground commands and navigation design. NASA has not sensibly addressed these issues.

NASA points to its successful experience (e.g., Pathfinder, Galileo and Voyager) as assurance that it can avoid a Cassini accident. Feynman warned of exactly this reasoning: "the argument that the same risk was flown before without failure is often accepted as an argument for the safety of accepting it again."
However, research has shown that we cannot rely solely on operators' experience to prevent future accidents. The way NASA "manages and organizes" space systems should be given priority equal to (or at times greater than) technical and hardware-related considerations. Human and organizational factors are complex, and unlike hardware or equipment problems, they admit no convenient, quick fix.

We can and should be able to operate our space systems safely; we have no other choice. But NASA needs an overall culture change in dealing with the reliability of space systems. It should proactively address the non-hardware-related factors and, under the oversight of the National Research Council, conduct a realistic total-system safety analysis of Cassini. It is not too late, even after the launch, to prevent an accident during the 1999 flyby.

Finally, if NASA wants to continue its fixation on probabilities, it should at least adopt a sounder motto, based on a statement by Laplace, the 19th century French astronomer, mathematician and father of probability theory: Probability demonstrates both our knowledge and our lack of knowledge.

Najmedin Meshkati is an associate professor of civil/environmental engineering and industrial and systems engineering at USC.
This page was updated: December 26, 1997 by The NoFlyby Webmaster
This site is hosted on The Nonviolence Web and is an affiliated project of Traprock Peace Center (103A Keets Road, Deerfield, MA 01342). Together we explore nonviolence, foster community, work to end war, promote communication and take initiatives on environmental and justice issues.