I was also one of a small number of scientists—mainly ecologists, climatologists and meteorologists—who in the 1970s became concerned about the possibility of a human-induced global warming, based on then-new measurements. It seemed to be an important scientific problem, both as part of the beginning of a new science of global ecology and as a potentially major practical problem that nations would have to deal with. It did not seem to be something that should or would rise above standard science and become something that one had to choose sides in. But that’s what has happened.
Some scientists make “period, end of story” claims that human-induced global warming definitely, absolutely either is or isn’t happening. For me, the extreme limit of this attitude was expressed by economist Paul Krugman, also a Nobel laureate, who wrote in his June New York Times column “Betraying the Planet” that “as I watched the deniers make their arguments, I couldn’t help thinking that I was watching a form of treason—treason against the planet.” What had begun as a genuine scientific question with possibly major practical implications had become an article of infallible belief (or, on the other side, infallible disbelief), and any further questions were met, Joe McCarthy style, with “you’re either with me or agin me.”
Not only is it poor science to claim absolute truth, but it also leads to the kind of destructive and distrustful debate we’ve had over the last decade about global warming. The history of science and technology suggests that such absolutism on either side of a scientific debate rarely leads to practical solutions.