Logical fallacies in assessing risks from climate change


Guest post by Hans Custers. Cross-posted at Planet3. Dutch version here.

Fallacies of risk

We, humans, are not very good at estimating and weighing risks. Looking at the definition of risk, this is not so strange:

Risk = Probability * Effect

Most of us will have some understanding of both elements ‘Probability’ and ‘Effect’, but the combination of the two is rather abstract. In judging risks, we tend to focus on one of the two elements, and more or less neglect the other one. The figure below shows the difference between our perception of certain risks and their actual magnitude.

[Figure: risk perception versus the actual magnitude of hazards]

The definition of risk might suggest that it is always possible to calculate it, or give a quantitative estimate. But often, it’s not that easy. Sometimes, it can be difficult to define the exact “Effect”, and the parameter that can be used to quantify it. Effects can go from economic or financial costs to damage to nature, from a small decrease in well-being, to large numbers of casualties. Our judgment of risks depends a lot on the type of effect.
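As a side note, the definition is trivial to put into code. Here is a minimal sketch (the two hazards and all numbers are invented purely for illustration): a frequent minor hazard and a rare severe one can carry exactly the same risk, which helps explain why intuition that fixates on only one of the two factors gets it wrong.

```python
# Illustrative only: made-up hazards showing that "risk" as defined in the
# post (probability times effect) can be equal for very different events.
def risk(probability, effect):
    """Risk as defined in the post: probability times effect."""
    return probability * effect

# A frequent minor hazard and a rare severe one (hypothetical numbers).
frequent_minor = risk(probability=0.10, effect=10)      # small damage, often
rare_severe = risk(probability=0.0001, effect=10_000)   # disaster, very rarely
print(frequent_minor, rare_severe)  # same expected loss, very different perception
```

Both come out at the same expected loss, yet most people judge the two hazards very differently, which is exactly the gap between perception and actual magnitude the figure illustrates.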

To make things even more complicated, the debate on risk often takes place at the border of science and politics, of logical reasoning and subjective judgment. However hard we try to find objective parameters and criteria for assessing and weighing risks, decisions about what we do and do not find acceptable depend to some extent on value judgments. There are no 100% objective criteria for making these types of decisions.

It’s obvious that many fallacies can come up in this minefield for logic. Last week, Judith Curry blogged on the article “Fallacies of risk” by Sven Ove Hansson, trying to identify these fallacies in the climate debate. In his article, Hansson mainly focuses on the risks of (new) technologies, especially those with a low probability and large effects. Applying the same fallacies to the climate debate is not as straightforward as it might seem, and Curry makes some missteps. She ends up making quite a few comments on Hansson that totally miss the point. Here’s my attempt to improve on her analysis.

1. The sheer size fallacy

X is accepted.
Y is a smaller risk than X.
Y should be accepted.

This is all about context. Whether or not we accept risks does not only depend on their size, but also on the benefits of a technology or activity. And, sometimes, on the possibilities and impossibilities of prevention. Motorized traffic, for instance, is a major cause of unnatural deaths in many countries. A huge majority of people accept this risk, because society would come to a standstill without traffic. Many other activities with a similar death toll would be considered unacceptable. Another example: some types of work involve unavoidable risks that would be unacceptable in other professions. However strong the prevention measures, on a construction site there’s always a risk that someone gets hit by a falling object. That’s why construction workers are required to wear helmets. But we would never tell an office worker to just wear a hard hat if there’s something wrong with the ceiling in his office and some debris might come down. This type of risk is not associated with office work, so we will not accept it.

Curry suggests, without any explanation, that this fallacy is at the heart of the precautionary principle when applied to a complex problem. I think she’s totally wrong. I’ll come back to the precautionary principle later.

I don’t think this fallacy is very much present in the climate debate. Most people seem to realize that the risks of climate change are very hard to compare to other risks. Not necessarily because of the size of the risks, but more so because of many different effects associated with them and the scale of the problem.

In more detailed discussions, for example on a single effect of climate change, the fallacy shows up regularly. A recent example: it was claimed on a Dutch blog that climate change is not the main cause of flooding on earth right now. Which is true. But it is not a valid argument for accepting increasing risks due to climate change in the future. Sometimes, the fallacy is used in its most extreme form by opponents of clean energy and clean technologies: they’re not willing to accept any risk associated with things they don’t like.

2. The converse sheer size fallacy

X is not accepted.
Y is a larger risk than X.
Y should not be accepted.

Some of you may have noticed that one of the examples I mentioned in the previous paragraph is of the converse sheer size type. This is what Hansson says: “Several of the fallacies to be treated below also have a converse form, but in what follows I will only state one of the two forms (namely the one that gives an invalid argument for acceptance, rather than non-acceptance, of a risk).” That’s what I will do as well.

3. The fallacy of naturalness

X is natural.
X should be accepted.

There are still self-proclaimed skeptics who think that the warming of the past century is mainly natural. Because of this, they will say mainstream science overestimates the risk of climate change. But that is not what Hansson means by the fallacy of naturalness. Hansson means that we don’t have to accept risks simply because they are natural. We don’t see this fallacy too often in climate discussions, I think. Maybe the “CO2 is plant food” meme comes close.

The converse version could play a role in how we think about, for instance, geoengineering. Of course, there are many real risks involved in human interventions in the climate system, but the unnaturalness might make people even more reluctant. Or am I the only one?

4. The ostrich’s fallacy

X does not give rise to any detectable risk.
X does not give rise to any unacceptable risk.

Often it is claimed that because there is no statistically significant evidence for a phenomenon, it does not exist. And what does not exist cannot be dangerous. However, insufficient evidence for a hypothesis does not prove the null hypothesis: absence of evidence is not evidence of absence.

Much to my surprise, Curry did not think this fallacy was relevant to the climate debate, whereas I would argue it is probably one of the most common fallacies in that respect, for instance in discussions on extreme weather and other phenomena that might already be influenced by climate change, but for which the evidence to date may be inconclusive. We only have one climate system. It simply is not possible to quantitatively distinguish a ‘human’ and a ‘natural’ component in all the processes and events happening in this system. It is very hard to estimate the human influence on individual events, especially rare ones, even when there are good physical reasons to assume that such a human factor exists.

There are reasons to expect climate change causes more suffering and damage by tropical storms. There’s more energy in the oceans to fuel these storms, and sea level rise will affect storm surges. But it’s very hard to quantify how much these factors have contributed to effects of individual storms over the past years.

Ignoring factors that cannot (yet) be detected may result in underestimating the risk.
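The point can be illustrated with a toy simulation (all numbers are invented; this is not a climate model). A real trend buried in noisy data rarely clears a significance test over a short record, but almost always does over a longer one. The trend is equally real in both cases; dismissing it early on is exactly the ostrich's fallacy.

```python
# Toy illustration (invented numbers, not a climate model): a real trend in
# noisy data is hard to detect in a short record and easy in a long one.
import random
from math import sqrt

def slope_t_stat(y):
    """t-statistic of the ordinary least-squares slope of y against time."""
    n = len(y)
    xbar = (n - 1) / 2
    ybar = sum(y) / n
    sxx = sum((x - xbar) ** 2 for x in range(n))
    sxy = sum((x - xbar) * (yi - ybar) for x, yi in enumerate(y))
    slope = sxy / sxx
    sse = sum((yi - (ybar + slope * (x - xbar))) ** 2 for x, yi in enumerate(y))
    return slope / sqrt(sse / (n - 2) / sxx)

def detection_rate(n_years, trend=0.02, noise_sd=0.2, n_sims=500):
    """Fraction of simulated noisy records whose trend clears t > 2."""
    hits = 0
    for _ in range(n_sims):
        series = [trend * yr + random.gauss(0, noise_sd) for yr in range(n_years)]
        if slope_t_stat(series) > 2.0:
            hits += 1
    return hits / n_sims

random.seed(42)
short_record = detection_rate(15)  # short record: the trend often goes undetected
long_record = detection_rate(60)   # long record: the trend is detected nearly always
print(short_record, long_record)
```

With these made-up parameters the short record misses the trend most of the time while the long record almost never does, even though the underlying warming rate is identical in both.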

5. The proof-seeking fallacy

There is no scientific proof that X is dangerous.
No action should be taken against X.

If one fallacy from this list is near the heart of the precautionary principle, it is this one. Hansson explains that scientific standards are different from those in risk management. And they should be.

We can borrow terminology from statistics, and distinguish between two types of errors in scientific practice. The first of these consists in concluding that there is a phenomenon or an effect when there is in fact none (type I error, false positive). The second consists in missing an existing phenomenon or effect (type II error, false negative). In science, errors of type I are in general regarded as much more problematic than those of type II. (Levi 1962, pp. 62–63) In risk management, type II errors – such as believing a highly toxic substance to be harmless – are often the more serious ones. This is the reason why we must be prepared to accept more type I errors in order to avoid type II errors, i.e. to act in the absence of full proof of harmfulness.

So, there we are: the precautionary principle, aka ‘better safe than sorry.’

Where a scientist is wary of drawing premature conclusions (climate scientists are no exception, no matter how much some people want to believe otherwise), the same caution is undesirable in a risk assessment context, where it means losing sight of potential hazards. This might explain why the precautionary principle is counter-intuitive to some scientists.
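Hansson's asymmetry can be sketched numerically. The simulation below (with made-up effect sizes and thresholds, not anything from Hansson's paper) runs a simple one-sample z-test in two hypothetical worlds: one where no effect exists and one where a real effect does. Raising the evidence threshold suppresses type I errors (false alarms) at the cost of more type II errors (missed hazards).

```python
# Illustrative sketch of the type I / type II trade-off; the effect size,
# sample size, and thresholds are invented for demonstration purposes.
import random
from math import sqrt

def error_rates(threshold, true_effect=0.5, n_obs=25, n_sims=2000):
    """Return (type I rate, type II rate) for a z-test at the given threshold."""
    false_pos = false_neg = 0
    for _ in range(n_sims):
        # World A: the null hypothesis is true, there is no effect.
        null_mean = sum(random.gauss(0, 1) for _ in range(n_obs)) / n_obs
        if null_mean * sqrt(n_obs) > threshold:
            false_pos += 1  # type I error: a false alarm
        # World B: a real (possibly harmful) effect exists.
        alt_mean = sum(random.gauss(true_effect, 1) for _ in range(n_obs)) / n_obs
        if alt_mean * sqrt(n_obs) <= threshold:
            false_neg += 1  # type II error: a missed hazard
    return false_pos / n_sims, false_neg / n_sims

random.seed(1)
lax_fp, lax_fn = error_rates(threshold=1.0)        # lenient evidence standard
strict_fp, strict_fn = error_rates(threshold=3.0)  # strict "scientific" standard
print(lax_fp, lax_fn, strict_fp, strict_fn)
```

The strict standard produces far fewer false alarms but misses the real effect much more often, which is precisely why risk management may prefer a more lenient standard than science does.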

One might even say that scientific uncertainty on climate change adds to the risk. The better you know what is coming, the easier it is to prepare. Preparation can be very effective in reducing risks, even though we usually prefer prevention.

Asking for 100% proof is asking for the impossible. In Popper’s philosophy of science there is no such thing as definitive proof. This applies to all sciences, but is even more evident in the earth sciences: we only have one Earth, so double-blind controlled experiments are not an option. All we can do is weigh the full body of evidence. This has an implication for the ‘converse ostrich’: absence of evidence for hazards doesn’t mean something is safe, but on the other hand safety can never be proven with absolute certainty either.

6. The delay fallacy

If we wait we will know more about X.
No decision about X should be made now.

This one is obvious. If you want to avoid all risks that are not yet completely understood, you should never get out of bed in the morning. Well, as a matter of fact, even that wouldn’t help.

Even if there were as much scientific uncertainty about the human impact on climate as some so-called skeptics believe, that would not necessarily be a reason not to take action. Greenhouse gas concentrations keep growing while we wait, and so do the risks. To deny any risk at all, you would have to deny a whole lot of widely accepted science.

Hansson states that it is not always possible to resolve scientific uncertainty in the short or medium term. This certainly applies to climate change. There’s only one way to find out with a high level of certainty what a doubling of greenhouse gas concentrations will do to the climate: let it happen and then wait several thousands of years.

And even then some uncertainties would remain. It’s an iron law in science: new results bring new uncertainties with them. New uncertainties could be seen as new reasons for delay. Hansson sees this fallacy as one of the most dangerous fallacies of risk, from the viewpoint of risk reduction.

7. The technocratic fallacy

It is a scientific issue how dangerous X is.
Scientists should decide whether or not X is acceptable.

No matter how important scientific knowledge is in our society, it should not rule the world. Scientists can determine the nature and magnitude of risk, but society should decide whether or not certain risks are accepted. Acceptance of risk is not just a matter of objective numbers, but also of value judgments. Different groups of people can have different opinions on this. Somewhat simplified: proponents of stringent climate policies tend to focus on environmental risks, opponents on economic risk.

According to Hansson, there is “a fairly general tendency to describe issues of risk as “more scientific” than they really are”. I think he’s right. This is just one of the ways in which politicians try to evade their responsibility for difficult decisions. It could be one of the reasons our society develops towards a technocracy. There is only one cure: emphasizing again and again that both science and politics have their own responsibilities and tasks.

8. The consensus fallacy

We must ask the experts about X.
We must ask the experts for a consensus opinion about X.

Sceptics might think they like this one, but I have some disappointing news for them. Hansson mentions the IPCC as a positive example:

The Intergovernmental Panel on Climate Change (IPCC) does this in an interesting way: it systematically distinguishes between “what we know with certainty; what we are able to calculate with confidence and what the certainty of such analyses might be; what we predict with current models; and what our judgement will be, based on available data.” (Bolin 1993)

Curry claims that Bolin’s ideas lost out to John Houghton’s push for consensus. However, Bolin’s ideas as cited above are very recognizable in the most recent IPCC reports.

According to Hansson, the search for consensus has many virtues, but it shouldn’t be an end in itself. A forced 100% consensus could either ignore minority opinions and thus underplay uncertainties, or end up as a watered down compromise: “Therefore, it is wrong to believe that the report of a scientific or technical advisory committee is necessarily more useful if it is a consensus report.”

The converse consensus fallacy would be the argument that every scientific minority opinion should be taken seriously, regardless of the evidence.

9. The fallacy of pricing

We have to weigh the risks of X against its benefits.
We must put a price on the risks of X.

It is simply not possible to put a monetary value on everything. Coral reefs, for instance, do represent some economic value – as a tourist attraction, and as an ecosystem essential to the survival of many marine species – but many people will argue the true value is way beyond that. Non-material value cannot always be expressed as an objective figure. It is the subjectivity of these values that can make the weighing of risks so complex.

Suggesting objectivity, for instance by putting a price tag on risks, doesn’t do justice to this complexity, because it takes no account of the sincere objections that people may have. To make a proper judgment, it is necessary to recognize people’s moral objections and dilemmas.

10. The infallibility fallacy

Experts and the public do not have the same attitude to X.
The public is wrong about X.

Experts can be wrong, no doubt about that. It is important to remain critical and not automatically assume that the public is wrong when they have a different opinion than the experts. On the other hand, “It isn’t necessarily fallacious to consider that thousands of climate scientists writing in peer reviewed journals might know more than you do about such a complex subject.” (Skeptico)

There is no natural law stating experts are always on the right side of a discussion. This also means that we cannot blame science for not being infallible and omniscient. Science has proven its huge value to our society, even though it is done by ordinary human beings.

Avoiding the fallacies

A polarized and emotional debate will never be free of fallacies. We have to accept this as a fact of life. Hansson advises academics to take part in these discussions, acting as independent intellectuals. I’m not so sure this is realistic. In a debate like the one on climate, independence and objectivity are usually not recognized in the same way by the different parties. However, you don’t have to be independent to identify fallacies and confront the people who use them. If we really want to move the debate forward, that’s what we should do.

7 Responses to “Logical fallacies in assessing risks from climate change”

  1. Bill G Says:

    Three risks we underestimated:
    1. Japanese attack on Pearl Harbor
    2. The cut-off of oil to the US and West in the mid-’70s
    3. The devastating impact of dumping CO2 into the atmosphere

  2. Dan Olner (@DanOlner) Says:

    Good stuff, but is the original argument actually called “logical fallacies?” Some of them aren’t. The “fallacy of price” is not really a fallacy in that sense, it’s just an argument against trying to see the whole problem as a price problem.

  3. Bart Verheggen Says:

    I actually put the title above the post (in “good” newspaper fashion where the headline is often written by someone else than the author of the piece), but I guess you’re right that some of these are actually not logical fallacies but rather material fallacies. Not sure the distinction is always crystal clear though.

  4. willard (@nevaudit) Says:

    > Not sure the distinction is always crystal clear though.

    I don’t think it is, but we do have a distinction between formal and informal fallacies:

    Theoretical discussions of fallacies have not produced an agreed-upon taxonomy, but there is a common set of fallacies which are typically used in the analysis of informal arguments. They include formal fallacies like affirming the consequent and denying the antecedent; and informal fallacies like ad hominem (“against the person”), slippery slope, ad baculum (“appeal to force”), ad misericordiam (“appeal to pity”), “hasty generalization,” and “two wrongs” (as in “two wrongs don’t make a right”). In textbooks, authors may devise their own nomenclature to highlight the properties of particular kinds of fallacious arguments (“misleading vividness” thus designates the misuse of vivid anecdotal evidence, and so on.)


    Such distinction matters less than the capacity for a community to learn from such intellectual abuses.


    Speaking of fallacies, I stumbled upon an interesting paralogism this morning:

    Here’s the reconstruction:

    1. Dick is a lukewarmer.
    2. Lukewarmism is mainstream science.
    3. Mainstream science includes “the 97%”.

    Ergo, Dick belongs to “the 97%”.

    This assumes that all lukewarmers endorse lukewarmism, lukewarmism is well-defined, and that no specific property of a particular lukewarmer L excludes L from mainstream science.

    The trick, in that paralogism, is of course to hide that last bit. That Dick can be said to be a lukewarmer does not prevent him from holding, for instance, that climate sensitivity (CS) is between 0.5 and 1.3 (h/t andthentheresphysics). And since it would be tough to sell to an audience that someone who holds such a low CS belongs to mainstream science, one has to resort to an intermediate attribution.

    “Dick is a lukewarmer” serves a very important function: it abstracts away Dick’s contrarian career, and it stretches the Overton window so low as to include in mainstream science everyone except Sky Dragons and doomsayers.

    Buyers beware.

  5. Paul Kelly Says:

    + Dan Olner
    “not really a fallacy” could describe several even as material fallacies.

  6. gaia.sailboat Says:

    The arguments of “Fallacies” are really cool, especially the formulas.

    Now can somebody please divert billions of gallons of water to the western US? Cattle ranching is shutting down. Ground water is pumped dangerously low. It is strangely dry every day. Each year breaks temperature records of all recorded time.

    How long will we debate “logical fallacies”? Until the last drop of water and morsel of food? And maybe the anti-global warming argument will score more debating points and win! Something to look forward to.

  7. John Carter Says:

    Late comment, just came across your blog.

    “Curry seems to be making some mishaps.”

    That assessment doesn’t surprise me.

    Relevant or not, she also took issue w/ the 97% consensus in a pretty strong way. My question for her http://judithcurry.com/2014/07/27/the-97-feud/#comment-612039 is apparently still not answered (though I used “anthropomorphic.” Woops.)

    She misconstrued the relevancy (or lack, or meaning) of the pause http://judithcurry.com/2014/07/29/politicizing-the-ipcc-report/#comment-614068

    And though every article I have seen there takes a one-sided view, she started off a recent post with a John Stuart Mill quote about how one has to know both sides of issues, then proceeded of course to once again effectively only support the CC-refutation side for the remainder of the post. In a post days earlier, she seemed to suggest it was good that the [Heartland Institute created] NIPCC was getting relevant consideration as a balance to the IPCC (or some such), and that it was good the “contrarian position” was being considered.

    After reading many posts all over the place and thousands of commenters, I can’t figure out any contrarian positions other than: “the climate is not warming, so CC is not real” (false in the first instance, and an erroneous conclusion in the second in any case); “the earth has changed before, so we’re probably not changing it now” (illogical); “the current warming happens all the time” (Marcott et al. would beg to differ, but even if occasionally true, the current 100 trend is extremely unusual statistically, and how often it has happened is secondary or even irrelevant to the idea that our radical atmospheric alteration would therefore have a lesser future impact); “we can’t know the future”; “climate science is uncertain”; “climate scientists can’t be trusted” (not exactly a refutation of CC); and an ongoing host of similar things that don’t really refute or have much to do with CC.

    So I posed this comment directly to Ms. Curry, http://judithcurry.com/2014/08/06/importance-of-intellectual-and-political-diversity-in-science/#comment-615976, asking what exactly that contrarian position was, and emphasizing that, so we could actually discuss it.

    Naturally, it wasn’t remotely answered by Judith Curry, or by anyone, though there were several replies to it. The best evidenced exactly what my comment was accused of: my “narrative” was “irredeemably simplistic nonsense” [if the denier uses complex prose they must be smart, and so must not be wrong in their skepticism]; and it was an “example of science distorted in the crucible of groupthink” [super ironic to say the least, I’ll leave it at that]. “The science problem is not simple radiative physics but a deterministic chaotic system. The policy question is what to do about it in a rational policy framework.”

    This from a commenter who essentially argues (at other times) that cc is all but a farce, and that the oceans are not warming.

    Incidentally, I came across your blog because I just saw this post http://climatechangenationalforum.org/quantifying-the-consensus-on-anthropogenic-global-warming-in-the-scientific-literature-by-dr-john-cook-et-al-97-1-agree-that-humans-are-causing-global-warming/ at Climate Change National Forum, and I was going to write something similar to your comment before realizing I couldn’t log in anyway. (Then I saw your comment, so I wanted to write and say I was going to make a similar argument about the impracticality of separating out a scientist’s views based on their own completely independent evaluation, or their evaluation as scientists.)

    And by the way, aside from “we don’t believe that there is a legitimate threat of future climate impact, or one significant enough to do anything about other than maybe stuff we want to do anyway,” I still really don’t know what the contrarian argument actually is.

    I think it is that they think there is a physics presumption against change, and so the “burden of proof” for the world to believe there is any kind of threat is on scientists to “prove it,” and that scientists haven’t proven it (and by their definition it can’t be proven until after the fact). But rather than basic science assessment, that seems more like effective advocacy for a predetermined position.

    Which, of course…….

    Anyway, Bart, if (whenever) you see this and want to follow up, shoot me an email (it’s posted on my site, which I think this links to) or post a comment; I checked to be notified of follow-up comments.

    Also, btw, of your fallacies above, I think everyone gets no. 6 in general. The fact that CC “skeptics” don’t in this context does suggest something fairly strong about the way they are viewing the issue. (Of course, to see this they would have to realize the fallacy, but the whole catch-22 is that they don’t, or won’t. But others of influence can, and the media should be able to….)
