“In accordance with Bayes’ theorem, prediction is fundamentally a type of information-processing activity – a matter of using new data to test our hypotheses about the objective world, with the goal of coming to truer and more accurate conceptions about it.” Nate Silver
I like Nate Silver and I feel bad about criticizing his book “The Signal and the Noise”. But after reading his global warming chapter twice through, I have to admit that he deserves the criticism. When he writes about the physics of climate change he is, for the most part, spot on, and I applaud his effort to understand it. However, my critique is technically valid, no apologies there: Silver blew his Bayesian analysis. He missed a perfectly good opportunity to reduce the noise level of the discourse, and instead increased it. This is exactly the goal of the fossil fuels industry: so long as there is lots of noise, nobody sees the signal. I sent a draft to Silver and asked for comments, so he has had a chance to defend himself or do the mea culpa. He did not respond, so my conscience is clear. The mistakes he makes in applying Bayesian analysis to the global warming problem include:
- He uses false new data, introducing misinformation which has been comprehensively debunked (see ref.).
- He ignores true information.
- He assumes that the climate system is memoryless.
- He only applies the analysis to the IPCC consensus hypothesis and not to the myriad denier hypotheses.
I have been aware of Bayesian statistics most of my life but never had the opportunity to apply this very useful math. If Silver hadn’t bungled it, I would not have invested the time and energy over the last few weeks digesting it and applying it. In that sense reading his book was a very positive experience for me. I think Bayesian inference, applied correctly, is a powerful tool to help us understand the myriad problems we face. So I owe Silver a great debt. Bayesian statistics reveals that the science of human-caused global warming is virtually certain precisely because applying it to the various denier hypotheses reduces their probabilities to some very small values. We need to be reminded that a skeptic is not somebody who questions the IPCC reports. A skeptic is somebody who questions the IPCC reports and the denier arguments. The IPCC reports survive handsomely while the denier hypotheses melt away like an ice cube on a hot summer day. Using Bayesian analysis and the information Silver had available, I estimate the probability of the truth of human-caused global warming to be about 98.7 percent. Using more recent scientific data, the probability increases to 99.7 percent. The probability that the Earth is cooling may be something like 1 in 200 million billion, given the exceptional global temperature records set between 1995 and 2012.
As much as I regret having to criticize Silver, I’ve no hesitation criticizing the Chicago School economist Armstrong, upon whom Silver wastes so much valuable ink. Armstrong is a “forceful agent”: misleading, ideological and promoted by the fossil fuels industry. At any rate, even if Armstrong’s economics were not questionable, economics itself is irrelevant to climate physics. Silver may have found it entertaining to include so much discussion of Armstrong, for reasons we will discuss, but it introduces noise into his analysis for no benefit.
By the way, and this is another matter for another day, it is way past time to apply Bayesian analysis to economics, especially the laissez-faire free market ideology of Milton Friedman. The cost of addressing global warming (if there is any cost), if we were to do it very soon, is a very minor inconvenience, perhaps a small forfeiture of possible near-term gain, whereas the cost of not addressing the destructive influence of the Chicago school is a huge real long-term loss of wealth; consider the 2008 Great Contraction. Silver is aiming at the wrong target. He is just not enough of a skeptic.
This article was a lot of work, albeit very enjoyable, but I have to get back to the rest of my life. So my promised article on Big Data, though partly written, will probably be a while in coming. But print this article out and you can read it in bits in your spare time over the holidays. Comments and criticisms are always welcome.
All of the myriad decisions we have to make are plagued by uncertainty. We never have complete information and every decision is a “probabilistic gamble based on some kind of prior information”. If we improve our information, in principle we can update our probabilistic understanding, reduce uncertainty and improve our decision making. This is true provided the information is true and is relevant to the decision. Misinformation will increase the uncertainty, and some information is independent of or irrelevant to the decision we have to make. Knowing who won the American League batting title in 1962 is useful information to have in a sports bar but unlikely to be relevant to decisions we have to make about our investments or our health.
A body of research exists on misinformation and how it spreads. Daron Acemoglu, Asuman Ozdaglar, and Ali ParandehGheibi describe misinformers in society as “forceful agents” who “influence the beliefs of the other individuals they meet, but do not change their own opinion.” If we assume misinformation is true it increases our uncertainty and increases the likelihood that we (society) will make bad decisions. This is what happened when President Bush convinced himself that Iraq had weapons of mass destruction. Misinformation, once embedded, is difficult to correct. Many Americans still believe that Iraq had WMD. And evolution denial is still prevalent more than 150 years after the theory’s discovery. It is mathematically consistent to say that misinformation increases entropy, or uncertainty, and reduces our ability to make good decisions.
Use of the word uncertainty causes confusion, as does the use of entropy (whether we are discussing heat or information). Probability theory defines an objective uncertainty about a signal. This can be calculated. An individual might have subjective certainty, as Bush did regarding Iraq’s nonexistent WMD, without any knowledge whatsoever, or even while knowing a considerable amount of misinformation, stuff which isn’t true, Rumsfeld’s fourth category (more on this below). People can be certain about noise. The signal, that Iraq did not possess WMD, was available before the war, but it was buried in the noise and the misinformation, in this case the fog of war.
In the information theory literature H is used to represent uncertainty. The expression “the uncertainty of x” is written as H(x), where x is the set of possible hypotheses or beliefs. If we discover some new information y we can reevaluate the uncertainty of x given the new information y as H(x|y); the “|” reads as “given”. If we discover yet more information z then we can reevaluate the uncertainty of x and write this as H(x|yz). If both y and z are true information then we have the relationship: H(x|yz) is less than or equal to H(x|y), which is in turn less than or equal to H(x). This is important enough to write out:

H(x|yz) ≤ H(x|y) ≤ H(x)     (1)

The uncertainty itself is given by Shannon’s formula:

H(x) = − Σ P(x) log₂ P(x)     (2)

In equation (2) the sum is over all the beliefs x and P is the probability of each x. The log is in base 2. All of the probabilities have to add to one. If for one of the values of x the probability is one, then for all the other values the probability is zero and the uncertainty is identically zero, as one would expect. When one of the probabilities is very close to one and the rest are all very small, the uncertainty is low; when the probabilities are all equally likely, the uncertainty is a maximum. Table 1 gives the uncertainty for a two-hypothesis problem, in the example the probabilities of whether Iraq did or did not have WMD. We will use both of these equations later.
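The entropy formula is easy to compute. Here is a minimal Python sketch for the two-hypothesis case; the example probability pairs are illustrative:

```python
import math

def entropy(probs):
    # Shannon uncertainty H in bits: H = -sum(p * log2(p)).
    # Terms with p == 0 contribute nothing, by convention.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two-hypothesis example (Iraq did / did not have WMD):
print(entropy([0.5, 0.5]))    # equal odds: maximum uncertainty, 1 bit
print(entropy([1.0, 0.0]))    # complete certainty: 0 bits
print(entropy([0.95, 0.05]))  # near-certainty: ~0.29 bits
```

Notice how quickly the uncertainty falls as one hypothesis approaches certainty.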
There are two lessons to be gleaned from the example described in Table 1. The first is that true information can reduce objective uncertainty. The second is that while misinformation can reduce subjective uncertainty, it can also increase objective uncertainty and lead to disastrous results. It is helpful to recall Donald Rumsfeld’s famous remark that there are three categories of information: the known knowns, the known unknowns and the unknown unknowns. But there is a fourth category of information, what I call the Rumsfeld category: what we know for sure which just ain’t so. This category is misinformation and is what gets us into trouble. If we introduce misinformation into our analysis we will compute a ridiculous probability estimate as a result.
The statement that if we take into account new information in our decision process we will be able to make better decisions “on average” is mathematically rigorous. This is an informal statement of Bayes Rule, or Bayesian inference. Pierre-Simon Laplace is also correctly credited with its discovery and application, and one can find useful information by looking up Laplace’s inductive probability as well. It sounds simple and obvious, and we all like to think we always apply it in practice. Confirmation bias and motivated reasoning tend to get in the way. We simply ignore new data which disagrees with our preconceived notions rather than update our opinions. We accept misinformation which happens to support our opinions. It is in this way that ideological thinking gets us into trouble. We must always be willing to be wrong if the new data reduces the probability that our prior belief was correct or supports an alternative hypothesis. Sadly, we all too often are unwilling to admit we were wrong and give up prior beliefs. There is lots of science which supports this view, and indeed Dan Kahan has recently added yet another research paper confirming our inherent bias even when staring at evidence which contradicts our prior opinions. Smart people make this mistake more often because they are better at inventing clever justifications despite the evidence.
Sharon Bertsch McGrayne has written an awesome book about the history and more famous applications of Bayes Rule, “The Theory That Would Not Die”. I can also recommend Nate Silver’s book “The Signal and the Noise”, which describes applications of the theory. Silver includes a chapter on global warming but makes several critical mistakes in applying Bayes Rule. We can learn from these mistakes and apply Bayesian inference to update our knowledge of human-caused global warming correctly. I recommend Wikipedia and several books on Information Theory [8, 9] to understand the mathematical formalism. You might also enjoy the Cambridge University lecture series on information theory by David MacKay, author of the books Sustainable Energy – Without the Hot Air and Information Theory, Inference, and Learning Algorithms. Both of these books are entirely relevant to this discussion.
In order to apply Bayesian inference to any problem we first need an initial assessment of the probability of our hypothesis (or belief) b. Strictly this is the joint probability P(b,y) of the truth of our belief b together with the truth of the existing evidence y; the resulting probability for b is called the prior. When a new event occurs, such as the publication of new data in a scientific paper, we can reevaluate the probability. To do this we need to evaluate the likelihood of the new data z occurring assuming the belief b is true and the previous information y is true, which we write as P(z|b,y) and which reads as the likelihood of the new data z occurring “given” the belief is true and y is true. My reason for stressing the existence of the prior information y is to point out that our current assessment of the truth of the various beliefs is contingent on this existing knowledge and is not derived in a vacuum or based on gut instincts. If we assume y is true we can simply write P(b,y) = P(b). We can then calculate a joint probability P(b,z) = P(z|b) P(b). What we want to compute is the likelihood that b is true given the new data z, which is written as P(b|z).
The new probability which we will calculate based on the new information is called the posterior probability. To complete this calculation we need to estimate the likelihood of the new data assuming the belief is false as well as assuming the belief is true. This means calculating this likelihood across all alternative beliefs in turn. And since these alternative beliefs may be at odds with each other as much as they are with our own belief we cannot treat alternative beliefs as an aggregate. And in any case we are just as interested in reevaluating the probabilities for all the alternative beliefs.
The joint probability of b and z is given by P(b,z) = P(z|b) P(b) and also by P(b,z) = P(b|z) P(z). Therefore P(b|z) = P(z|b) P(b) / P(z) by simple algebra. The probability P(z), i.e., the probability of the new data, is equal to the sum over all of the possible beliefs b of the joint probabilities P(b,z). This becomes:

P(b|z) = P(z|b) P(b) / Σ P(z|b′) P(b′)     (3)

where the sum in the denominator runs over all the possible beliefs b′.
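This update rule is a one-liner in code. Below is a minimal Python sketch; the function name and the example hypotheses and numbers are mine, purely for illustration:

```python
def bayes_update(priors, likelihoods):
    # Bayes rule over a set of competing hypotheses.
    # priors:      hypothesis -> prior probability P(b) (must sum to 1)
    # likelihoods: hypothesis -> P(z|b), the probability of the new data z
    #              if that hypothesis were true
    evidence = sum(priors[b] * likelihoods[b] for b in priors)  # P(z)
    return {b: priors[b] * likelihoods[b] / evidence for b in priors}

# Illustrative numbers only: a 95% prior belief, new data that is six
# times likelier under the belief than under the alternative.
posterior = bayes_update(
    priors={"belief": 0.95, "alternative": 0.05},
    likelihoods={"belief": 0.6, "alternative": 0.1},
)
```

Note that every competing hypothesis appears in the denominator, which is exactly why the alternative beliefs cannot simply be lumped together when they contradict each other.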
Application to human-caused global warming
Our hypothesis that human-caused global warming is true has been questioned by the apparent “pause” in global warming since 1998. This has been a persistent denier argument and is discussed by Silver, and it is the “new” data Silver applies in his Bayesian analysis assuming that the temperature cooled during the decade between 2001 and 2010.
Climate scientists have attributed the apparent “pause” in part to an increase in sulfur dioxide emissions from China’s coal-fired power plants. China’s economy has been in overdrive since 2000, driven by coal. Atmospheric sulfur dioxide reflects solar energy back into space and therefore has a cooling effect on the Earth’s surface, but it has a short residency time relative to carbon dioxide and rains out quickly. Silver mentions this. Another observation is that the oceans have been absorbing the lion’s share of the heat without any pause. This is shown in Figure 1. As even former denier turned acceptor Richard Muller pointed out, land surface temperatures have continued rising unabated. Figure 2 shows Muller’s analysis of the measurements of the Earth’s land surface temperature record, and there is no pause. Also, every year during the decade (2001–2010) was warmer than every year prior to 1995. Furthermore, the 1980s were warmer than any prior decade since temperature records have been kept, the 1990s were warmer than the 1980s, and the aughts were the warmest decade ever recorded, and by a huge amount, a whopping 0.2 degrees Celsius! Given this fuller information the discrepancy of the missing heat is not that remarkable. To claim there has been a “pause” is misleading. To claim that the surface temperature has fallen during the decade, as Silver does, is false, as shown by the faint red trend line in Figure 3 below. In this way, Silver introduces misinformation and noise into his Bayesian analysis.
Yet another factor, one which I’ve discussed in several articles, is that when the Earth warms the Arctic warms at an accelerated rate. While this is an expected outcome of global warming, there are essentially no thermometers in the Arctic and no temperature record. The white areas in Figure 3 are the places on Earth which do not have thermometers and for which no records exist. The U.K. Met Office chose to simply ignore the Arctic and the other uncovered areas in its computation of global temperature. NASA uses an “optimal interpolation algorithm” to estimate the Arctic temperature. In 2008 climate scientist Rasmus Benestad discussed the implications of both methods on the global temperature estimates. If a more accurate method could be found to determine the Arctic temperature, the global temperature calculation would be improved, and he postulated that the “pause” might disappear from the global record.
A new paper published this week in the Quarterly Journal of the Royal Meteorological Society by Kevin Cowtan of the University of York and Robert Way of the University of Ottawa does just that. Using satellite data, which has existed even for the Arctic since 1979, to better estimate the temperature, the authors find that the global surface warming since 1997 has happened more than twice as fast as the U.K. Met Office “HadCRUT4” estimate suggests. The new results are shown in Figure 3. Notice that the global warming “pause” does indeed disappear: 1998 is a little cooler and all subsequent years a little warmer. The new bold red trend line is steeper than the older faint red line. The pause is gone. Silver does not discuss the Arctic at all, so in effect his analysis has been blindsided.
We will use these new results by Cowtan and Way as our new data to update our estimate of the posterior probability of human-caused global warming, after correcting for Silver’s mistakes in his analysis. In order to determine our prior probabilities we use the most authoritative and recent source. The most recent IPCC report, AR5, published this year, states: “It is extremely likely that human influence has been the dominant cause of the observed warming since the mid-20th century.” “Extremely likely” in this context is to be interpreted as 95–100% probability. We can therefore assume human-caused global warming is true with at least 95% probability. Silver makes the same assumption for the prior probability of the truth of human-caused global warming.
All of the alternative hypotheses which deny AGW have an aggregate probability of less than 5%. However, it is wrong to think that there is only one denier hypothesis. In fact there are many and they contradict each other. While Silver aggregates deniers into one hypothesis we consider two broad contradictory denier beliefs as a better approximation. Some deniers maintain that the “earth is not warming” and others recognize that the Earth is in fact warming but not from human causes.
The former category includes those who do not believe carbon dioxide is a greenhouse gas and also those who accept that carbon dioxide is a greenhouse gas but then assume there is some as yet unidentified attenuating feedback mechanism in play which is balancing the warming. There are also those who believe in a metaphysical cause such as only a deity can change the climate. And then there are those such as J. Scott Armstrong, a Chicago school economist (marketing) and forecaster at the Wharton School and an “expert” at the Heartland Institute, a denier think tank. These economists believe that the market is efficient and if the Earth were really warming then the invisible hand would fix it without the help from government. Armstrong told a congressional panel in 2011 “I actually try not to learn a lot about climate change. I am a forecasting guy.” Armstrong is funded by the fossil fuels industry through the Heartland Institute as Silver informs us in a footnote. Armstrong loses his funding if he changes his mind.
We care about Armstrong only because Silver interviews him at length in his chapter on global warming despite his admitted ignorance (he gets as much space as the climate scientist Gavin Schmidt and more than Michael Mann). Silver might instead have asked Armstrong whether he forecast the 2008 Great Contraction, because I don’t think he did. Otherwise, Armstrong is just noise.
The statistician Edward Wegman was asked by Illinois Democrat Jan Schakowsky if he knew how carbon dioxide traps infrared heat in our atmosphere. Rather than admit he didn’t know, Wegman replied: “Carbon dioxide is heavier than air. Where it sits in the atmosphere profile, I don’t know. I’m not an atmospheric scientist to know that. But presumably, if the atmospheric – if carbon dioxide is close to the surface of the earth, it’s not reflecting a lot of infrared back.” More noise.
Armstrong’s opinion that he does not need to understand climate physics in order to forecast future climate reminds us of Friedman’s statement that “Truly important and significant hypotheses will be found to have assumptions that are wildly inaccurate descriptive representations of reality and, in general, the more significant the theory, the more unrealistic the assumptions.” This was not a one-off comment but a foundational principle of the Chicago school of economics. Friedman wrote it in a 40-page paper published in 1953 attempting to justify his wildly inaccurate assumptions. Homo sapiens have accomplished many things, such as the invention of the steam engine and the internet, and have discovered the workings of the stars as well as evolution and human-caused global warming. We owe these discoveries to the scientific method and our insatiable curiosity. On the flip side we’ve created quite a bit of ideological noise, of which the Chicago school of economics is a good example. In his paper Friedman explicitly takes the science out of economics, much to the dismay of the more rational economist Paul Samuelson, who wrote of Friedman’s paper: “I regard it as a monstrous perversion of science to claim that a theory is all the better for its shortcomings; and I notice that in the luckier exact sciences, no one dreams of making such a claim.” In fact there is a high correlation between laissez-faire free market ideology and denial of human-caused global warming; both are faith-based, both are independent of evidence, both are probably wrong and both are just so much noise.
Including an interview with Armstrong is a serious flaw of Silver’s book. In Silver’s defense he does acknowledge Armstrong’s ignorance, and he also correctly ties Armstrong to an ideological organization which is funded by the fossil fuels industry, but only in a footnote. Paraphrasing Mae West, there is no such thing as bad publicity for misinformation.
I can understand why Silver finds Armstrong’s story compelling. Silver is a former professional poker player and a forecaster. Armstrong is a forecaster and has written one of the authoritative books on the subject. Further, Armstrong offered a bet to Al Gore, which Gore, correctly, ignored. It is a no-win situation for the reality which Gore represents, and the bet is near-term anyway. If Gore wins, the press will ignore it, as they largely ignored Richard Muller’s remarkable mea culpa. And if Gore loses, the fossil fuels industry will make sure everybody knows about it. In any discussion of economics or politics, only a fool ignores the influence of money. In any case, while Armstrong’s bet is compelling to Silver, it is irrelevant to the science. Science is not settled by bets; it is settled by evidence.
Silver points out that after applying Bayes Rule dozens of times we are likely to converge on the same posterior probabilities regardless of our initial priors, so long as we do not assign 0% or 100% probability to any hypothesis. This is not quite accurate. We are always better off starting with the best available information and the most accurate priors. And there are two cases where Silver is incorrect to advise that we cannot assign a probability of exactly zero to an opinion. Wishful thinking about government deregulation has no physical influence on the Earth’s climate system. Whether the “invisible hand” or the efficient market applies in the marketplace is doubtful, but it certainly doesn’t apply in the physics of climate. Economics is as irrelevant to climate physics as Pete Runnels’ 1962 American League batting title is to car insurance. Figure 4 shows the destruction of Typhoon Haiyan in the Philippines. They had some kind of economy there: laissez-faire free market, soviet-style central planning, piracy, cooperatives, crime-based, socialism, charity, some combination or something else entirely; Mother Nature does not care. Sustained 195 mph winds treat them all equally. Climate and weather depend only on physical laws, not human ideologies.
It is also safe to assign a value of zero to the deity hypothesis.
The second denial category includes those who believe the warming is due to increased solar activity, those who think it is due to cosmic rays, those who think it is due to unspecified natural variation and, of course, those who believe the Earth is warming for some metaphysical reason or even some undiscovered physical reason. I was once informed that the Earth was warming because our planet has a moon. No physical explanation was given when I asked.
Since the cosmic ray hypothesis of global warming is at least a physical explanation, even if it is not likely, we should assign a non-zero value to its validity. Figure 5 suggests that the cosmic ray hypothesis has no legs. Figure 6 suggests that variation in solar radiation has no legs either. Human emissions correlate extremely well with the accumulation of atmospheric carbon dioxide, this accumulation correlates extremely well with the recent rapid rise in Earth’s temperature, and these observed results agree extremely well with physical laws. In response, deniers have emphasized that correlation does not imply causation, ignoring the fact that a lack of correlation doesn’t imply it either.
Silver assigns a value of 15% as the likelihood that a cooling aughts decade is consistent with the human-caused global warming hypothesis, and 50% as the likelihood that it is consistent with the aggregate denier hypotheses. Both calculations are wrong. As we’ve seen, even before the publication of the Cowtan/Way result, the Earth’s surface was warming during the decade.
Silver should have calculated the probability of three decades in a row setting records. Assuming the consensus hypothesis, if we take Silver’s 15% at face value, then on average 3 decades out of 20 will show flat or falling temperatures. We can assume, like Silver, a binomial distribution: each decade either cools or warms and each event is independent. Under the consensus hypothesis the probability that the first three decades of our test all warm is 0.85 × 0.85 × 0.85, about 61%, not 15%. The probability that the first three decades all warm under Silver’s unspecified denier hypothesis is 0.5 × 0.5 × 0.5, or 12.5%, not 50%. In fact it is worse than that for the denier case, because the climate has memory.
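The arithmetic is simple enough to check directly. The per-decade probabilities below are Silver’s 15% figure turned around, and the independence assumption is his, not the climate’s:

```python
# Probability that three consecutive decades all warm, assuming
# independent decades (Silver's simplification -- the climate has memory).
p_warm_consensus = 0.85  # 1 minus Silver's 15% chance of a flat/cooling decade
p_warm_denier = 0.50     # a coin flip under the unspecified "no trend" hypothesis

print(p_warm_consensus ** 3)  # ~0.614, i.e. about 61%
print(p_warm_denier ** 3)     # 0.125, i.e. 12.5%
```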
Silver calculates that the probability of the human-caused global warming hypothesis being true falls from 95% to 85%, a remarkable 10-point drop on a single data point. Silver should have realized his mistakes, because this is a ridiculous result. The existing evidence for human-caused global warming, the y in our notation above, fills volumes and is extremely robust. We assume all the original information y is still true, as Silver’s “pause data” does not contradict it. But the uncertainty H just went from 0.29 to 0.61 by the entropy formula, as shown in Table 4. In other words Silver introduces “noise” and our uncertainty increases, as expected.
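Silver’s own numbers reproduce his result, and the entropy formula from earlier shows the accompanying jump in uncertainty. This sketch simply replays his arithmetic:

```python
import math

def entropy(probs):
    # Shannon uncertainty in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

prior = 0.95  # IPCC-style prior for human-caused global warming
# Silver's likelihoods for the "cooling decade" data:
#   15% if AGW is true, 50% under the aggregate denier hypothesis.
posterior = (0.15 * prior) / (0.15 * prior + 0.50 * (1 - prior))

print(round(posterior, 3))  # ~0.851 -- the 95% -> 85% drop
print(round(entropy([prior, 1 - prior]), 2))          # 0.29 bits before
print(round(entropy([posterior, 1 - posterior]), 2))  # 0.61 bits after
```

A single noisy data point should not be able to move a volume of evidence that far; the inflated uncertainty is the tell.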
If new data were published tomorrow in a credible scientific journal which questioned the IPCC hypothesis in some way, it could at best only move the probability downward by fractions of a percentage point and it would have to address the paleoclimate results and the underlying physics such as perhaps finding a flaw in quantum mechanics. One would have to explain which of the denier hypotheses became 10% more likely, the invisible hand or the deity, and why. Real skeptics have very well defined testable concerns based on physical arguments and these can and deserve to be addressed by the science and in fact, are being addressed by the science as Silver makes clear in his interview with the climate scientist Richard Rood. Legitimate skepticism which can be addressed by the science is being drowned out by the noise created by the vast majority of deniers. It is not the climate science community which is ignoring legitimate skepticism but the deniers themselves. For the fossil fuels industry and the economists, politicians and lobbyists who feed at that trough, there is no benefit to legitimate scientific argument which can be easily addressed by the science. Deniers have no control over the signal; they can only control the noise.
If we recalculate Silver’s example using the more accurate data and treating the denier hypotheses as two distinct opinions, but still assuming a binomial distribution, the posterior probability of human-caused global warming goes from 95% to 98.7%, and of course that is more reasonable; the uncertainty is reduced (see Table 4). The IPCC report already assumes a range of 95 to 100% probability for the science-based hypothesis. So it is quite reasonable that any new science on any given day can drive the probability of aggregate denial to some “epsilon greater than zero”, because it may already be there.
It is instructive to do this analysis using extreme event statistics. The temperature record using thermometers goes back to 1880. The global surface temperature for the 1880s, being the first decade on record, set the record for both the warmest and the coldest decade. The 1890s were either the warmest or the coldest to date. The 1900s may have been the warmest, or the coldest, or may have fallen in between the earlier two. Obviously, it is more likely to set a record early in the record keeping than later, because records become progressively harder to exceed. It turns out that there is a body of science that studies the statistics of extreme events; climate change has made this work very important. Using its formulas, the probability of three decades in a row setting maximum values, i.e., each being warmer than all of the previous decades, is 1/11 × 1/12 × 1/13 = 0.06%. The data is available on Wikipedia and repeated in Table 2. We can do this for years as well. The reference gives general formulas for deriving the “probability of observing in N (116 years before 1996) trials, k (15 years exceeding the 1995 temperature record) exceedances of the mth (1st, as 1995 was the warmest temperature on record at the time) ranked observation seen in n (17 years since 1995) trials.” Table 3 lists the ten warmest years on record; 15 of the 17 years since 1995 broke the 1995 record. The probability of such a run of temperature extremes, assuming no warming trend, is an epsilon (5 × 10^-18) which for all practical purposes is zero. In other words, there are five chances in a billion billion that the Earth is not experiencing a warming trend. The Earth is not getting cooler.
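The record-decade arithmetic can be checked in a couple of lines. Under the no-trend null hypothesis the decades are exchangeable, so the k-th decade is the warmest so far with probability 1/k:

```python
# The 1980s, 1990s and 2000s are the 11th, 12th and 13th decades since 1880.
# Under a no-warming-trend null, each is the warmest so far with
# probability 1/11, 1/12 and 1/13 respectively, independently.
p_three_records = (1 / 11) * (1 / 12) * (1 / 13)
print(p_three_records)  # ~0.00058, i.e. about 0.06%
```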
We also assign 2.5 percent probability to the aggregate of denier hypotheses that the Earth is not warming and 2.5 percent probability to the aggregate of denier hypotheses that the Earth is warming but human emissions have little or nothing to do with it. This is still approximate, but because these two positions clearly contradict each other they must be treated as distinct, mutually exclusive hypotheses. It would be useful to pin down deniers and commit them to a set of specific testable hypotheses and disallow the “anything but the IPCC” opinion. But most deniers don’t know what they believe, excepting those who most confidently believe in the “invisible hand” or a deity, i.e., the zero-probability hypotheses.
Table 4 summarizes our results. The first column reproduces Silver’s application of Bayes Rule to global warming. The second column corrects Silver’s assumptions but still uses his methodology. The third column corrects both the assumptions and the methodology. All of these results use only the information Silver had access to; the third column is what Silver should have presented in his book, more signal, less noise. The fourth column applies the Bayesian analysis to the 2013 IPCC report and the most recent science, and shows how we should update our belief in the probability of human-caused global warming using the best available data.
The Cowtan/Way data thus increases the probability that our hypothesis is true from 98.7 percent to 99.7 percent. Recall again that the IPCC report sets the probability between 95 percent and 100 percent, so this is not surprising. The probability that the Earth is not warming at all drops out of sight, as it should, and the remaining hypotheses that the Earth is warming but not by human causes reduce to 0.3 percent.
Silver is the closest thing to a celebrity that information theory has. He has an obligation to be a better interpreter of the science. Though he fails in this, I still like his book and recommend it. But if you only have time to read one book on Bayesian theory, read McGrayne’s; it is much the better book.
Silver does not mention the strongest part of the consensus argument, which is the paleoclimate record. This is a remarkable omission when one considers that Silver discusses the climate scientist James Hansen at length. Hansen has consistently argued this point: the paleoclimate record, not the modeling, is the most important evidence for the human-caused global warming hypothesis. Reading Silver, one can get the impression that our knowledge of climate comes only from modeling.
The human-caused global warming hypothesis is based on experimental evidence, observation and a record of Earth’s past climates, as well as some elegant analysis and knowledge of physical laws. We really do not need any modeling to tell us that global warming is true, though modeling helps confirm our physical understanding. We need good models to estimate how warm the future climate is going to be regionally and how soon the bad stuff will begin to occur. We desperately need to know which areas are going to flood and which will suffer severe drought. To do this, climate modelers have built increasingly complex models and applied them to past climate events. By definition, what has happened in the past includes all of the unknown feedbacks and forcing functions, because it is actually what happened. The models are used to “hindcast” the past and are continually improved so that they more accurately mimic reality. The models do a fairly good job of this. They are then trained on the future. Of course the models are simply awesome and the scientists who do this work are exceptional. If the models have one big flaw, it is that they are too optimistic: they do not hindcast tipping points or non-linear events very well. The salient point is simply this: deniers can attack the models all they want, but they cannot change the reality of physical laws one iota by doing so.
We have to get past questioning the IPCC reports. The science is rigorous and beyond reproach. We need to focus our skepticism on the denier arguments. If any of them has legs, if there is any denier signal at all, we will never find it amid all the nonsense; we have to wipe the board clean of the stupidity. If there is any merit to any denier argument, then that argument will survive the purge.
But here is the thing. Let’s pretend we did take the consensus hypothesis seriously and did cut back on our carbon emissions. What would be the downside? I don’t think there is any credible evidence of one. The impact on the economy would certainly be less than that of the Iraq War, or of the Gramm-Leach-Bliley Act and the Commodity Futures Modernization Act, which led to the 2008 Great Contraction. What if we continued business as usual and failed to question laissez-faire free market economics? What is the downside of that? Here is the reality check. At no time in the paleoclimate record have we so far discovered a climate event which changed at even 10 percent of the velocity of the current human-caused climate change, excepting asteroid collisions (a recent paper suggests the PETM climate change may have rivaled the Anthropocene). Even the transition from the last glacial maximum to the Holocene optimum had a velocity of about 1 percent of the current rate of change. Every high-velocity event (those within 10 percent of the current velocity) resulted in an extinction event. What do we think the probability of a human-caused extinction event is if we are changing the climate 10 times faster than the fastest events caused by nature, while adding additional stresses to the biosphere such as overfishing and over-fertilizing? What do we think Homo sapiens’ chances are of surviving the event we are causing? We have to get real. We need to focus our skepticism on the noise we take for granted, which needs much more serious scrutiny. I would suggest we start with laissez-faire free market capitalism: an ideology based on a flawed model of human behavior which also happens to ignore the laws of physics. What better place to start?
A few people are aware of the feud between the behavioral economists (including the neo-Keynesians) on the one hand and the Chicagoans on the other, since both Daniel Kahneman (co-founder of the behavioral school) and Eugene Fama (inventor of the efficient-market hypothesis) have won Nobels. In brief, the Chicagoans ignore the reality of irrational human behavior, which is itself irrational. Hardly anybody is aware of the gap between the reality-based biophysical school of economics and the belief-based Chicagoans, and that is a shame. As Armstrong says, “I actually try not to learn a lot about climate change.” Reality interferes with his forecasting.
One of the topics Silver gets right, in my view, is his discussion of Big Data. This may be my next topic. I will be citing Nate Silver, but in a good way. He and I do agree on some things. :-)
Figure 1. Land, atmosphere, and ice heating (red), 0–700 meter OHC increase (light blue), 700–2,000 meter OHC increase (dark blue). From Nuccitelli (2012).
Figure 2 from 
Figure 3. Revised global temperature using satellite data. The corrected data (bold lines) are shown in the graph compared to the uncorrected ones (thin lines). The temperatures of the last three years have become a little warmer, the year 1998 a little cooler.
Figure 4. In this handout from the Malacanang Photo Bureau, an aerial view of buildings destroyed in the aftermath of Typhoon Haiyan on November 10, 2013 over the Leyte province. (Getty/Malacanang Photo Bureau)
Figure 5. Annual average GCR counts per minute (blue – note that numbers decrease going up the left vertical axis, because lower GCRs should mean higher temperatures) from the Neutron Monitor Database vs. annual average global surface temperature (red, right vertical axis) from NOAA NCDC, both with second-order polynomial fits. http://www.skepticalscience.com/cosmic-rays-cosmically-behind-humans-explaining-global-warming.html
Figure 6. Solar irradiance in the era of accurate satellite data. Left scale is the energy passing through an area perpendicular to Sun-Earth line. Averaged over Earth’s surface the absorbed solar energy is ~240 W/m2, so the amplitude of solar variability is a forcing of ~0.25 W/m2. (Credit: NASA/GISS, my source James Hansen)
Martin Hilbert, Big Data for Development; pre-published version, Jan. 2013.
 Daron Acemoglu, Asuman Ozdaglar, and Ali ParandehGheibi, Spread of Misinformation in Social Networks, arXiv:0906.5007v1 [cs.IT] 26 Jun 2009.
Stephan Lewandowsky, Ullrich K. H. Ecker, Colleen M. Seifert, Norbert Schwarz, and John Cook, Misinformation and Its Correction: Continued Influence and Successful Debiasing, Association for Psychological Sciences, DOI: 10.1177/1529100612451018.
 Kahan, Dan M. and Peters, Ellen and Dawson, Erica Cantrell and Slovic, Paul, Motivated Numeracy and Enlightened Self-Government (September 3, 2013). Yale Law School, Public Law Working Paper No. 307. Available at SSRN: http://ssrn.com/abstract=2319992 or http://dx.doi.org/10.2139/ssrn.2319992
 Sharon Bertsch McGrayne, The Theory That Would Not Die: How Bayes’ Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy, Yale University Press, 2012.
Nate Silver, The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t, The Penguin Press, 2012.
 Fazlollah M. Reza, An Introduction to Information Theory, McGraw-Hill, 1961.
 David MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003. http://www.inference.phy.cam.ac.uk/mackay/itila/
 MacKay’s lecture series on inference http://videolectures.net/mackay_course_01/
 David MacKay, sustainable energy – without the hot air, UIT Cambridge, 2009.
 Tony Noerpel, Global Warming Discovered – Again, April 25, 2012, http://brleader.com/?p=8276 see also Rohde R, Muller RA, Jacobsen R, Muller E, Perlmutter S, et al. (2012) A New Estimate of the Average Earth Surface Land Temperature Spanning 1753 to 2011. Geoinfor Geostat: An Overview 1:1.
 Tony Noerpel, Arctic Amplification, April 17, 2012, http://brleader.com/?p=8136
 Kevin Cowtan and Robert G. Way, Coverage bias in the HadCRUT4 temperature series and its impact on recent temperature trends, Quarterly Journal of the Royal Meteorological Society, 2013.
Most economists who did predict the Great Contraction, such as Nouriel Roubini and Steve Keen, are selling their newsletters and investment advice at a considerable premium. Others like Dean Baker and Robert Shiller may be doing the same for all I know, but at least there is considerable information on their exploits all over the web, because the economist who got it right is such a rare beast. I could find no reference to Armstrong having gotten this right. What I could find were references to the fact that nobody at the Wharton school “forecast” the Great Contraction, or simply the credit crisis. http://www.forbes.com/2009/11/18/behavorial-economics-indicators-entrepreneurs-finance-wharton.html and http://www.ifw-members.ifw-kiel.de/publications/the-financial-crisis-and-the-systemic-failure-of-academic-economics/KWP_1489_ColanderetalFinancial%20Crisis.pdf
 Michael Mann, The Hockey Stick and the Climate Wars, dispatches from the front lines, Columbia, 2012.
 Tony Noerpel, The Con in Economics, http://brleader.com/?p=10734 March 5, 2013
 Paul Samuelson, Theory and Realism: A Reply, American Economic Review, vol. 54, September 1964, pp736-9.
 Yuko Heath and Robert Gifford, Free-Market Ideology and Environmental Degradation: The Case of Belief in Global Climate Change, DOI: 10.1177/0013916505277998, Environment and Behavior 2006 38: 48
 Laplace, the first scientist to apply Bayes analysis, famously remarked when asked by Napoleon why he left out mention of the creator in his book on celestial mechanics: “I had no need of that hypothesis.”
I’d say the cosmic ray hypothesis is dead in the water. Here are some recent scientific papers which discuss it.
Rasmus E Benestad, Are there persistent physical atmospheric responses to galactic cosmic rays?, Environ. Res. Lett. 8 (2013) 035049 (7pp) doi:10.1088/1748-9326/8/3/035049
T. Sloan, and A.W. Wolfendale, “Cosmic rays, solar activity and the climate”, Environmental Research Letters, vol. 8, pp. 045022, 2013. http://dx.doi.org/10.1088/1748-9326/8/4/045022
J. Krissansen-Totton, and R. Davies, “Investigation of cosmic ray-cloud connections using MISR”, Geophysical Research Letters, vol. 40, pp. 5240-5245, 2013. http://dx.doi.org/10.1002/grl.50996
J. Almeida, et al., “Molecular understanding of sulphuric acid–amine particle nucleation in the atmosphere”, Nature, vol. 502, pp. 359-363, 2013. http://dx.doi.org/10.1038/nature12663
G.J. van Oldenborgh, A.T.J. de Laat, J. Luterbacher, W.J. Ingram, and T.J. Osborn, “Claim of solar influence is on thin ice: are 11-year cycle solar minima associated with severe winters in Europe?”, Environmental Research Letters, vol. 8, pp. 024014, 2013. http://dx.doi.org/10.1088/1748-9326/8/2/024014
 NASA/GISS, my source James Hansen
 extreme value theory http://en.wikipedia.org/wiki/Extreme_value_theory
H. Abarbanel, S. Koonin, H. Levine, G. MacDonald, O. Rothaus, Statistics of Extreme Events with Application to Climate, Jason Study Report, JSR-90-30S, MITRE Corporation, January 1992. http://www.fas.org/irp/agency/dod/jason/statistics.pdf
 James Hansen, Makiko Sato, Gary Russell and Pushker Kharecha, Climate sensitivity, sea level and atmospheric carbon dioxide, Phil. Trans. R. Soc. A 2013 371, 20120294, published 16 September 2013.