25 July 2014

Sack the Science Advisor

So you don't like the advice your expert advisor is giving? How about sacking the advisor? Better yet, how about getting rid of the entire advisory mechanism?

That is the advice that Greenpeace and other NGOs are giving to European Commission president-elect Jean-Claude Juncker. If it sounds a bit like something out of the Richard Nixon playbook, well, it is.

I defend the EU science advisor and the broader structure over at the Guardian. Read it here.

For a deeper dive into science advice in government, see this volume by James Wilsdon and Rob Doubleday. My chapter in that volume on the role of the science advisor can also be found here in PDF.

22 July 2014

A New Paper on Disaster Losses and Climate Change

A new paper appeared in Climatic Change this week by Visser et al. which looks at disasters and climate change (open access here). Like other studies and the IPCC assessment, Visser et al. find no trends in normalized disaster losses, looking at several metrics of economic and human losses.

They conclude:
The absence of trends in normalized disaster burden indicators appears to be largely consistent with the absence of trends in extreme weather events. This conclusion is more qualitative for the number of people killed. As a consequence, vulnerability is also largely stable over the period of analysis.
The top line conclusion here is not surprising, though it is interesting because it uses independent methods on largely independent data. It is consistent with previous data and analyses (e.g., Bouwer 2011, Neumayer and Barthel 2011, Mohleji and Pielke 2014) as well as with the conclusions of the recent IPCC assessments (SREX and AR5).

What is perhaps most interesting about this new paper is its discussion of vulnerability. Some have argued that our methodological inability to fully account for possible changes in vulnerability to losses over time may mask a climate change signal in the data. (It's gotta be there somewhere!) This line of argument has always been suspect, because there are no relevant trends in phenomena such as floods and hurricanes that would lead to an expectation of increasing normalized losses.

Visser et al. take this issue on and offer several explanations as to why vulnerability does not mask any hidden signals:
Firstly, global disaster management initiatives have only recently been put in place. The Hyogo Framework for Action (HFA) was adopted by 168 Member States of the United Nations in 2005 to take action to reduce vulnerabilities and risks to disasters (UNISDR, 2011). Although these highly important efforts will certainly pay off in the near future, it is unclear whether they are reflected in the sample period chosen for this study. Similar conclusions are drawn in IPCC (2014). . .

Secondly, it is unclear to what extent adaptation measures work in practice. Heffernan (2012) argues that many countries, and even the richest, are ill-prepared for weather extremes. As an example, he names Hurricane Sandy, which wreaked a loss of 50 billion USD along the northeast coast of the US in 2012. As for early warning systems, Heffernan states that not all systems are functioning well. For example, in 2000, Mozambique was hit by a flood worse than any in its history, and the event was not at all anticipated. Warnings of above-average rainfall came too late and failed to convey the magnitude of the coming flood.

Thirdly, a positive trend in vulnerability may be offset by the increasing number of people moving from rural to urban environments, often situated in at-risk areas (UN 2012). Since many large cities lie along coastlines, these movements will make people more vulnerable to land-falling hurricanes (Pielke et al. 2008), coastal flooding and heatwaves (due the urban heat island effect). With regard to economic losses, Hallegatte (2011) argues that these migration movements may have caused disaster losses to grow faster than wealth.

Fourthly, it is unclear how political tensions and violent conflicts have evolved over large regional scales since 1980. On the one hand, Theisen et al. (2013) show that the number of armed conflicts and the number of battle deaths have decreased slightly at the global scale since 1980. On the other hand, these methods are rather crude as far as covering all aspects of political tensions are concerned (Leaning and Guha-Sapir et al. 2013).

We conclude that quantitative information on time-varying vulnerability patterns is lacking. More qualitatively, we judge that a stable vulnerability V_t, as derived in this study, is not in contrast with estimates in the literature.
In short, those who claim that a signal of human-caused climate change is somehow hidden in the disaster loss record are engaging in a bit of unjustified wishful thinking. The data and evidence say otherwise.

The bottom line? Once again, we see further reinforcement for the conclusion that there is no detectable evidence of a role for human-caused climate change in increasing disaster losses. In plain English: Disaster losses have been increasing, but it is not due to climate change.

21 July 2014

New Gig

If you are interested in my writings on sports-related matters, you can now find me over at Sporting Intelligence, run by the brilliant Nick Harris.

I have recently completed an evaluation of World Cup predictions and just today have a piece up on doping in sport.

Comments and suggestions always welcomed.

17 July 2014

Guest Post: Kerry Emanuel Clarifies a Recent Quote in the NYT

The comment below is by Kerry Emanuel, at MIT, who is clarifying a recent quote of his in the New York Times.
I would like first to thank Roger for allowing me to post this response to the article about John Christy by Michael Wines in Tuesday's New York Times. Although I was quoted accurately, the context in which the quotation was phrased distorted its intended meaning.

Several weeks ago, I had several phone conversations with Mr. Wines about the work of John Christy. In those conversations, I emphasized the value of skepticism in science and also said that I agreed with some elements of John's point of view, in particular, that projections are still highly uncertain, that climate models leave a great deal to be desired, and that some of the decisions that have to be made about how to deal with climate change are very tough indeed. Wines asked me to explain where I differ from John. I told him that we differ primarily in our assessment of the magnitude of climate tail risk. Wines asked me to explain what I meant by "tail risk", and I offered the metaphor of advising a small girl whether she should cross a busy street to catch her bus (a metaphor I have used before).

Unfortunately, the positioning of the quotation within the article makes it seem as though I am suggesting that John is the kind of person who would let the girl take the risk. I state here that I have absolutely no reason to question John's motives; indeed, he strikes me as the sort of person who would risk his own life to save a child who wandered into a busy street. My metaphor was intended only to illustrate the nature of tail risk.

16 July 2014

Updated: Global Weather Disasters and Global GDP

Munich Re has just released their tabulation of disaster losses for the first half of 2014. I thought I'd use the occasion to update the graph above, which shows global weather disasters as a proportion of global GDP. Note that 2014 represents January-June. I assume that first-half 2014 global GDP is 2.5% higher than 2013. I also assume that total 2014 losses to date are all due to weather. Both assumptions err on the conservative side of things. Enjoy!

Data: Munich Re and UN

15 July 2014

Updated Normalized Disaster Losses in Australia: 1966-2013

The graph above is the most recent update of the normalized disaster loss database for Australia, sent to me by Ryan Crompton of Risk Frontiers at Macquarie University. Ryan also sends this explanatory text:
Crompton and McAneney (2008) normalised Australian weather-related insured losses over the period 1967-2006 to 2006 values. Their methodology adjusted for changes in dwelling numbers and values (excluding land value) and in a marked point of departure from previous normalisation studies, they applied an additional adjustment for tropical cyclone losses to account for improvements in construction standards mandated for new construction in tropical cyclone-prone parts of the country. These were introduced around the early 1980s following lessons learnt from the destruction of Darwin by Tropical Cyclone Tracy in 1974 (Mason et al. 2013).

Crompton and McAneney (2008) emphasise the success of improved building standards in reducing building vulnerability and thus tropical cyclone wind-induced losses. Figures 1a and b show the annual aggregate losses and the annual aggregate normalised losses (2011/12 values) for weather-related disasters. These figures are updated from Crompton and McAneney (2008) using a refined methodology described in Crompton (2011).

Crompton, R. P., 2011. Normalising the Insurance Council of Australia Natural Disaster Event List: 1967-2011. Report prepared for the Insurance Council of Australia, Risk Frontiers.(PDF)

Crompton, R. P., and K. J. McAneney, 2008. Normalised Australian insured losses from meteorological hazards: 1967-2006. Environ. Sci. Policy 11: 371-378. (PDF)

Mason, M., K. Haynes, and G. Walker, 2013. Cyclone Tracy and the road to improving wind resistant design. In Boulter, S., J. Palutikof, D. J. Karoly, and D. Guitart (eds.), Natural disasters and adaptation to climate change. Cambridge University Press.
Note that the 2012 and 2013 values (shown in yellow in the bottom graph) have not been normalized back to 2011/12 values. They are shown as reported. Once normalized they will be a bit lower, so as presented they overestimate the 2012 and 2013 losses, though not by a large amount.

10 July 2014

Common Ground on Climate

The warring tribes in the climate wars appear to have found something they can agree on. Unfortunately, that agreement is that the Kaya Identity is stupid.

At WattsUpWithThat they take issue with a new UN report (here in PDF) which utilizes the Kaya Identity:
[T]heir goofy equation is known as the “Kaya Identity”. Apparently, the number of innumerate people on the planet is larger than I had feared.
And of course recently Paul Krugman also took issue with the Kaya Identity while putting words into my mouth:
This is actually kind of wonderful, in a bang-your-head-on-the-table sort of way. Pielke isn’t claiming that it’s hard in practice to limit emissions without halting economic growth, he’s arguing that it’s logically impossible. So let’s talk about why this is stupid.
On Twitter @FabiusMaximus01 sums it up perfectly:
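For anyone encountering it for the first time: the Kaya Identity is nothing more than an accounting decomposition. Total emissions equal population, times GDP per person, times energy per unit of GDP, times emissions per unit of energy; the intermediate terms cancel, which is what makes it an identity rather than a model. A minimal sketch, using made-up illustrative numbers (none of these figures come from the UN report or from real data):

```python
# Kaya Identity: CO2 = P * (GDP/P) * (E/GDP) * (CO2/E)
# All input values are illustrative placeholders, not real-world data.

population = 7.0e9          # people
gdp_per_capita = 1.0e4      # dollars per person
energy_intensity = 1.5e-4   # tons of oil equivalent per dollar of GDP
carbon_intensity = 2.5      # tons CO2 per ton of oil equivalent

# Build up the intermediate quantities step by step...
gdp = population * gdp_per_capita          # total GDP (dollars)
energy = gdp * energy_intensity            # total energy consumption (toe)
emissions = energy * carbon_intensity      # total emissions (tons CO2)

# ...and confirm the four-factor product gives the same answer,
# because the intermediate terms telescope away.
product = population * gdp_per_capita * energy_intensity * carbon_intensity
print(emissions == product)  # True
```

There is nothing "goofy" here: the identity simply forces any emissions target to be expressed in terms of the four factors you can actually influence.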

07 July 2014

The Decoupling of Food and Land

The graphs above come from this post at The Breakthrough Institute by Jon Fisher of The Nature Conservancy. The graphs show something profound in global agriculture: the world is producing more food per person on less land. While there are caveats and details that are important, overall these twin trends are good news.  Do head over to @TheBTI and read Fisher's excellent discussion.

18 June 2014

Increasing Carbon Intensity of Global Energy Consumption

I have been continuing to look at the BP Statistical Review of World Energy 2014, which was released earlier this week. It is a wonderful resource, kudos to BP.

The graph above shows the carbon intensity of global energy consumption from 1965 to 2013. Specifically, it shows the amount of carbon emissions (in tons) for every "ton of oil equivalent" consumed in the global economy. Thus, the consumption data includes both carbon intensive sources of energy (coal, gas, oil) and also the less carbon intensive sources (hydro, wind, solar, nuclear, etc.).

The graph shows that global energy consumption decarbonized at a remarkably steady rate from 1965 to the late 1990s. Since then, global energy consumption has become slightly more carbon intensive. In 2013 the carbon intensity of global energy consumption was just about the same as it was in 1991. Since 1999, this metric of carbon intensity has increased by 1.5%. The graph indicates that in the 21st century, whatever gains low carbon energy technologies are making continue to be equaled or even outpaced by continuing gains in fossil fuels.

To place this analysis in perspective: Cutting global carbon dioxide emissions by 50% (just to pick a round number) while increasing global energy consumption by 50% (another round number) means carbon intensity must fall to one-third of its current value, or about 0.25 tons carbon per ton of oil equivalent.
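The arithmetic behind that figure is worth making explicit. Taking today's carbon intensity to be roughly 0.75 tons carbon per ton of oil equivalent (an approximate value read off the graph, not a number stated in the text), halving emissions while energy consumption grows by half gives:

```python
# Back-of-envelope check of the carbon intensity arithmetic.
# The current-intensity value (~0.75 tC/toe) is an assumption read
# approximately off the graph, not a figure quoted in the post.

current_intensity = 0.75    # tons carbon per ton of oil equivalent
emissions_factor = 0.5      # emissions cut by 50%
consumption_factor = 1.5    # energy consumption up by 50%

# New intensity = (new emissions) / (new consumption)
required_intensity = current_intensity * emissions_factor / consumption_factor
print(round(required_intensity, 2))  # 0.25
```

The ratio 0.5/1.5 = 1/3 holds regardless of the starting intensity; only the absolute 0.25 figure depends on the assumed current value.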

For those wanting to explore a little deeper into why this analysis matters for how we might think about climate policies, have a look at this paper in PDF.

16 June 2014

Treading Water

The graph above shows data from the BP Statistical Review of World Energy 2014, which was released today. It shows the proportion of global energy consumption that comes from carbon-free sources.

The proportion of carbon-free energy consumption is a far more important metric of progress with respect to the challenge of stabilizing carbon dioxide levels in the atmosphere than looking at carbon dioxide emissions. The reason for this is that emissions are a consequence of energy consumption, and the way that we influence emissions is through energy technologies and their use in the economy. So looking directly at energy consumption is a much more direct and relevant way to understand the technological challenge of emissions reductions. From a policy perspective, looking solely at emissions can easily deceive.

In 2013 the proportion of carbon-free energy consumption was just about 13%, continuing more than 20 years of essentially no trend in that measure. The measure did tick up from 2012 - from 13.1% to 13.3% - to just about equal to what it was in 1999.

To stabilize atmospheric concentrations of carbon dioxide requires that this proportion exceed 90%, independent of how much energy the world ultimately consumes. But don't take my word for it, do the math yourself. The timing of exceeding that 90% threshold will determine the atmospheric concentration level at which stabilization ultimately occurs.

If the increase in the carbon-free proportion from 2012 to 2013 (of 0.17 percentage points) is taken as a trend going forward, then the 90% threshold will be exceeded in the year 2465. Fortunately, linear projections of most anything related to future energy are wrong.
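The 2465 figure follows from straight-line extrapolation, which can be sketched in a few lines (the inputs are the numbers from this post; "0.17" is the unrounded 2012-to-2013 change):

```python
import math

# Linear extrapolation of the carbon-free share of global energy consumption.
share_2013 = 13.3   # percent of consumption that was carbon-free in 2013
target = 90.0       # percent threshold needed for stabilization
annual_gain = 0.17  # percentage points per year (the 2012-to-2013 increase)

years_needed = (target - share_2013) / annual_gain  # about 451 years
year_exceeded = 2013 + math.ceil(years_needed)
print(year_exceeded)  # 2465
```

The point of the exercise is not the precise year, of course, but the mismatch of scale between recent progress and the stabilization threshold.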

What you should take from this however is that there remains no evidence of an increase in the proportion of carbon-free energy consumption even remotely consistent with the challenge of atmospheric stabilization of carbon dioxide. Those who claim that the world has turned a corner, soon will, or that they know what steps will get us around that corner are dreamers or fools. We don't know. The sooner we accept that, the sooner we can design policies more compatible with policy learning and muddling through.