31 May 2011

Donald Boudreaux: I'll Take That Bet

Writing in the WSJ last week, economist Donald Boudreaux of George Mason University offered a bet in order to make a point about human-caused climate change, in response to Bill McKibben's silly essay from earlier in the week in the Washington Post:
I'll bet $10,000 that the average annual number of Americans killed by tornadoes, floods and hurricanes will fall over the next 20 years. Specifically, I'll bet that the average annual number of Americans killed by these violent weather events from 2011 through 2030 will be lower than it was from 1991 through 2010.
I am willing to take this bet in order to raise awareness of the fact that both sides of the debate over climate change can't see the forest for the trees.  The factors that will drive loss of human life due to weather extremes in coming decades will be increasing vulnerability and exposure.

As a condition of the bet, when I win (which unfortunately will occur long before 2030) I ask that the proceeds go directly to the American Red Cross. (Should I lose the bet come 2030, I'll make out a check to the charity of Prof. Boudreaux's choice.)  A second condition is that Prof. Boudreaux agrees to write an op-ed for the WSJ (or some other venue) explaining the bet and why he lost (of course, I am willing to do the same).

Here are the technical terms of the bet that I will accept.  Prof. Boudreaux refers to three hazards: floods, tornadoes and hurricanes.  The dataset that he proposes using is the official record kept by the US National Weather Service (available here in PDF through 2009, and here in PDF for 2010).  This leads to the following summaries for the base period of 1991-2010 proposed by Prof. Boudreaux:


Year    Tornado   Flood   Hurricane
1991         39      61          19
1992         39      62          27
1993         33     103           2
1994         69      91           9
1995         30      80          17
1996         26     131          37
1997         67     118           1
1998        130     136           9
1999         94      68          19
2000         41      38           0
2001         40      48          24
2002         55      49          53
2003         54      86          14
2004         34      82          34
2005         38      43        1016
2006         67      76           0
2007         81      87           1
2008        126      82          12
2009         21      53           2
2010         45     103           0

Total      1129    1597        1296    (4,022 combined)

In his WSJ column Professor Boudreaux asks his readers to subtract the deaths from Hurricane Katrina because they were the result of a levee break.  It is not clear whether that was just a rhetorical move for the column or if that calculus also extends to the bet. He can clarify that for me in his response. With Katrina the total is 4,022 deaths and without Katrina the total is about 3,000 (again, Prof. Boudreaux can tell me which number he prefers).

So far 2011 has seen 518 deaths from tornadoes.  This means that from today through 2030 the United States could see only 3,500 additional extreme weather deaths, or 180 per year (using the higher baseline that includes Katrina deaths, or 154 per year using the lower number of 3,000).  Such numbers would represent an improvement over 1991-2010, and Prof. Boudreaux would still lose the bet.  We should be so lucky, and it would take a lot of luck, to see so few deaths due to extreme weather.
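For anyone who wants to check the arithmetic, here is a minimal sketch using the NWS totals from the table above; it assumes the bet is settled on the cumulative 20-year total, which is equivalent to comparing average annual deaths:

# Sketch of the bet arithmetic, using the NWS fatality totals from the table above.
baseline_total = 1129 + 1597 + 1296     # tornado + flood + hurricane deaths, 1991-2010
baseline_avg = baseline_total / 20      # average annual deaths over the baseline period

# Prof. Boudreaux loses if the 2011-2030 average is not lower than the baseline average,
# which is the same as the 2011-2030 total reaching the baseline total.
deaths_2011_so_far = 518                # tornado deaths through the end of May 2011
remaining_to_lose = baseline_total - deaths_2011_so_far

print(f"Baseline average (with Katrina): {baseline_avg:.0f} deaths per year")
print(f"Additional deaths through 2030 that would settle the bet: {remaining_to_lose}")

# Excluding Katrina's roughly 1,016 hurricane deaths gives the lower baseline of ~3,000.
print(f"Baseline total without Katrina: {baseline_total - 1016}")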

The fact of the matter is that our vulnerability to extreme weather is increasing, due to a combination of a growing population and especially urbanization in locations prone to extreme weather events.  This means that even with the hard work by many professionals in a range of fields that has contributed to the dramatic decrease in the number of deaths over recent decades low death totals are unlikely to continue into the future, as this year's tragic tornado season tells us.  Of course, given expected societal trends a reversal in statistics would not necessarily mean that our disaster policies are failing.  What it means is that our responses to extreme weather require constant vigilance, investment and continued hard work.

In trying to score points in the debate over global warming, Professor Boudreaux misses what really matters most on this issue.  And that is why my response to his question, "Do I have any takers?" is "Yes."

I will email Prof. Boudreaux with this post and update with his response.

Continued Deceleration of the Decarbonization of the Global Economy

Last summer I noted a distinct trend since 1990 of a deceleration of the decarbonization of the global economy.  What does this mean in plain English?  It means that the trend of emitting less carbon per unit of economic activity -- in place for much of the 20th century -- was slowing down.  This slowdown was occurring despite intentions expressed in policies around the world to accelerate the trend, despite assumptions in virtually every major integrated assessment model that the trend would begin to accelerate even in the absence of such policies, and despite the rapid deployment of renewable energy around the world.

New data has just been released which allows me to update the figures that I presented last summer through 2010; the update can be seen in the graph above. The data show that in 2010 the rate of decarbonization -- the annual decline in carbon dioxide emissions per unit of economic activity -- continued to shrink, falling all the way to zero.  (The data that I use are global GDP data from Angus Maddison, extended using IMF global GDP growth rates, and NEAA carbon dioxide data, extended to 2010 using the 2010 growth rate released by the IEA yesterday.)
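For readers who want to reproduce the basic calculation, here is a minimal sketch of how carbon intensity and its rate of change are computed; the numbers are illustrative placeholders, not the actual Maddison/IMF/NEAA/IEA series:

# Minimal sketch of the decarbonization calculation: carbon intensity is CO2 emissions
# divided by GDP, and decarbonization is the year-on-year decline in that ratio.
# The numbers below are illustrative placeholders, not the Maddison/IMF/NEAA/IEA series.
gdp = {2008: 100.0, 2009: 99.0, 2010: 104.0}     # index of global GDP
co2 = {2008: 30.0, 2009: 29.6, 2010: 31.1}       # global CO2 emissions, Gt

intensity = {year: co2[year] / gdp[year] for year in gdp}

for year in (2009, 2010):
    change = (intensity[year] / intensity[year - 1] - 1) * 100
    print(f"{year}: carbon intensity changed by {change:+.2f}% (negative = decarbonization)")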

The deceleration of the decarbonization of the global economy means that the world is moving away from stabilization of concentrations of carbon dioxide in the atmosphere, and despite the various reports issued and assertions made, there is no evidence to support claims to the contrary.  For more on why this is so, I recommend this book.

27 May 2011

Dan Sarewitz on Senator Coburn's New Report

[EDITOR'S NOTE: This is a guest post by Dan Sarewitz, professor at ASU and co-director of the Consortium for Science, Policy and Outcomes (and pictured below). To his surprise, Dan found his work cited approvingly in a new report on the National Science Foundation just released by Senator Tom Coburn (R-OK). This post has Dan's reaction.  David Bruggeman has more here.]

Senator Tom Coburn (R-OK), who has a reputation as a straight-shooting, no-nonsense conservative, is also, it appears, a supporter of national industrial policy, something that conservatives typically hate. He has just issued a report that purports to be a hard-hitting critique of the National Science Foundation, but it’s mostly just an attack on government funding of social science research, thus continuing a conservative tradition that dates back to the debates over the initial creation of NSF in the late 1940s.

I’ll get to the social science stuff in a minute, but for now let’s focus on the fact that Senator Coburn prominently—and apparently approvingly—quotes, um, ME! (Please see page 12 of his report.) In a recent Nature column (among lots of other places) I argue that US civilian R&D agencies are not appropriately structured to catalyze technological innovation or progress rapidly toward desired societal outcomes, and that this institutional weakness remains significantly camouflaged by the legacy of DOD and the military-industrial-university complex, which powered technological innovation and economic growth in the decades following World War II.

I’m pleased that Senator Coburn finds this critique to be compelling, and can only infer, then, that he would agree that what’s needed is a much more coherent and strategic approach for linking knowledge creation to knowledge use and problem solving—a strategy that, in the olden days, might have been called “industrial policy” and that we might now term “innovation system policy.” It’s only slightly ironic, I guess, that the (still admittedly limited) understanding we have of how innovation systems work—the basis of my critique that he so flatteringly cites—is, well, rooted in the social sciences that he wants to de-fund.

(On this latter point, in part the Senator’s report is just another example of Republicans using the banner of fiscal responsibility to attack programs that they happen not to like but whose elimination can have no conceivable impact on fiscal responsibility. The entire social and behavioral science budget at NSF ($252 million) amounts to all of 3.6% of the total NSF budget, 0.3% of the civilian R&D budget, and 0.006% of the federal budget. Attacking social science is good conservative politics, but it has nothing to do with serious budget policy.)

Moreover, much of Senator Coburn’s report details the sorts of random, petty abuses that are simply unavoidable in any complex bureaucracy like NSF (my goodness, an NSF employee was caught watching lots of porn! And another one scheduled a work trip so he could visit his girlfriend!). Yet the report does touch on a problematic aspect of civilian science policy that has managed to escape serious political scrutiny for 60 years, even though it is fundamentally incoherent. Specifically, Senator Coburn is concerned that NSF’s research is insufficiently “transformative.” He cites survey work (more social science!!) showing that most NSF peer reviewers believe that only a small percentage of the proposals they review are “transformative.” He then goes on to list fifty or so examples of funded projects (“Are people more or less racially-focused when seeking love on-line in the Obama era?”) whose potential “transformativeness” he questions. This approach follows the tradition of Senator William Proxmire’s Golden Fleece Awards of the 1970s and 1980s, and it was probably fun to do. But could it be a coincidence that all of the projects he singles out have titles that a lay person can understand, and that many of them are social science projects? Or have Senator Coburn and his staff determined that all of the work that NSF funds in subatomic particle physics, deep mantle geochemistry, and molecular genetics is genuinely “transformative”?

The political rhetoric of basic academic science for the past 50 years has been basically this: leave us alone to follow our curiosity wherever it may lead, and the payback to society will be enormous—we’ll cure cancer, create the next industrial revolutions, clean up the environment, and everyone will get wealthier in the process. And if you try to tell us what to do you’ll only screw things up.

But as budgets have increased over the decades, as science and politics have increasingly come into conflict, and as the promised benefits of science have often proven elusive indeed, the question of how this path from knowledge to benefit really works becomes hard to avoid. Science advocates up the ante by promising more “transformative” research, but of course science is only one element of a complex set of factors that lead to progress on difficult social problems. When frustration sets in, what’s a politician to do?

I’m a big fan of (and have been generously supported by) NSF (including support for work that has informed my own critiques of the policy model that sustains NSF), but have long feared that the simpleminded elegance of the political rhetoric of academic basic research will someday turn out to be a source of serious political vulnerability for publicly supported science. Is Senator Coburn just being a heel, or has he discovered American science’s Achilles’ Heel?

26 May 2011

Capsule Reviews of Three New Studies of Innovation

Three new studies of the benefits of innovation policy have crossed my desk in the past few days.  In general I have problems with such studies, or more accurately with how they are used, because they often reduce innovation to a pipeline metaphor.  The pipeline has federal dollars inserted at one end and -- after some magic occurs inside the pipeline -- from the other emerge the fruits of innovation, such as computer technologies, medicine, green energy and so on.

To understand innovation requires getting a handle on the processes that translate investments into desired outcomes, which inevitably is much more complicated than any pipeline. 

The first study is from McKinsey (PDF) and is on the role of the internet in the global economy.  The report has some fascinating data on the role of the internet in 13 different countries around the world, but its most important contribution is its assessment of those factors that contribute to economic growth via the internet.  Among these are human capital, infrastructure, a favorable business environment and financial capital.  Governments have a role to play in each of these areas.

The second study comes from a group called United for Medical Research (PDF) and is focused on evaluating the economic consequences of government spending through the National Institutes of Health.  This study is much more a "pipeline" study with a focus on funding to NIH and various correlates of economic consequences with only a bit of hand-waving at mechanisms of innovation.

The third report comes from Battelle (PDF) and is focused on the economic and functional consequences of the Human Genome Project.  While the report has much valuable information about technologies, processes and outcomes related to the HGP, it is thin (remarkably so) on the public and private context which enabled such outcomes, lending itself to a "pipeline" interpretation.

It seems to me that the McKinsey report is far more focused on providing information that might be useful in thinking about innovation policies, whereas the UMR and Battelle studies seem aimed at providing simple justifications for more federal funding for R&D.  Innovation policies, however, are far more than money spent on R&D.

25 May 2011

Word Cloud of President Obama's Speech to UK Parliament

Full text here.
The Guardian does a similar analysis and compares the text to speeches to the UK Parliament by Presidents Clinton (1995) and Reagan (1982):
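For anyone who wants to roll their own, the counting behind a cloud like this is just word frequencies minus common stopwords. Here is a quick sketch; the file name is a placeholder for any plain-text copy of the speech, and the stopword list is abbreviated:

# A quick sketch of the word-frequency counting behind a word cloud, applied to a
# plain-text transcript of the speech (the file name and stopword list are placeholders).
import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "to", "a", "in", "that", "we", "our", "is", "for", "it"}

with open("obama_uk_parliament_speech.txt") as f:
    words = re.findall(r"[a-z']+", f.read().lower())

counts = Counter(word for word in words if word not in STOPWORDS)
for word, n in counts.most_common(20):
    print(f"{word:15s} {n}")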

Our Ideological Lenses

According to an opinion poll taken last week, 57% of the French public, and 70% of those who identify as supporters of the Socialist party, believe that Dominique Strauss-Kahn's arrest in New York was the result of a conspiracy to set him up.

It is information like this that leads me to look at data on public understanding of science -- with all of its revelations about how little the public knows compared to the relevant experts -- as a glass half full not half empty.  In fact, it is somewhat remarkable that most of the public often (but not always) gets most issues mostly right.

(H/T The Monkey Cage)

24 May 2011

In the Books

Congrats to Paige St. John, Sarasota Herald-Tribune

Great journalism can still be found ... Congrats to Paige St. John of the Sarasota Herald-Tribune for receiving the 2011 Pulitzer Prize for Investigative Reporting for her series on insurance and reinsurance in Florida.  The series can be viewed here.

When the series first came out I blogged on it here and here and here and here.

Treatment of Bushfires by the Australian Climate Commission

EDITOR'S NOTE: This is a guest post by Ryan Crompton (pictured below) of Macquarie University's Risk Frontiers.

If you are following events here in Australia, you may be aware that the Climate Commission released its first report yesterday. Entitled The Critical Decade, it aims to “provide up-to-date information on the science of climate change and the implications of this knowledge for societal responses, both for mitigation strategies and for the analysis of and responses to risks that climate change poses for Australia.”

Roger’s blog post yesterday on Australian Bushfire Damage and Climate Change is a bit of a giveaway as to where my main interest lies in the report, the section on ‘Extreme events’. In this post, I will focus only on the sub-section dedicated to ‘Bushfire intensity and frequency’.

My main issue is the report’s use of a key reference, the study by Cai et al. (2009c, full citation below) entitled “Positive Indian Ocean dipole events precondition southeast Australia bushfires”, to support the statement that
“the intensity and seasonality of large bushfires in southeast Australia appears to be changing, with climate change a possible contributing factor”
While I have no issue with the Cai et al. study itself (we cited this in our recent bushfire paper), at best, the use of it in the Commission’s report is clumsy, and at worst, misleading.

The first thing that has been lost in translation (some may argue I am splitting hairs here) is that while the Cai et al. paper has the words ‘southeast Australia bushfires’ in its title, it is actually focused exclusively on Victorian bushfires. Thus, it does not fully support the statement above, nor does it discuss the 2003 Canberra bushfires.

More important is that the definition used in Cai et al. to categorise historical Victorian bushfire seasons (1950-2009) as ‘significant’ incorporates historical impacts (fatalities, property, livestock losses) rather than meteorological variables. If the purpose of the bushfire sub-section of the Commission’s report is to discuss the bushfire hazard, which I believe to be the case based on my reading of other Extreme events sub-sections, then it makes no sense to refer to a paper that incorporates impacts.

In other words, it is nonsensical to report on possible changes in bushfire hazard intensity using a measure of bushfire intensity that incorporates impacts.

If instead the statement in the report was referring to impacts, then why was the conclusion from our research not cited here – that there is no discernable evidence that the normalized Australian bushfire building damage (1925-2009) is being influenced by climate change due to the emission of greenhouse gases (PDF)?

While it may not materially change the results of Cai et al., it is also worth mentioning that the impacts used to classify historically significant Victorian summer bushfire seasons were not normalized.

Lastly, the Cai et al. conclusions relating to climate change focus on the possibility of bushfire risk increasing in the future and do not relate to the influence climate change has had to date.

In sum, as far as the bushfire sub-section of the Climate Commission’s report is concerned, it seems that both accuracy and clarity have been sacrificed for economy. And that, unfortunately, will always do far more harm than good.

References

Cai, W., T. Cowan, and M. Raupach, 2009c: Positive Indian Ocean dipole events precondition southeast Australia bushfires. Geophys. Res. Lett., 36, L19710, doi:10.1029/2009GL039902.

Crompton, R. P., K. J. McAneney, K. Chen, R. A. Pielke Jr., and K. Haynes, 2010. Influence of Location, Population and Climate on Building Damage and Fatalities due to Australian Bushfire: 1925-2009. Weather, Climate, and Society, 2:300-310.

23 May 2011

Peer-Reviewed Exchange on Australian Bushfire Damage and Climate Change

Earlier this year, I was a co-author on a paper in the AMS journal Weather, Climate and Society on trends in normalized property damage from Australia’s bushfires (the lead author is Ryan Crompton, who has led some outstanding work of late).  In that paper we concluded that:
[T]here is no discernable evidence that the normalized data are being influenced by climate change due to the emission of greenhouse gases.
Neville Nicholls has authored a response, and then in return, we have written a rejoinder.  This peer-reviewed exchange has just been published in Weather, Climate and Society. This post provides a concise summary of the exchange, and you can read the exchange plus the original paper here.

In his reply, Nicholls argues that our . . .
 . . . normalization does not take into account several factors that may have led to a reduction in vulnerability over the period they examined.
His hypothesis is that our normalization has an unaccounted-for bias in it, which is a common initial reaction to normalization studies. We welcomed Nicholls’ commentary because it motivated us to perform some additional empirical work that ultimately added further evidence in support of our original paper.
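For readers unfamiliar with loss normalization, here is a rough sketch of the general idea -- re-expressing a historical loss in terms of current exposure -- with made-up numbers. It is an illustration of the concept, not the specific methodology of our paper:

# Rough illustration of the general idea behind loss normalization (not the specific
# Crompton et al. methodology): re-express a historical loss in terms of current
# exposure by adjusting for changes in the number and value of dwellings.
def normalize_loss(historical_loss, dwellings_then, dwellings_now,
                   value_per_dwelling_then, value_per_dwelling_now):
    """Return the historical loss scaled to current exposure."""
    dwelling_factor = dwellings_now / dwellings_then
    value_factor = value_per_dwelling_now / value_per_dwelling_then
    return historical_loss * dwelling_factor * value_factor

# Hypothetical example: a fire causing $200m of damage in a region where the dwelling
# stock has since doubled and real value per dwelling has risen by 50%.
print(normalize_loss(200e6, 10_000, 20_000, 1.0, 1.5))   # 600000000.0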

Nicholls’ first point is that our use of state-level data may mask an increasing trend of urbanization at the expense of rural development, which would have reduced vulnerability over time.  He provides uncited data to suggest that:
In 1958 about 45% of the population of the State of Victoria lived outside the capital city Melbourne. By 2008 this proportion had fallen to about 25%.  
Official Australian government data tell a different story, as we explain:
Our estimate of population distribution change is not as dramatic as that reported by Nicholls: according to the Australian Historical Population Statistics [available from the Australian Bureau of Statistics (ABS) online at http://www.abs.gov.au], the proportions of the population outside each of the capital cities of Victoria, New South Wales, and Tasmania in 1958 were 37%, 45%, and 68% and equivalent figures for 2007 (the latest year for which data were available) were 27%, 37%, and 58%. 

If we adopt the ABS classification on urban and rural dwellings (the ABS defines urban areas to be those with 1000 or more people), the change of rate is even less pronounced over a similar timeframe: the proportions of rural dwellings in Victoria, New South Wales, and Tasmania in 1961 were 16%, 15%, and 33%, only slightly decreasing to 11%, 11%, and 29% in 2006 (dwelling data are contained in the census of population and housing and are available from the ABS). Note that these three states of southeast Australia account for over 90% of total normalized building damage.

Table 1 and Table 2 in our original paper provide evidence that the rate of growth of bushfire-prone structures in the regions affected by the 1983 Ash Wednesday and 2009 Black Saturday fires actually exceeded state-level growth.  To the extent that this is representative of a broader trend, this would make our normalization conservative.
The figure from the top of this post can be found in our reply and has this description: "Aerial view of northern Sydney showing the highly dissected and complex interface (red line) between bushland (dark green) and urban areas."

A second point that Nicholls makes is to speculate that policy and homeowner actions might have led to a nationwide reduction in structural vulnerability to fire. We reply that there is no need to speculate, as the peer-reviewed literature suggests a different outcome -- specifically that vulnerability has in fact not decreased and the opposite may be the case (see our citations of Buxton et al. (2011), Chen and McAneney (2004) and Crompton et al. (2010)).

Nicholls’ third point is that improved emergency preparedness may have led to a reduction in vulnerability.  Again, the evidence tells a different story.  For instance (and see our response for further discussion and examples):
As demonstrated in 2003 in Canberra and again in 2009 in Bendigo, Horsham, and Narre Warren (Whittaker et al. 2009), many whose homes were destroyed were unaware that they were at any risk from bushfires.
Nicholls’ final point is that improved fire weather forecasts may have led to a reduction in vulnerability.  Again, we respond by looking at the evidence:
[T]here is little evidence from anywhere that weather forecasts materially influence property damage from extreme events, even if they do save lives. The weather conditions on Black Saturday were very well forecast and accurate warnings were issued to emergency responders, politicians, and the public prior to February 7. What Black Saturday clearly demonstrated is the reverse: that despite accurate weather forecasts and significant emergency/bushfire planning and response, there is always the potential for large-scale life and property loss.
We conclude our response with the following:
Our result—that there is no discernable evidence that normalized building damage is being influenced by climate change due to the emission of greenhouse gases— is not surprising, when you consider that bushfire damage is not solely a function of bushfire weather; far from it, in fact. Even given a gradual aggravation of bushfire weather due to anthropogenic climate change or other factors, a bushfire still has to be ignited. Once ignited, a bushfire then has to traverse the landscape and impact a populated area, where outcomes in terms of damage will be a function of the spatial disposition of dwellings with respect to the fire front, and especially distance of properties from the bushland boundary (McAneney et al. 2009). These factors all contribute a large degree of stochasticity to eventual event loss outcomes.

The Nicholls (2011) speculations are worthy of discussion but no evidence is presented to support these contentions. Moreover, the evidence that we are aware of and have presented here in relation to a potential bias in our normalization methodology and to the possible sources of reduced vulnerability does not undermine our findings in any way.
References:

Crompton, R. P., K. J. McAneney, K. Chen, R. A. Pielke Jr., and K. Haynes, 2010. Influence of Location, Population and Climate on Building Damage and Fatalities due to Australian Bushfire: 1925-2009. Weather, Climate, and Society, 2:300-310.

Nicholls, N., 2011. Comments on “Influence of Location, Population, and Climate on Building Damage and Fatalities due to Australian Bushfire: 1925–2009”. Weather, Climate, and Society, 3:61-62.

Crompton, R.P., K.J. McAneney, K. Chen, R.A. Pielke, Jr., and K. Haynes, 2011. Reply to the Nicholls (2011) comment on Crompton et al. (2010), “Influence of location, population, and climate on building damage and fatalities due to Australian bushfire: 1925–2009”. Weather, Climate, and Society, 3:63-66.

22 May 2011

EPL Season Long Contest Winners!

The results are in for the season-long EPL predictions contest:

Eric144 38
n-g 38
Max 40
Ian Blanchard 42
GSW 42
Sandbarrs 44
perransounds 44
Roger 48
bernie 48
Adrian 52
Mark 52
faithandenvironment 52
Maher s. Hoque 52
Craig 1st 58
Naïve 60
Lu 60
emowatt48 64
Reiner 64

Congrats to Eric144 and n-g for their joint victory!  Please contact me by email to give me your address where I should send the signed copies of The Climate Fix. n-g even got the top 5 exactly correct, and only 3 of 17 picked Man U to win it all.  Collectively, Tottenham and Chelsea proved the easiest to pick and West Brom and Birmingham the most difficult, accounting for 27.4% of the total error.

The good news is that 14/17 participants demonstrated skill over the naive forecast, suggesting that I need to develop a more rigorous naive prediction for next season.
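For those curious how "skill over the naive forecast" is judged, here is a sketch of a simple scoring rule of this kind; it assumes the score is the sum of absolute differences between predicted and final league positions (lower is better), and the team lists are truncated placeholders rather than actual contest entries:

# Sketch of a simple scoring rule for a season-long table prediction: the score is
# assumed to be the sum of absolute differences between predicted and final league
# positions (lower is better); an entry shows skill if it beats the naive forecast.
# The team lists below are truncated placeholders, not the actual contest entries.
def score(predicted_order, final_order):
    final_pos = {team: i for i, team in enumerate(final_order)}
    return sum(abs(i - final_pos[team]) for i, team in enumerate(predicted_order))

final = ["Man Utd", "Chelsea", "Man City", "Arsenal", "Tottenham"]     # final top five
naive = ["Chelsea", "Man Utd", "Arsenal", "Man City", "Tottenham"]     # e.g. last season's order
entry = ["Man Utd", "Chelsea", "Arsenal", "Man City", "Tottenham"]     # a contestant's pick

print("entry:", score(entry, final), "naive:", score(naive, final))   # entry: 2 naive: 4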

For those wanting to participate next year, look back here for EPL and Bundesliga contests in August.  Thanks all!

19 May 2011

College Grads Can't Find Work, Employers Can't Find Skills

Here is a remarkable juxtaposition: Today, ManpowerGroup issued its sixth annual survey of global "Talent Shortage," finding a pronounced shortage of skilled workers around the world (PDF).  Also today, the New York Times has a front page story about the difficulties that college graduates in the United States have in finding jobs.

There is obviously a policy problem here, and some of it has to do with what universities are (or are not) doing.

Here is an excerpt from the Manpower report:
As we enter the Human Age, when human spirit and potential will become the driving force behind enterprise and innovation, having the right people in the right place at the right time becomes more critical than ever. Yet, as the global economic recovery continues, employers report increased difficulty filling open positions, despite an apparent surplus of talent amid high unemployment.

This year, Manpower expanded its sixth annual Talent Shortage Survey not only to gauge where employers are having difficulty filling available positions, but also examine why organizations are facing a lack of talent and what they are doing to mitigate these challenges. The results reveal increased difficulty finding the right talent in the wake of global economic recovery with limited effort to systematically fill the gaps—and notable regional variances.

• ManpowerGroup research reveals employers in India, the United States, China and Germany report the most dramatic talent shortage surges compared to last year. In India, the percentage of employers indicating difficulty filling positions jumped 51 percentage points.

• Nearly one in four employers say environmental/market factors play a major role in the talent shortage—employers simply aren’t finding anyone available in their markets. Another 22% of employers say their applicants lack the technical competencies or “hard” skills needed for the job, while candidates’ lack of business knowledge or formal qualifications is the main reason identified by 15% of employers.

• Approximately three-quarters of employers globally cite a lack of experience, skills or knowledge as the primary reason for the difficulty filling positions. However, only one in five employers is concentrating on training and development to fill the gap. A mere 6% of employers are working more closely with educational institutions to create curriculums that close knowledge gaps.
Here is a figure from the NYT article showing employment success rates of recent college graduates:

The Manpower report lists the top 10 job areas for which shortages have been reported in the US (it also has similar lists for countries around the world):
1 Skilled Trades Workers
2 Sales Representatives
3 Engineers
4 Drivers
5 Accounting & Finance Staff
6 IT Staff
7 Managers/Executives (Management/Corporate)
8 Teachers
9 Secretaries, PAs, Administrative Assistants & Office Support Staff
10 Machinists/Machine Operators
Interesting but not surprising to me (as a professor in an environmental studies program) is that such "area studies" graduates do the worst.  According to the NYT:
Young graduates who majored in education and teaching or engineering were most likely to find a job requiring a college degree, while area studies majors — those who majored in Latin American studies, for example — and humanities majors were least likely to do so. Among all recent education graduates, 71.1 percent were in jobs that required a college degree; of all area studies majors, the share was 44.7 percent.
The issues cannot be boiled down to a simplistic and misleading distinction between so-called "hard" disciplines versus "soft" disciplines (in my courses I distinguish between "hard" disciplines and "difficult" disciplines, with my courses falling into the latter;-). Manpower makes clear that even in non-technical fields, such as sales, there is greater demand for skills such as, "Excellent oral presentation and communication skills, consultative approach: ability to read people, diagnose problems, critical thinking / problem-solving, first-rate organizational skills" just to name a few (PDF).  And in technical fields such as engineering and skilled trades such skills are also increasingly needed.

The consequences of the skills shortage ripple through the labor market:
An analysis by The New York Times of Labor Department data about college graduates aged 25 to 34 found that the number of these workers employed in food service, restaurants and bars had risen 17 percent in 2009 from 2008, though the sample size was small. There were similar or bigger employment increases at gas stations and fuel dealers, food and alcohol stores, and taxi and limousine services.

This may be a waste of a college degree, but it also displaces the less-educated workers who would normally take these jobs.

“The less schooling you had, the more likely you were to get thrown out of the labor market altogether,” said Mr. Sum, noting that unemployment rates for high school graduates and dropouts are always much higher than those for college graduates. “There is complete displacement all the way down.”
This is not an easy problem to address, but I am convinced that universities must be part of the solution. Expect more on this topic from this blog in the future.

The Politics of Fungibility

The Obama Administration is soon to decide on whether or not to approve the building of a new pipeline from Canada's oil sands to US refineries. Even though the proposed pipeline cuts through a bunch of Red (Republican) states, my prediction is that the Obama Administration will approve the pipeline, regardless of the opposition. The simple reason for this expectation is that the recovered petroleum will either go to the US or, if not the US, to China. The last thing that the President will want during an election campaign is to eschew close and secure Canadian oil while gasoline prices are a national concern.

According to the FT today:
The Canadian province of Alberta could be one of the seven largest oil producers in the world by the end of the decade, its energy minister has said, as he urged the US to back a pipeline project that is vital for the industry’s expansion plans.

Ron Liepert, Alberta’s minister of energy, said that the US “has to make a decision” about whether it wanted the province’s oil, and that he was “proceeding all-out to find alternative markets for our product”, particularly in China. . .

Mr Liepert said Chinese companies, which are already active in the oil sands, “will take every drop we are able to produce, in a heartbeat.”
The fungibility of global oil supply means that reducing use of a particular source of oil in one place simply means that it will be consumed in another, and oil from elsewhere will have to fill that gap. Here is a nice explanation:
While Kenneth Medlock III, energy expert at Rice University, understands the environmentalists’ hope of reducing carbon emissions, he insists that if the US does not import the oil sands, someone else will, noting there is a project underway to export it to the Pacific Basin. Canada would then provide more fuel to China, which would require less fuel from the Middle East. That Middle East fuel would go to Europe and the US would get more of its fuel from Africa:
The protests are not going to stop oil sands development. You have to think of the world as one big bathtub. It doesn’t matter which end of the tub you fill from, as long as you are adding supply. The oil is going to flow.
He is right. The oil is going to be produced, and somebody is going to buy it. If the environmentalists truly want to make a difference, perhaps their focus should switch to pressuring the oil industry to work harder to reduce the carbon footprint of oil sands’ development. Surely nobody can object to that?
Efforts to develop alternatives to oil consumption via technological innovation would be even better yet.

The Science and Politics of Wedges

To follow up on my post yesterday on the so-called "stabilization wedges" of Rob Socolow and Steve Pacala, I thought that it would be useful to revisit the substantive reasons why the wedges analysis has been so misleading in the climate debate. I thought it also would be useful to add a perspective on why it is that some parts of the environmental community go on the attack when it comes to those who have criticized the wedges.

NYU physicist Marty Hoffert provides a concise explanation as to the problems with the approach, when it is used to imply that we need 7 (or now 8) wedges to "solve" climate change for the next 50 years (emphasis added):
Pacala and Socolow (8) analyzed a scenario that envisioned stabilizing atmospheric concentrations of CO2 at 500 ppm within 50 years. They found that reaching that goal required the deployment of seven existing or nearly existing groups of technologies, such as more fuel-efficient vehicles, to remove seven “wedges” of predicted future emissions (the wedge image coming from the shape created by graphing each increment of avoided future emissions). Those seven wedges, each of which represents 25 gigatons of avoided carbon emissions by 2054, are cited by some as sufficient to “solve” climate change for 50 years (9).

Unfortunately, the original wedges approach greatly underestimates needed reductions. In part, that is because Pacala and Socolow built their scenario on a business as usual (BAU) emissions baseline based on assumptions that do not appear to be coming true. For instance, the scenario assumes that a shift in the mix of fossil fuels will reduce the amount of carbon released per unit of energy. This carbon-to-energy ratio did decline during prior shifts from coal to oil, and then from oil to natural gas. Now, however, the ratio is increasing as natural gas and oil approach peak production, coal production rises, and new coal-fired power plants are built in China, India, and the United States (10).

The enormous challenge of making the transition to carbon-neutral power sources becomes even clearer when emissions-reduction scenarios are based on arguably more realistic baselines, such as the Intergovernmental Panel on Climate Change’s “frozen technology” scenario (11, 12). Capturing all alternate energy technologies, including those assumed within this BAU scenario, means that a total of ~18 of Pacala and Socolow’s wedges would be needed to curb emissions (13) (see the figure). And to keep future warming below 2°C . . . an additional 7 wedges of emissions reductions would be needed—for a total of 25 wedges (see the figure).
The total is even more than 25 wedges if you want to avoid using the oceans as a store of carbon dioxide or reduce emissions below 2010 levels.  The numbers that Hoffert presents in his perspective are the same as those that I present in The Climate Fix, under a similar analysis.

What does a "wedge" mean in more intuitive terms?  Jom Romm helps to explain:
. . . [one wedge] by 2050 would require adding globally, an average of 17 [nuclear] plants each year, while building an average of 9 plants a year to replace those that will be retired, for a total of one nuclear plant every two weeks for four decades — plus 10 Yucca Mountains to store the waste.
Romm thinks we need about 14 "wedges" of effort to stabilize carbon dioxide concentrations at 450 ppm, and if one wedge implies a need for 26 nuclear plants per year, then 14 wedges imply 26 * 14 = 364 plants per year, or the equivalent effort of one nuclear power plant per day of carbon-free energy.  If you accept Hoffert's arguments, then using Romm's conversion you'd need 26 * 25 = 650 nuclear plants worth of carbon-free energy, or closer to 2 per day.
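Here is that arithmetic spelled out as a quick sketch, using Romm's conversion of one wedge into roughly 26 plants per year (17 new plants plus 9 replacements):

# Sketch of the nuclear-plant arithmetic above, using Romm's conversion of one wedge
# to roughly 26 plants per year (17 new plants plus 9 replacements).
plants_per_wedge_per_year = 17 + 9

for label, wedges in [("Romm, 450 ppm", 14), ("Hoffert, 2 degrees C", 25)]:
    plants_per_year = plants_per_wedge_per_year * wedges
    print(f"{label}: {wedges} wedges -> {plants_per_year} plants per year "
          f"(one every {365 / plants_per_year:.1f} days)")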

This task -- at either Hoffert's or Romm's level of effort -- is clearly impossible with today's technology.

One might think that the modern environmental movement is adept enough at simple math to accept this message and thus proceed to advocate policies consistent with our lack of technological capabilities, such as calling for a much greater commitment to innovation. While some have, of course, the loudest, most well funded and arguably most influential parts of the movement have strenuously resisted the notion that we do not have the technology needed to rapidly decarbonize our economy, preferring to hold on to the myth that -- in the words of the original "wedges" paper -- "Humanity can solve the carbon and climate problem in the first half of this century simply by scaling up what we already know how to do."  I discuss this myth at length in Chapter 2 of The Climate Fix.

What explains the environmental community's strict adherence to bad math and flawed policy?

In an overlooked part of Matt Nisbet's recent report titled Climate Shift (see his Chapter 2), Nisbet explained that the American environmental community decided to unite under a common set of strategies that were expressed in a 2007 report called Design to Win (PDF; I earlier critiqued the report as "Doomed to Fail").   This common approach would allow efforts to be coordinated and reinforcing. Central to the approach were the so-called wedges and the misleading notion that we have all the technology that we need, meaning that the challenge was one of shaping political will and public opinion. On the scientific advisory committee of the DTW report was Robert Socolow.

The 2007 Design to Win report explains that, through 2030, 80% of the needed emissions reductions could be achieved with existing technologies and at little or no cost (the figure is from the report):
The good news is that we already have the technology and know-how to achieve these carbon reductions – often at a cost-savings. Design to Win’s synthesis of the latest scientific and economic analyses, including the Stern Review, Vattenfall climate abatement map prepared by McKinsey & Company, and reports by the Intergovernmental Panel on Climate Change, concluded that about 80 percent of the needed mitigation – 25 gigatons of carbon – can be achieved with existing technologies (Figure 4). The key lies in rapidly deploying such technologies in our power plants, buildings, factories and vehicles, and improving land management practices.
The focus on deployment, often to the derision, if not the exclusion, of the need for innovation, is still central to environmental messaging, even as the math of emissions reductions would seem obvious and the policy centerpiece of DTW -- cap-and-trade -- has failed comprehensively.

What explains the adherence to bad ideas in the form of bad policy?  I'm not entirely sure, but it just so happens that groups such as the Center for American Progress have been funded under the Design to Win strategy to spread its message. Apparently that includes a healthy dose of efforts to delegitimize alternative points of view and to poison what otherwise might be characterized as a healthy public debate over policy options.  To the extent that these efforts succeed, climate policy and the broader environmental movement suffer.

18 May 2011

Relative Carbon Dioxide Emissions in the US and Europe

A colleague asked me if I could gin up a graph showing relative carbon dioxide emissions for the US and Europe since 1990, with 1990 set to 1.0.  Here is that quickly-made graph; the data comes from the US EIA.  In case you are curious, the European countries with relatively lower emissions in 2009 than the US (off of the 1990 baseline) are Germany, the UK, Denmark and Sweden.  The picture looks different with a 2000 baseline.
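For anyone who wants to reproduce the graph from the EIA data, the indexing is simple: divide each year's emissions by the 1990 value. Here is a sketch with placeholder numbers rather than the actual EIA series:

# Minimal sketch of the indexing behind the graph: divide each year's emissions by
# the 1990 value so every series starts at 1.0. The numbers here are illustrative
# placeholders, not the actual EIA series.
emissions = {
    "US":      {1990: 5.0, 2000: 5.9, 2009: 5.4},   # Gt CO2, made up for illustration
    "Germany": {1990: 1.0, 2000: 0.9, 2009: 0.8},
}

relative = {
    country: {year: value / series[1990] for year, value in series.items()}
    for country, series in emissions.items()
}
print(relative)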

When the Hurricane Drought Ends

The United States is currently in the midst of a remarkable streak. The figure above shows the number of days in between intense hurricane landfalls (S/S Category 3-4-5).  As of June 1, the start of the 2011 hurricane season, that streak will have reached 2,046 days, the third longest on record, surpassed only by the 2,136 days between the landfalls of October 11, 1909 and August 17, 1915 and the 2,231 days between September 8, 1900 and October 18, 1906. The data comes from the ICAT Damage Estimator.  I'd be surprised if the US went through another hurricane season without an intense hurricane landfall, simply based on the well-tested methodology that says good luck can't last forever.
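The streak lengths themselves are simple date arithmetic. Here is a sketch that reproduces the three gaps mentioned above; it assumes the current streak is counted from the last intense landfall (Hurricane Wilma, 24 October 2005) to the June 1 start of the 2011 season:

# Sketch of the gap calculation: days elapsed between intense (Category 3+) US
# hurricane landfalls. The first two gaps are the record streaks cited above; the
# third is the current streak, assumed to run from the last intense landfall
# (Hurricane Wilma, 24 October 2005) to the 1 June 2011 start of the season.
from datetime import date

print((date(1915, 8, 17) - date(1909, 10, 11)).days)   # 2136
print((date(1906, 10, 18) - date(1900, 9, 8)).days)    # 2231
print((date(2011, 6, 1) - date(2005, 10, 24)).days)    # 2046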

17 May 2011

Socolow: Wedges Were a Mistake

[UPDATE: Socolow responds to Joe Romm, and undercuts many of Romm's arguments (e.g., that a wedge should be thought of over a "few decades" -- no, says Socolow, they were 50 years; that we can reduce emissions by efficiency alone -- no, says Socolow; that the wedges suggest 450 ppm -- much less 350 ppm -- is a realistic target -- Socolow says no). Apparently, Socolow does not read Romm's blog (I've updated the below to acknowledge this).  Romm says he is unfamiliar with my views on climate policy, so I'll send him a link to my book with an invitation to offer a substantive critique.]

National Geographic has a pretty remarkable story up on the so-called "stabilization wedges" approach to reducing emissions (thx DM). The article has a number of lengthy quotes from Robert Socolow (co-author with Stephen Pacala on the paper) in which he says that their paper was misunderstood and misused by the advocacy community.  Socolow's comments reinforce a number of arguments that I make about the wedges in Chapter 2 of The Climate Fix.

Here are a few choice excerpts from the National Geographic article:
When the torrent of predictions about global warming got too depressing, there were Robert Socolow's "wedges."

The Princeton physics and engineering professor, along with his colleague, ecologist Stephen Pacala, countered the gloom and doom of climate change with a theory that offered hope.  If we adopted a series of environmental steps, each taking a chunk out of the anticipated growth in greenhouse gases, we could flatline our emissions, he said. That would at least limit the global temperature rise, he said in a 2004 paper in the journal Science.

The Princeton colleagues even created a game out of it: choose your own strategies, saving a billion tons of emissions each, to compile at least seven "wedges," pie-shaped slices that could be stacked up in a graph to erase the predicted doubling of CO2 by 2050.

It was a mistake, he now says.

"With some help from wedges, the world decided that dealing with global warming wasn't impossible, so it must be easy," Socolow says.  "There was a whole lot of simplification, that this is no big deal."

He said his theory was intended to show the progress that could be made if people took steps such as halving our automobile travel, burying carbon emissions, or installing a million windmills. But instead of providing motivation, the wedges theory let people relax in the face of enormous challenges, he now says.
Socolow takes issue with how his work has been misused by advocates for action:
Socolow said he believes that well-intentioned groups misused the wedges theory. His theory called for efficiency, conservation, and energy alternatives that could keep greenhouse gas emissions at roughly today's levels, offsetting the growth of population and energy demands. Global temperatures would rise by 3°C.

"I said hundreds of times the world should be very pleased with itself if the amount of emissions was the same in 50 years as it is today," he said.

But those inspired by the theory took it farther.  If Socolow's wedges could stabilize emissions with a 3-degree rise, they said, even bigger wedges could actually bring greenhouse gases back down to a level resulting in only a 2-degree rise. (This is the goal that 140 nations have pledged to try to achieve in the Copenhagen Accord.)

"Our paper was outflanked by the left," Socolow said.  But he admits he did not protest enough: "I never aligned myself with the 2-degree statement, but I never said it was too much."

In holding out the prospect of success, adherents stressed the minimal goals, and overestimated what realistically could be achieved.

"The intensity of belief that renewables and conservation would do the job approached religious," Socolow said.
Of course, no one has abused the "wedges" analysis more than Joe Romm who did exactly what Socolow is critical of -- Romm super-sized each one of the wedges, doubled the number needed, and then claimed based on his perversion of the Socolow/Pacala analysis that we have (or soon will have) all the technology needed to stabilize concentrations of carbon dioxide at 450 (or even 350) ppm. It is hard to imagine that Socolow's comments can be in reference to anyone other than Romm, who has probably done more to confuse issues of mitigation policy than anyone [UPDATE: Socolow says he is unfamiliar with Romm's views.].  Back in 2008 I pointed out Romm's egregious misuse of the wedges analysis to imply that achieving deep emissions cuts would be technologically and economically possible with technologies currently (or soon to be) available. The exchange with Romm was precipitated by a paper with Chris Green and Tom Wigley in which we explained how the IPCC had made a similar error in its analysis (here in PDF).

Socolow's strong rebuke of the misuse of his work is a welcome contribution and, perhaps optimistically, marks a positive step forward in the climate debate.

Lack of Quality in University Education

Over the weekend Richard Arum and Josipa Roksa had a provocative, and I think spot-on, op-ed in the New York Times on the poor quality of university education in the United States. They write:
Over four years, we followed the progress of several thousand students in more than two dozen diverse four-year colleges and universities. We found that large numbers of the students were making their way through college with minimal exposure to rigorous coursework, only a modest investment of effort and little or no meaningful improvement in skills like writing and reasoning.

In a typical semester, for instance, 32 percent of the students did not take a single course with more than 40 pages of reading per week, and 50 percent did not take any course requiring more than 20 pages of writing over the semester. The average student spent only about 12 to 13 hours per week studying — about half the time a full-time college student in 1960 spent studying, according to the labor economists Philip S. Babcock and Mindy S. Marks.

Not surprisingly, a large number of the students showed no significant progress on tests of critical thinking, complex reasoning and writing that were administered when they began college and then again at the ends of their sophomore and senior years. If the test that we used, the Collegiate Learning Assessment, were scaled on a traditional 0-to-100 point range, 45 percent of the students would not have demonstrated gains of even one point over the first two years of college, and 36 percent would not have shown such gains over four years of college.
The op-ed summarizes arguments found in their recent book, Academically Adrift: Limited Learning on College Campuses. Inside HigherEd provides some more details:
The main culprit for lack of academic progress of students, according to the authors, is a lack of rigor. They review data from student surveys to show, for example, that 32 percent of students each semester do not take any courses with more than 40 pages of reading assigned a week, and that half don't take a single course in which they must write more than 20 pages over the course of a semester. Further, the authors note that students spend, on average, only about 12-14 hours a week studying, and that much of this time is studying in groups.

The research then goes on to find a direct relationship between rigor and gains in learning:

Students who study by themselves for more hours each week gain more knowledge -- while those who spend more time studying in peer groups see diminishing gains.

Students whose classes reflect high expectations (more than 40 pages of reading a week and more than 20 pages of writing a semester) gained more than other students.

Students who spend more time in fraternities and sororities show smaller gains than other students.

Students who engage in off-campus or extracurricular activities (including clubs and volunteer opportunities) have no notable gains or losses in learning.

Students majoring in liberal arts fields see "significantly higher gains in critical thinking, complex reasoning, and writing skills over time than students in other fields of study." Students majoring in business, education, social work and communications showed the smallest gains. (The authors note that this could be more a reflection of more-demanding reading and writing assignments, on average, in the liberal arts courses than of the substance of the material.)

In section after section of the book and the research report, the authors focus on pushing students to work harder and worrying less about students' non-academic experiences. "[E]ducational practices associated with academic rigor improved student performance, while collegiate experiences associated with social engagement did not . . .
Meantime, and perhaps paradoxically, the US is facing a shortage of highly skilled workers:
Even with unemployment near 9%, manufacturers are struggling to find enough skilled workers because of a confluence of three trends.

First, after falling for more than a decade, the number of U.S. manufacturing jobs is growing modestly, with manufacturers adding 25,000 workers in April, the seventh straight month of gains, according to payroll firm Automatic Data Processing Inc. and consultancy Macroeconomic Advisers. The Labor Department's jobs report on Friday is expected to show moderate employment growth in the overall economy.

Second, baby-boomer retirements are starting to sap factories of their most experienced workers. An estimated 2.7 million U.S. manufacturing employees, or nearly a quarter of the total, are 55 or older.

Third, the U.S. education system isn't turning out enough people with the math and science skills needed to operate and repair sophisticated computer-controlled factory equipment, jobs that often pay $50,000 to $80,000 a year, plus benefits. Manufacturers say parents and guidance counselors discourage bright kids from even considering careers in manufacturing.
There is a lesson here: are universities listening?

NYT on UK Climate Policies: Rah! Rah!

Apparently the New York Times could not find a single person to raise questions about the UK's recent commitment to increasing its long-term emissions reduction targets, preferring instead to cheerlead:
Britain is poised to announce some of the world’s most ambitious goals for reducing greenhouse gas emissions — a striking example of a government committing to big environmental initiatives while also pursuing austerity measures. . .

“This is an outstanding example of the kind of action by developed-world countries that’s needed to bring climate change under control,” said Bert Metz, an adviser to the European Climate Foundation, a group in Brussels that advocates lower emissions, and a former member of the Intergovernmental Panel on Climate Change. “It’s also really going to push the British economy in the direction of growth.”
Not shared with readers is the fact that the UK has badly missed its 2010 emissions reduction target, is expected to continue to miss its short-term targets and has set targets but not policies to achieve them.

Ah, details.

16 May 2011

Rajendra Pachauri 2.0

From Australia come some interesting quotes from Rajendra Pachauri, chairman of the IPCC, perhaps showing a new approach.

First, on recent extreme events:
"Frankly, it is difficult to take a season or two and come up with any conclusions on those on a scientific basis," Dr Pachauri said.

"What we can say very clearly is the aggregate impact of climate change on all these events, which are taking place at much higher frequency and intensity all over the world.

"On that there is very little doubt; the scientific evidence is very, very strong. But what happens in Queensland or what happens in Russia or for that matter the floods in the Mississippi River right now, whether there is a link between those and climate change is very difficult to establish. So I don't think anyone can make a categorical statement on that."
When given a chance to opine on Australia's climate politics, he took a pass:
[W]hen it came to commenting on the state of Australian politics and climate change, Dr Pachauri played a straight bat literally.

Anticipating questions about whether Australia was doing enough, he said he had rehearsed his lines.

"Australia is not doing enough in cricket. About climate change, I just can't say."

He said the IPCC was "doing what we can" in relation to concerns about its reputation.

"We . . . are focused on producing the best possible reports that we can. It is really up to governments to take actions that are in their best interests and society at large."

On Attribution, A Response to Parmesan et al.

In the current issue of Nature Climate Change I have a correspondence in response to the commentary by Camille Parmesan and colleagues on attribution of specific biodiversity outcomes to human-caused climate change. They argued:
The biological world is responding rapidly to a changing climate, but attempts to attribute individual impacts to rising greenhouse gases are ill-advised.
Keith Kloor had a nice discussion of that piece when it came out.

In my response, I focus on the underlying political incentives for such claims of attribution:
Parmesan and co-authors1 offer a welcome tonic to overstated claims that attribute various localized changes in biological systems to human-induced climate change. However, their Commentary is off target when it lays blame for the misguided focus on attribution on the Intergovernmental Panel on Climate Change (IPCC) “effectively yield[ing] to the contrarians’ inexhaustible demands for more ‘proof.’” As compelling as battle with the sceptics seems to be in virtually every aspect of the climate issue, the overstated role of attribution in the climate debate has a far more prosaic origin in the fundamental design of the Framework Convention on Climate Change.
Read the rest of my response here in PDF.

If you are interested in learning more about how the political context of climate policy creates incentives for claims of attribution, and how this hurts climate policy, please see Chapter 6 in The Climate Fix as well as these two papers:

Pielke, Jr., R. A. (2004). What is climate change? Issues in Science and Technology 20(4):31-34.

Pielke, Jr., R. A. (2005). Misdefining "climate change": consequences for science and action. Environmental Science & Policy 8(6):548-561.

Return to Fantasy Island: Targets Without Policies

As I suggested last week, UK Prime Minister David Cameron has settled the debate over UK emissions reduction targets in favor of adopting stronger targets far in the future. It seems that he is also interested in continued economic growth.

Meantime, back in the present, Cambridge Econometrics reports that the UK has missed its 2010 emissions reduction target, despite the dramatic effects of the economic downturn. The report sees the UK continuing to miss its near-term targets, with the gap between target and performance widening over time. Environmental groups continue to confuse the effects of a shrinking economy with progress on reducing emissions.  Looking ahead, the UK economy is currently positioned to have economic growth or a continued reduction in emissions, but not both.  Emissions in 2010 increased sharply.

Cambridge Econometrics explains the basic dynamics at work here:
[T]here is now a firm policy commitment (made by the previous government and now endorsed by the Coalition government) but as yet no firm policies in place ...
Targets without policies.  Doesn't sound too promising, does it?

12 May 2011

Japan's New Emissions Math

Yesterday I posted up a short bit from the IEA which asserted that a turn away from nuclear power will make reducing emissions more difficult.  Here are some numbers explaining how this works in the case of Japan.

Earlier this week the Japanese Prime Minister Naoto Kan announced that Japan was no longer seeking to source 50% of its energy needs from nuclear power and terminated plans for 14 new nuclear facilities.  What might this decision mean for Japan's ability to meet its current carbon dioxide emissions reduction target of 25% below 1990 levels?  Here I use wind as a measuring stick, but solar or other technologies could easily be used to tell the same story.

Here is what the Japanese government said would be needed to reach its earlier 5% reduction target:
  • Construct nine new nuclear power plants, improve utilized capacity to 80% (from 60%).
  • Install 5 million kW of wind power plants (equivalent to approximately 34 units).
  • Install solar panels on 5.3 million homes (an increase of 2000% over current levels).
  • Increase the share of houses satisfying stringent insulation standards out of total newly built houses from 40% today to 80%.
  • Increase the share of next-generation vehicles in total sales of new vehicles from 4% (2005) to 50% (2020).
Replacing 9 nuclear plants (1 GW each at a 0.8 capacity factor) with wind would imply, in round numbers, about 10,000 2.5 MW wind turbines (at a 0.3 capacity factor) (see The Climate Fix for details).  Currently, Japan has about 1,700 wind turbines.  These 10,000 new ones are on top of the 6,600 already in the assumptions, or an increase of about 10 times current levels of deployment.
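
For anyone who wants to reproduce the round numbers, here is a minimal sketch of the capacity-factor arithmetic in Python. The plant size, turbine size and capacity factors are the assumptions stated in the paragraph above; the variable and function names are mine.

```python
# Minimal sketch of the nuclear-to-wind conversion described above.
# Assumptions are those stated in the post: 1 GW nuclear plants at a 0.8
# capacity factor, 2.5 MW wind turbines at a 0.3 capacity factor.

NUCLEAR_PLANT_MW = 1000          # nameplate capacity per nuclear plant (1 GW)
NUCLEAR_CAPACITY_FACTOR = 0.8
TURBINE_MW = 2.5                 # nameplate capacity per wind turbine
WIND_CAPACITY_FACTOR = 0.3

def equivalent_turbines(n_plants: int) -> float:
    """Turbines needed to match the average output of n_plants nuclear plants."""
    effective_nuclear_mw = n_plants * NUCLEAR_PLANT_MW * NUCLEAR_CAPACITY_FACTOR
    effective_turbine_mw = TURBINE_MW * WIND_CAPACITY_FACTOR
    return effective_nuclear_mw / effective_turbine_mw

print(round(equivalent_turbines(9)))   # 9600 -- "about 10,000" in round numbers
```

The same kind of conversion, scaled up to a much larger share of Japan's generation, is what lies behind the far bigger turbine counts discussed below.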

And this is just for the 5% reduction target.  You'd have to multiply by more than 5 to get to the 25% reduction target.

Japan currently gets 24% of its energy needs from nuclear power.  Replacing the additional 26% that was supposed to come from nuclear (to get to 50%) implies 78,000 (!) 2.5 MW wind turbines (see TCF, p. 144, Table 4.4). The Japanese Wind Energy Association optimistically foresees 11.1 GW of capacity by 2020, less than half of what would have been needed to reach the 5% reduction target.  Abandoning nuclear does not make the emissions reduction targets easier to meet; it makes them far, far more difficult.

I have argued that Japan's 2020 target of a 25% emissions reduction was always far out of reach.  I don't think that the phrase "even more impossible" makes much sense, but perhaps Japan's new political context will at least make its emissions reduction commitments "even more obviously impossible."  Then again, there is always the appeal of magical solutions.

11 May 2011

Preview of IEA Low Nuke Scenario

The FT shares a preview of the IEA's "low nuclear" scenario forthcoming in November:
The IEA will publish its annual global energy outlook in November, which will include a “lower nuclear case” mapping what would happen if the world added only 180 gigawatts of new nuclear capacity by 2035, rather than the 360 predicted in the last outlook.

It found three consequences:
  1. The world will make up for the shortfall with gas, renewables and coal (coal and gas in China, gas and renewables in Europe and the US). That would put pressure on gas and coal prices, which would in turn make electricity generation more expensive.
  2. There will be less diversification of the energy mix, which is bad news for energy security.
  3. There might be about half a gigatonne more CO2 in the atmosphere, making climate targets much more difficult to achieve.
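
The size of that last number is easy to sanity-check with a back-of-the-envelope calculation. The sketch below is mine, not the IEA's: the nuclear capacity factor and the emissions intensity of the replacement mix are illustrative assumptions, and the FT item does not specify whether the half gigatonne is an annual or a cumulative figure.

```python
# Rough order-of-magnitude check on the "about half a gigatonne more CO2" item.
# All numbers below are illustrative assumptions, not IEA or FT figures.

shortfall_gw = 360 - 180                 # nuclear capacity not built in the low-nuclear case
nuclear_capacity_factor = 0.8            # assumed
hours_per_year = 8760
replacement_intensity_t_per_mwh = 0.4    # assumed average for a coal/gas/renewables mix

lost_generation_mwh = shortfall_gw * 1000 * nuclear_capacity_factor * hours_per_year
extra_co2_gt = lost_generation_mwh * replacement_intensity_t_per_mwh / 1e9

print(f"~{extra_co2_gt:.2f} Gt CO2 per year under these assumptions")   # ~0.50
```
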
Keith Kloor and Shellenberger/Nordhaus each have thoughtful discussions of the current politics of nuclear power that are consistent with the IEA preview.  There are no easy answers.

David Keith on the APS Report

David Keith of the University of Calgary has posted up a detailed response to the APS air capture report (here in PDF).  Here is how it starts out:
The APS report does a fine job explaining the physics and chemistry of air capture (AC). We are pleased that the APS assembled such an excellent and diverse group of researchers to assess this technology at such an early stage in its development.

Our view about the costs of air capture diverges substantially from that of the APS. We acknowledge our bias and self-interest as developers of AC technology, but we believe that many of the key claims made here are independently verifiable.

The APS cost estimate for direct air capture depends on: (1) the performance of the APS reference design, (2) the choice of the reference design itself, and (3) the validity of the cost estimating methods applied to the reference design.

We concur that the methods used for cost estimation conform to industry practice, and applaud the report for laying these out in a transparent manner. In these slides we offer comments on items 1 and 2. In summary:

1. There are substantial technical discrepancies between APS’s performance model and estimates produced by industry-standard reference sources.

2. The choice of the APS’s reference design results in very high costs without a clear justification in terms of the tradeoff between cost and technical risk.
Keith also shares these comments with me by email, posted with his permission:
We have a paper under review that makes these points in a more systematic way with lots of analysis.

Showing that a silly design is expensive proves nothing. To show that air capture is expensive you need to show that a SENSIBLE design is expensive. That is, a sensible design at full industrial scale within a decade using low risk hardware.

An analogy: Suppose you are trying to estimate the cost of passenger jet traffic across the Atlantic in 1955. You need to choose an aircraft design analogy as a starting point from which to scale and adjust your estimate. If you start with an F104 as your base case, one person per aircraft, you will get a foolish estimate of the actual cost per passenger of the 707, which was cheap because it was big and simple.

Starting with packed towers with stainless packing when there are commodity systems now on the market that will do the job for NaOH air capture is like starting with the F104. The APS did a fine job (ignoring the mistakes) costing the F104, but the choice of reference drives the answer.

Why did they not try several different systems and evaluate the cost-risk tradeoff? It's a mystery to me.
One thing here is for certain -- competition is good for both innovation and understanding.  Let's hope that it continues.

UPDATE 5/12: The following comments were shared with me by email for posting by Dr Tim Fox, Head of Energy and Environment, Institution of Mechanical Engineers, London, UK:
I have read this report from the APS in full and its findings are based primarily on a detailed assessment of a single direct air capture technology proposal that has an energy intensive process at its core. The study also draws conclusions that largely ignore current thinking on the possible uses of this approach in climate change mitigation.

Air capture machines and processes represent a promising technology for the deployment of additional methods in tackling the challenge of global warming. However, given the early-stage and often proprietary nature of research and development in this area there is considerable uncertainty as to future cost levels. It is therefore premature to draw decisive conclusions as to the likely cost per tonne of CO2 captured. There are a number of other studies that suggest costs will be substantially lower than quoted by the APS and, indeed, given research, development and demonstration on a scale comparable with ‘conventional’ carbon capture and storage technologies (CCS) the approach might in certain circumstances gain near parity with that abatement option. We should however be clear that nobody is today seriously suggesting that in the case of power plant and large industrial emissions sources direct air capture would be considered as an alternative approach to CCS.

Air capture has the useful characteristic of being able to take advantage of the fact that direct removal of CO2 from the atmosphere can take place at any geographical location regardless of the point at which the gas is emitted. This enables the approach to be potentially used to tackle difficult sources, including large numbers of non-stationary emitters such as vehicles, aircraft and ships, or widely dispersed small scale industrial processes that are not amenable to CCS on technical or economic grounds. The technology also offers a wide range of other mitigation strategies including the removal of historic emissions accumulated in the atmosphere.

In summary, given the small amount of publicly available research and assessment data on proposed direct air capture technologies it is premature to draw any meaningful conclusions on the likely costs associated with the approach. As with CCS, the most promising technologies in this area need pilot testing and demonstration in the public domain to enable detailed cost assessments to be made, which will facilitate informed evidence-based decision making.