## Amazing multiplying hockey stick proxies

February 3, 2010

In my previous post I wrote about the five super-simple steps for building a hockey stick:

Step 1. Gather time series.
Step 2. Select those time series that fit the instrumental (measured) temperature record of choice. Assume that since these time series match the measured temperature in some way, then they are, in fact, temperature proxies.
Step 3. Combine the chosen proxies in some fashion and note, not surprisingly, that the combined proxies match the temperature record. (duh) Call this your temperature reconstruction.
Step 4. Assume that since the combined proxies match the measured temperature, they are also a match for the temperature that occurred prior to the temperature measurement records.
Step 5. Note that the reconstruction shows that the temperature prior to the instrumental data is relatively flat, and conclude that the temperature prior to the instrumental record changed very little.
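The selection step is where the magic happens, and it can be demonstrated numerically. The sketch below is a hypothetical illustration (not anyone's actual method): it feeds pure random walks with no temperature signal at all through steps 1-3 and still produces a hockey-stick shape, with a blade that rises by construction and a shaft that flattens because the unscreened early portions of independent series average out.

```python
import random

random.seed(42)

YEARS, CALIB, N = 200, 50, 1000   # total years, calibration window, series count

# Step 1: "gather" time series -- here deliberately pure random walks,
# containing no temperature signal whatsoever.
def random_walk(n):
    out, level = [], 0.0
    for _ in range(n):
        level += random.gauss(0.0, 0.1)
        out.append(level)
    return out

series = [random_walk(YEARS) for _ in range(N)]

# Step 2: select the series that "match" a rising instrumental record.
# Simplest possible screen: keep any series that rose noticeably over
# the calibration window.
selected = [s for s in series if s[-1] - s[-CALIB] > 0.5]

# Step 3: average the survivors into a "reconstruction."
recon = [sum(s[t] for s in selected) / len(selected) for t in range(YEARS)]

# Steps 4-5: the blade rises by construction (every survivor rose by more
# than 0.5); the shaft is nearly flat because the pre-calibration steps
# are independent of the screening and cancel out in the average.
blade = recon[-1] - recon[-CALIB]
shaft_range = max(recon[:YEARS - CALIB]) - min(recon[:YEARS - CALIB])
print(len(selected), round(blade, 2), round(shaft_range, 2))
```

The blade is guaranteed to exceed 0.5 by the screening rule itself, while the shaft wiggles only as much as an average of a few hundred independent random walks can.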

This post is about a little subplot in the gathering of time series for Michael Mann’s 2008 version of the hockey stick (Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia, PNAS, 2008).

Mann used 1,209 proxies for this reconstruction. He explains the breakdown as follows…

We made use of a multiple proxy (“multiproxy”) database consisting of a diverse (1,209) set of annually (1,158) and decadally (51) resolved proxy series … including tree-ring, marine sediment, speleothem, lacustrine, ice core, coral, and historical documentary series. All 1,209 series were available back to at least A.D. 1800, 460 extend back to A.D. 1600, 177 back to A.D. 1400, 59 back to A.D. 1000, 36 back to A.D. 500, and 25 back to year “0” (i.e., 1 B.C.).

Figure 1. Northern Hemisphere proxies in alphabetical order

Mann split his analysis between the Northern and Southern Hemispheres. I am going to talk about the 1,036 of the 1,209 proxies that applied to the North. The following two images show the plots of these 1,036 proxies; just click on them to enlarge. The file sizes are less than a megabyte each and should open quickly in your browser. Figure 1 shows the plots arranged in alphabetical order. If you scroll through this image you will see a lot of proxies that don’t look much like a hockey stick, and a few scattered here and there that do. However, there is a series of 71 proxies, named lutannt1 through lutannt71, that look very much like hockey sticks.

These lutannt# proxies are from Luterbacher, the researcher who “provided” them. More on this important point later.

Figure 2. All Northern Hemisphere proxies in order of correlation with Northern Hemisphere instrumental temperature record.

As explained in the five easy steps for hockey stick construction, the proxies that look most like a hockey stick are likely to be weighted heavily in the final hockey stick construction. If all 1,036 proxies are correlated with the Northern Hemisphere instrumental temperature record (for the math inclined: see the correlation formula below) and the plots are laid out from the worst correlation to the best, the result looks like Figure 2. Scroll through this figure from top to bottom. You will see the worst correlations at the top and the best at the bottom. Note that the Luterbacher proxies are among the best correlated, and show up near the bottom.
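The ranking itself is simple to sketch. The series and names below are made up for illustration; the point is only the mechanics of sorting proxies from worst to best Pearson correlation with an instrumental record.

```python
# Hypothetical data: each "proxy" is a list of annual values over the
# instrumental overlap period; the names and numbers are invented.
instrumental = [0.0, 0.1, 0.05, 0.2, 0.3, 0.25, 0.4, 0.5, 0.45, 0.6]

proxies = {
    "lutannt1":  [0.01, 0.12, 0.04, 0.22, 0.28, 0.27, 0.41, 0.48, 0.47, 0.61],
    "treering7": [0.3, 0.1, 0.4, 0.2, 0.1, 0.3, 0.2, 0.4, 0.1, 0.3],
    "sediment3": [0.6, 0.5, 0.45, 0.4, 0.3, 0.35, 0.2, 0.1, 0.15, 0.0],
}

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Sort from worst to best correlation, as in Figure 2: the proxy that
# tracks the instrumental record lands at the bottom of the stack.
ranked = sorted(proxies, key=lambda name: pearson_r(proxies[name], instrumental))
print(ranked)
```

Any series that closely tracks the instrumental record, whatever its physical origin, ends up at the "best" end of the ranking.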

Figure 3. All Northern Hemisphere proxies, except Luterbacher proxies, in order of correlation with Northern Hemisphere instrumental temperature record.

Figure 3 is the same as figure 2, but with the Luterbacher proxies removed.  Scroll through, and it is quite clear that there are far fewer hockey stick-like proxies now.

## The Amazing Multiplying Proxies

Remember, the point of a hockey stick is not that it goes up in the 20th century – this is a given because the hockey stick is deliberately constructed from proxies that go up in the 20th century.  The real point is that it is more or less flat prior to the 20th century. (See step 5 of the super-simple steps for building a hockey stick.)  The 71 Luterbacher time series are tailor-made for this purpose, because they tend to show temperature rising in the 20th century but flat prior to that.  The problem with the 71 Luterbacher proxies is that they are actually not 71 separate proxies at all.

Luterbacher et al. (European Seasonal and Annual Temperature Variability, Trends, and Extremes Since 1500, Science, 2004) used about 150 “predictors” spread out over Europe to reconstruct European surface temperature fields. These predictors consisted of “instrumental temperature and pressure data and documentary proxy evidence.” Figure 4, taken from Luterbacher’s supplemental material, shows the geographical distribution of these predictors.

Figure 4. Luterbacher's original caption: (A) station pressure locations (red triangles) and surface temperature sites (B, red circles) used to reconstruct the monthly European temperature fields (25°W-40°E; 35°N-70°N given by the rectangular blue box). Blue circles indicate documentary monthly-resolved data, blue dots represent documentary information with seasonal resolution back to 1500. Green dots stand for seasonally resolved temperature proxy reconstructions from tree-ring and ice core evidence.

Luterbacher used combinations of the predictors to interpolate the data to…

“a new gridded (0.5° x 0.5° resolution) reconstruction of monthly (back to 1659) and seasonal (from 1500 to 1658) temperature fields for European land areas (25°W to 40°E and 35°N to 70°N).”

Each of these grid points in the reconstruction is like one of the lutannt# graphs that show up in the list of proxies for Mann’s 2008 version of the hockey stick.  Mann ends up with 71 lutannt# “proxies” by simply taking 71 points using 5° x 5° resolution from Luterbacher’s temperature field reconstruction.
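The subsampling arithmetic checks out. The sketch below is my assumption about the grid geometry, not Mann's actual code: it counts the 0.5° cells in Luterbacher's stated box, then the cells left after keeping one every 5°. About 91 coarse cells remain, of which Mann's 71 would plausibly be the land cells (the land mask is not reproduced here).

```python
# Luterbacher's stated reconstruction box: 25W-40E, 35N-70N at 0.5 degrees.
LON_MIN, LON_MAX = -25.0, 40.0
LAT_MIN, LAT_MAX = 35.0, 70.0

# All 0.5-degree cells in the box (cell origins, not counting the far edge).
fine = [(LON_MIN + 0.5 * i, LAT_MIN + 0.5 * j)
        for i in range(int((LON_MAX - LON_MIN) / 0.5))
        for j in range(int((LAT_MAX - LAT_MIN) / 0.5))]

# Keep one cell every 5 degrees in each direction (0.5 deg -> 5 deg).
coarse = [(lon, lat) for (lon, lat) in fine
          if (lon - LON_MIN) % 5 == 0 and (lat - LAT_MIN) % 5 == 0]

print(len(fine), len(coarse))
```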

Here’s the rub: not all the predictors used to make Luterbacher’s temperature field reconstruction go all the way back to 1500. In fact, prior to about 1760 only about 10 of the total 150 predictors are used, and these predictors are primarily “documentary information.” Prior to about 1660, only about 7 are used. Figure 5, which also comes from Luterbacher’s supplementary material, shows the number of predictors used in each year to reconstruct his surface temperature fields.

Figure 5. Luterbacher's original caption: Number of predictors through time.

Figure 6 shows the location of Mann’s 71 selected “proxies” and the location of the “documentary information” sources.  Not the best match in the world, is it?  Amazingly, the construction of some of the proxies prior to 1750 is based on data from sources over 1000 kilometers away!

Figure 6. Blue dots show the locations of Mann’s lutannt# "proxies." Red dots show the locations of Luterbacher’s early "documentary information" sources.

The important point is that all the data for Mann’s 71 lutannt# “proxies” prior to about 1760 is made up of some combination of the same 10 or so “documentary information” predictors. This short list of predictors is the source of the “Amazing Multiplying Hockey Stick Proxies.” These 10 predictors are multiplied into 71 proxies, and those proxies all rank high for correlation with the instrumental temperature record from 1850 to the present. Consequently, these 71 “proxies” likely weigh heavily in Mann’s 2008 hockey stick, and these 10 “documentary information” predictors, sometimes folded into “proxies” over a thousand kilometers away, have an undeserved multiplied effect in making the flat part of the hockey stick prior to the instrumental temperature record.
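The multiplying effect can be put in rough numbers. The figures below are illustrative assumptions built from the counts quoted in this post, applied to a simple unweighted average (Mann's actual weighting scheme is more complicated than this):

```python
# Illustrative counts from the post: 1,036 Northern Hemisphere proxies,
# of which 71 lutannt# series share the same ~10 early predictors.
independent = 965          # the other Northern Hemisphere proxies
duplicated = 71            # lutannt# series built from the shared predictors
shared_predictors = 10     # pre-1760 "documentary information" sources

# Effective weight of the shared predictors in an unweighted average,
# versus the weight they would get if each entered only once.
weight = duplicated / (independent + duplicated)
fair_weight = shared_predictors / (independent + shared_predictors)
print(round(weight, 3), round(fair_weight, 3))
```

Under these assumptions the shared predictors carry more than six times the weight they would have if each had entered the average once.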

******************************************

correlation coefficient:

$$ r = \frac{\sum_i (P_i - \bar{P})(T_i - \bar{T})}{\sqrt{\sum_i (P_i - \bar{P})^2}\;\sqrt{\sum_i (T_i - \bar{T})^2}} $$

where P is the proxy, P_i is the ith year of the proxy, and \bar{P} is the proxy mean;
T is the temperature, T_i is the ith year of the temperature, and \bar{T} is the temperature mean.

## 2 to 1 odds for Prof. David Barber

August 22, 2009

We are well into summer and the Arctic ice extent and area are taking their annual plunge. How deep will the plunge be? David Barber of the University of Manitoba thinks it will be very deep. Just a year ago he predicted that the North Pole would be ice free in the summer of 2008. National Geographic reported:

“We’re actually projecting this year that the North Pole may be free of ice for the first time [in history],” David Barber, of the University of Manitoba, told National Geographic News aboard the C.C.G.S. Amundsen, a Canadian research icebreaker.

It turned out that he was wrong.

The 2008 summer minimum turned out to have more ice than 2007’s minimum. But he has a fallback prediction: that the Arctic Basin will be ice free, for at least part of the summer, by 2015. This is a much more profound prediction. The North Pole is just a dot on the map, but the Arctic Basin is 4 million square kilometers surrounding the North Pole.

Last December I challenged Barber on this blog to wager over his 2015 prediction.  He has not taken me up on the offer.  Now I have doubled the odds for him.  One week ago (8/15/09) I sent him the following email:

Dear Prof. Barber,

I took great interest in your widely reported prediction that the Arctic Basin would see its first ice free summer in 2015. Last December I wrote a blog post in which I challenged you to a wager. That post can be seen here:

This post has been viewed thousands of times on both my website and on the sites of others who have re-posted it.

In that post I said:

“I propose a friendly wager based on this prediction. I will bet David Barber \$1000(US) that the ice covering the Arctic Basin will not be gone anytime before December 31st, 2015. The bet would involve no transfer of cash between myself or Barber, but rather, the loser will pay the sum to a charitable organization designated by the winner.

Definition of terms. The Arctic Basin is defined by the regional map at Cryosphere Today. “Gone” means the Arctic Basin sea ice area is less than 100,000 square kilometers, according to the National Center for Environmental Prediction/NOAA as presented at Cryosphere Today. Charitable organizations will be agreed upon at the time the bet is initiated.

David Barber is a smart guy and evidently an expert in his field. Taking on a wager with an amateur like me should be like shooting fish in a barrel. I look forward to reaching an agreement soon.”

Perhaps you did not see that challenge online – but many other people did. I am now willing to give you two to one odds on the same wager. Are you interested?

Best Regards,
Tom Moriarty

That’s right. I will put \$2000 against Professor Barber’s \$1000. It should be difficult for him to turn this down. He can put that \$2000 toward any good cause that he desires. If this sum is too small, perhaps we can negotiate something larger. He knows how to find me. But I haven’t had a response yet.

One more point: The Arctic Basin is about 4 million square kilometers that roughly surround the North Pole. If the Arctic Basin were ice free, then it would be a pretty good bet that all the arctic regions south of the Arctic Basin would also be ice free. So Barber’s prediction that the Arctic Basin will be ice free at some point by 2015 is effectively a claim that the entire Arctic will be ice free. Look at the AMSR-E plots of Arctic sea ice extent below. Anybody interested in taking my wager?

Sea Ice extent for the Entire Arctic. If the Arctic Basin becomes ice free, then it is a good bet that the entire Arctic will also be ice free.

Sea Ice extent for the Entire Arctic. This is a detail from the graph above. If the Arctic Basin becomes ice free, then it is a good bet that the entire Arctic will also be ice free.

Why am I making this bet?   Because I am concerned about climate exaggerations and the effect they have on public policy makers. It seems quite clear that David Barber was off the mark when he predicted for 2008 “this year that the North Pole may be free of ice for the first time,” because neither the Arctic Ocean, the Arctic Basin nor the North Pole were ice free in the summer of 2008.  Same with the summer of 2009, so far.  And the Arctic Basin will not be ice free by 2015 either.

## I was (partially) wrong

August 20, 2009

I received a comment from the GM spokesman, Rob Peterson, about my last two posts lambasting the supposed 230 mile per gallon Chevy Volt. Here is Rob’s comment in its entirety.

This is Rob Peterson from GM.

Although the Volt has a 16 kWh battery, only 8 kWh is used. This will significantly impact the rest of your calculations and your synopsis. Please post a correction based on this fact.

As for the Volt’s city fuel efficiency rating of 230mpg – this is based on the EPA’s draft methodology. The same methodology which will be used for all other vehicles of this type.

r

I responded to Rob with two comments, which you can read here. One of those comments questions his sincerity about “blaming” the 230 mile per gallon claim on the EPA. However, he is essentially right that the charging cycle uses only about half of the 16 kWh battery’s capacity. He has asked me to “Please post a correction based on this fact.” I have done so, but the final numbers for the vaunted Volt are still underwhelming.

Here is a table comparing miles per gallon, kWh per mile, and pounds of CO2 per mile for the Chevy Volt, the Toyota Prius, and a couple of ancient Honda Civics. You can read the details of how I derived the numbers for the Volt, using Rob’s partial capacity charge cycle scheme, in the text below. Note that the prices for the Honda Civics have been adjusted for inflation to 2009 dollars for a fair comparison.

Now that I have posted a correction, can I expect Rob Peterson to post a retraction of GM’s preposterous 230 mile per gallon claim? Not likely.

I have not yet been able to find an official specification for the number of kilowatt-hours per mile for the Volt. I am hoping Rob will provide one. I have found Rob’s description of the charging scheme for lithium-ion batteries to be essentially correct. That is, the battery is typically charged from the electrical grid to around 90% of total capacity. Then the car is propelled entirely off of battery power until the battery reaches about 30% capacity. This is known as the “charge depleting” mode. When the battery gets to about 30% of capacity, the gasoline powered generator kicks in and maintains the charge at about 30% capacity. This is known as the “charge sustaining” mode.

Then, when the car is plugged into the electrical grid, the battery is recharged with grid energy from about 30% capacity back up to about 90% capacity. That is a range of about 60% of the total capacity. So, for a 16 kilowatt-hour battery, a complete charge off the electric grid puts about 9.6 kWh (0.6 x 16 kWh) into the battery. But an extra 10% or so is lost to transmission line and battery conversion losses, so the amount of energy taken from the electrical grid will be about 10.6 kilowatt-hours. If that charge will propel the car for 40 miles, then that works out to 3.8 miles per kWh (or about 0.27 kilowatt-hours per mile).
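The charging arithmetic, as a worked check (using the figures stated in this post):

```python
# Figures as stated in the post -- nominal values, not GM specifications.
battery_kwh = 16.0
usable_fraction = 0.9 - 0.3        # charged between ~30% and ~90% of capacity
loss_factor = 1.1                  # ~10% transmission + conversion losses
range_miles = 40.0

battery_energy = usable_fraction * battery_kwh   # energy put into the battery
grid_energy = battery_energy * loss_factor       # energy drawn from the grid
miles_per_kwh = range_miles / grid_energy
kwh_per_mile = grid_energy / range_miles

print(round(battery_energy, 1), round(grid_energy, 1),
      round(miles_per_kwh, 1), round(kwh_per_mile, 2))
```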

I cannot find the value of about 0.27 kWh per mile anywhere in the specifications for the Volt, but I did find this somewhat cryptic statement at Chevy.com:

“Under the new procedure, the EPA weights plug-in electric vehicles as traveling more city miles than highway miles on only electricity. The EPA procedure would also note 25 kilowatt-hours/100 miles electrical efficiency in the city cycle.”

So, let’s accept the value of 25 kilowatt-hours/100 miles (0.25 kWh per mile) for the moment. What effect will this have on the numbers I reported for CO2 emissions?

The number of pounds of CO2 emitted per mile while powering the car with gasoline (known as the “charge sustaining” mode) will remain unchanged.  There are 19.4 pounds of CO2 produced per gallon of gasoline burned, and GM claims 50 miles per gallon in “charge sustaining” mode.  So:

( 19.4 lbs of CO2 / gallon) / (50 miles / gallon) =
0.39 lbs of CO2 per mile

This 0.39 lbs of CO2 per mile for the Volt running on gasoline (charge sustaining mode) is the same as for the Toyota Prius, because it also gets 50 miles per gallon.

Here is the same calculation for my ancient 1988 Honda Civic hatchback that got 47 miles to the gallon:

( 19.4 lbs of CO2 / Gallon) / (47 miles / gallon) =
0.41 lbs of CO2 per mile

And for the 1987 Honda Civic Coupe HF, which got 57 miles per gallon:

( 19.4 lbs of CO2 / Gallon) / (57 miles / gallon) =
0.34 lbs of CO2 per mile

Let’s assume now that the Volt uses 0.25 kilowatt-hours per mile (“25 kilowatt-hours/100 miles”) when running off of power provided to the battery by the electric grid (known as the “charge depleting” mode). On average, the grid yields 1.34 pounds of CO2 per kilowatt-hour. The grid transmission losses and grid-to-battery conversion losses add up to about 10%. So the amount of CO2 yielded per mile will be:

(1.34 lbs of CO2 per grid kWh) x (0.25 kWh per mile)  x 1.1 =
0.37 lbs of CO2 per mile

Almost identical to the CO2 emitted when it is running off of gasoline (0.39 lbs of CO2 per mile). And it is nearly identical to the amount of CO2 per mile that the much cheaper Prius generates while running off of gasoline.

But here is the rub. If the Volt is driven in an area where the electricity is predominantly generated with coal (by far the most common source of electricity generation in the US), then the CO2 emissions go way up. That is because coal emits about 2.1 pounds of CO2 per kilowatt-hour generated for the electric grid. Again, we can assume 10% for the sum of the grid transmission losses and grid-to-battery conversion losses. Then the amount of CO2 that the Volt yields per mile driven in a region where coal is the primary source of electricity will be:

(2.1 lbs of CO2 per grid kWh) x (0.25 kWh per mile) x 1.1 =
0.58 lbs of CO2 per mile
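All of the CO2-per-mile figures above can be reproduced in a few lines (values as stated in this post):

```python
CO2_PER_GALLON = 19.4    # lbs CO2 per gallon of gasoline burned
CO2_AVG_GRID = 1.34      # lbs CO2 per kWh, US average grid
CO2_COAL_GRID = 2.1      # lbs CO2 per kWh, coal generation
KWH_PER_MILE = 0.25      # EPA draft figure quoted above
LOSSES = 1.1             # ~10% transmission + conversion losses

def gas_co2_per_mile(mpg):
    return CO2_PER_GALLON / mpg

volt_gas = gas_co2_per_mile(50)    # Volt in charge-sustaining mode
prius = gas_co2_per_mile(50)
civic_88 = gas_co2_per_mile(47)    # 1988 Civic hatchback
civic_87 = gas_co2_per_mile(57)    # 1987 Civic Coupe HF

volt_avg_grid = CO2_AVG_GRID * KWH_PER_MILE * LOSSES
volt_coal = CO2_COAL_GRID * KWH_PER_MILE * LOSSES

for label, v in [("Volt (gas)", volt_gas), ("Prius", prius),
                 ("Civic '88", civic_88), ("Civic '87 HF", civic_87),
                 ("Volt (avg grid)", volt_avg_grid),
                 ("Volt (coal grid)", volt_coal)]:
    print(f"{label:16s} {v:.2f} lbs CO2/mile")
```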

If we are really concerned about reducing CO2 (I’m not), saving energy (I am), creating American jobs (I am), and saving money (I am), then we should support the production of an American car that is similar to a 1988 Honda Civic. Why argue the merits of a \$40,000 car that few people will ever be able to afford? A \$15,000 car that gets as good or better mileage and generates as little or less CO2 would be bought by millions and have a much greater impact.