On March 8, 2013, a paper by Marcott et al. was published in Science: http://www.sciencemag.org/content/339/6124/1198.abstract
Using 73 global proxies, they claimed to extend the temperature reconstruction beyond the typical past 1,500 years all the way back 11,300 years. They also reported that the recent rate of warming (the past 150 years) is “unprecedented” in that entire time frame. Their graph is shown below:
You can see the dramatic increase in temperature (purple line) that is clearly, shockingly, different in the past 150 years. This gives strong evidence of human-caused climate change driven by the industrial revolution and the burning of fossil fuels.
Predictably, the press trumpeted the new study:
“The modern rise that has recreated the temperatures of 5,000 years ago is occurring at an exceedingly rapid clip on a geological time scale, appearing in graphs in the new paper as a sharp vertical spike.”— Justin Gillis, New York Times
“Rapid heat spike unlike anything in 11,000 years. Research released Thursday in the journal Science uses fossils of tiny marine organisms to reconstruct global temperatures …. It shows how the globe for several thousands of years was cooling until an unprecedented reversal in the 20th century.” – The Associated Press
“What that history shows, the researchers say, is that during the last 5,000 years, the Earth on average cooled about 1.3 degrees Fahrenheit — until the last 100 years, when it warmed about 1.3 degrees F.” – National Science Foundation
“We’re screwed: 11,000 years’ worth of climate data prove it.” — The Atlantic
Well this, of course, was quite an attention getter for those in the climate community and those of us who follow it. Questions were asked, the first being: how was the data derived? The 73 proxies were collected from various previous research studies; 31 of them were alkenone records from ocean sediment layers, produced by phytoplankton. Chemical properties in those layers can be correlated to temperature, giving a low-resolution temperature profile (averaged over several centuries) of the distant past. One problem is that when the cores are drilled, the top layers are usually destroyed, and scientists must carefully mark and date where the data becomes useful and robust. Typically very few of these proxies can be used for the 20th (or 21st) century; they are better for averaged trends of ocean temperatures dating back farther.
Had Marcott used this data as published and dated, his chart would have had no sharp uptick. In fact, this is the same data he used for his PhD thesis at Oregon State University in 2011, and that chart shows no skyrocketing temperature in the past 150 years:
So what happened?
It turns out Marcott RE-DATED a number of the core tops, changing the temperature values in the past 150 years and thus CREATING the huge spike in temperature. When challenged he finally confessed, “[The] 20th-century portion of our paleotemperature stack is not statistically robust, cannot be considered representative of global temperature changes, and therefore is not the basis of any of our conclusions.”
So that means his whole study shows nothing other than that the globe has been slowly cooling for the past 7,000 years, with bumps and dips, which we already knew. All the press releases were wrong.
Marcott is now trying to defend his original claim, saying that if you overlay our current 20th-century thermometer records onto the end of his proxy chart, it shows a dramatic spike in temperature. But you can’t just attach two completely different types of records and look at the trend line. His chart is made with low-resolution data, smearing all the temperature fluctuations over years or even centuries into a single layer with a single datapoint. You can’t compare that to temperature data now taken continuously with computers.
This is a common problem: people take our current computer-monitored temperature record and compare it to records of the past. We have low-resolution data for the distant past; the proxies average temperature over years, decades, or centuries. We have medium-resolution data for the pre-computer era; even the best methods of the recent past used thermometers read and recorded by a person once a day, during what they judged to be the hottest part of the day. Today we have high-resolution data: a computer continuously monitors the temperature to a tenth of a degree, and if it even briefly hits a higher number, that number is recorded. Obviously we will have many more “record” highs with this method.
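You can see the averaging effect with a toy calculation. The numbers below are entirely made up (synthetic random data, not real climate records); the point is only that a brief spike which dominates a daily instrument record nearly vanishes when the same series is averaged into century-long "layers":

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "true" history: 1,000 years of daily temperature anomalies
# (hypothetical values purely for illustration, not real climate data)
days_per_year = 365
years = 1000
temps = rng.normal(loc=0.0, scale=0.5, size=years * days_per_year)

# Add one brief 2-degree excursion covering only the final decade
spike_start = 990 * days_per_year
temps[spike_start:] += 2.0

# Low-resolution "proxy" view: one value per century, averaging everything
proxy = temps.reshape(10, 100 * days_per_year).mean(axis=1)

# High-resolution "instrument" view: daily readings from the last decade
instrument = temps[spike_start:]

print("century-average anomalies:", np.round(proxy, 2))
print("largest century average:", round(proxy.max(), 2))
print("largest daily reading in last decade:", round(instrument.max(), 2))
```

The spike fills only 10 of the final century's 100 years, so the century average dilutes it to a fraction of its size, while the daily record catches the full excursion plus noise peaks on top of it.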
A good metaphor for how comparing data of different resolutions creates a “hockey stick” spike is counting the number of curves on Interstate 80 from San Francisco to New York. Start with low-resolution data that averages: look at a map where 1 inch = 100 miles. This is like looking at a proxy (such as ocean layers or ice layers) where one layer = 100 years. Now count the number of curves (each red dot is a curve):
When I counted and did the calculation I got about 0.05 curves/mile (or 1 curve every 20 miles).
But now let’s change to high resolution data. As we approach New York, change your resolution so one inch = 1 mile. This is like changing to using computer monitored temperature data:
I now calculate 1.25 curves/mile (or 1 curve every 0.8 miles). OH MY, EXTREME HIGHWAY CHANGES!
Both measurements are “scientific”. But you can’t directly compare one data set to the other. The road didn’t suddenly become 25 times twistier; you’re just looking at it with more resolution.
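For anyone who wants to check that 25x figure, it is just the ratio of the two curve counts (a trivial sanity check using the numbers from my counts above):

```python
# The two "measurements" from the I-80 metaphor
low_res = 0.05   # curves per mile counted on the 1 inch = 100 miles map
high_res = 1.25  # curves per mile counted on the 1 inch = 1 mile map

print(round(1 / low_res, 1))          # miles per curve at low resolution: 20.0
print(round(1 / high_res, 1))         # miles per curve at high resolution: 0.8
print(round(high_res / low_res))      # apparent jump in "twistiness": 25
```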
We don’t have EXTREME weather today; we are just looking at it with more resolution than in the past. 1000 years from now, if we pull up the ice cores or ocean layers, they will show a slow warming or slow cooling just like in the past, not an extreme spike.