Wednesday, January 19, 2011

Global Warming Debate Basics Part 2: How good is the Temperature Record?
David Stevenson - Caesar Rodney Institute

What you don’t know could cost you thousands of dollars a year. An increasing number of scientists and policy professionals have declared that public policies to reduce carbon dioxide emissions are based on uncertain science. This notion recently gained momentum when former Vice President Al Gore confessed that his support for ethanol was a mistake, driven by his desire to garner support for his presidential aspirations.

The US House of Representatives plans to explore this with hearings at which both sides of the research will be heard. This will be a first and a teaching moment for all of us - and our children. Because the US Environmental Protection Agency has declared carbon dioxide a pollutant, these hearings will have important consequences: Congress has refused to regulate CO2, and the EPA regulation is an attempt to go around Congress. We summarize here what to look for in the hearings.

Besides the debate over the cause of temperature changes, there is still a debate about the accuracy of the land-based temperature record. Widespread, systematic weather reporting began about 1880, coincidentally about the time the Industrial Revolution began and CO2 levels began to rise. The reported global averages you often see published are “adjusted” temperatures.

A formula is used to adjust for station variability and measurement variability: stations move geographically, thermometer styles change, the time of day of measurement can change, and the surroundings of a station can change. The latest version of the adjustment formula was developed by Matthew J. Menne and collaborators at NOAA’s National Climatic Data Center. Menne states that his formula tends to make older temperature readings look colder and more recent readings look warmer; in fact, he admits that global warming trends virtually disappear without these adjustments. So the efficacy of the formula and the quality of the raw temperature readings are critical. A simplified illustration of this kind of station adjustment follows the list below. Consider these problems:

· Incredibly, NOAA has refused to release the formula for outside review. Scientists have estimated how the formula works and have questioned its accuracy.

· The raw global data, collected and stored at Britain’s University of East Anglia, is missing.

· NOAA has design standards and rates the weather stations for issues that affect accuracy, such as nearby paving or heat exhaust. A station rated “3” to “5” can be off by ±2 °F to 9 °F. Over 90% of the stations in the US are rated this poorly. Therefore, we are trying to find a 1 °F/century signal from data with an average error bar eight times as large!

· The UN study uses temperature proxies such as ice cores, tree rings, and sea level rise to support thermometer readings. Every time critical reviewers look more closely at the studies or the sample size is expanded, the global warming signal disappears. CRI will discuss this in more detail in a future report.
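
Because NOAA’s actual formula has not been released, the sketch below is only a rough, hypothetical illustration of the kind of adjustment described above: compare a station against an average of its neighbors, look for an abrupt step in the difference (the sort of thing a station move or new thermometer produces), and shift the earlier segment of the record to remove it. It is written in Python, and every number in it (the station values, the neighbor count, the size of the step) is invented for the demonstration; none of it comes from NOAA’s code or from any real station record.

# Hypothetical sketch only; NOT NOAA's actual adjustment method, which has not
# been released. It shows the general "homogenization" idea described above:
# compare a station with its neighbors, find an abrupt step in the difference,
# and shift the earlier segment so the record is continuous.
import numpy as np

def adjust_for_breakpoint(station, neighbors):
    """Shift the pre-breakpoint part of `station` so its difference from the
    neighbor average has no step. Inputs are annual means in degrees F."""
    reference = neighbors.mean(axis=0)          # average of the nearby stations
    diff = station - reference                  # station-minus-neighbors series

    # Crude breakpoint test: find the year where the mean of the difference
    # before the split and the mean after it are farthest apart.
    best_year, best_step = None, 0.0
    for k in range(3, len(diff) - 3):           # keep a few years on each side
        step = abs(diff[k:].mean() - diff[:k].mean())
        if step > best_step:
            best_year, best_step = k, step

    adjusted = station.copy()
    if best_year is not None:
        # Shift everything before the breakpoint by the size of the step.
        adjusted[:best_year] += diff[best_year:].mean() - diff[:best_year].mean()
    return adjusted, best_year, best_step

# Demo with fabricated numbers: a station that was moved halfway through the record.
rng = np.random.default_rng(0)
years = 30
background = np.linspace(55.0, 55.5, years)                # slow underlying trend
neighbors = background + rng.normal(0, 0.2, (5, years))    # five nearby stations
station = background + rng.normal(0, 0.2, years)
station[:15] += 1.0                                        # pre-move readings ran warm

adjusted, yr, step = adjust_for_breakpoint(station, neighbors)
print(f"breakpoint near year index {yr}, step of about {step:.2f} F")
print("raw trend      (F/decade):", round(np.polyfit(np.arange(years), station, 1)[0] * 10, 2))
print("adjusted trend (F/decade):", round(np.polyfit(np.arange(years), adjusted, 1)[0] * 10, 2))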

The latest version of the formula ignores the urban heat island effect, in which average temperatures near cities run 2 to 10 degrees higher. The dramatic impact of urbanization can be seen in the trend lines of adjusted temperature history from Milford, DE, in rural Sussex County (0.9 °F/century) and Newark, DE, in New Castle County (2.6 °F/century), which has ten times the population density. The Milford data is confirmed by stations in Greenwood and Dover, and larger studies show similar results. For a copy of this document with charts and footnotes, please go to www.caesarrodney.org and go to Issues, then the Center for Energy Competitiveness.
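
The 0.9 °F/century and 2.6 °F/century figures above are the kind of numbers you get by fitting a straight line to a station’s annual mean temperatures and scaling the slope to a century. The short Python sketch below shows that calculation; the yearly values in it are fabricated series built to have roughly those trend sizes and are not the actual Milford or Newark records.

# Minimal sketch of how a station trend in F/century is commonly computed:
# fit a straight line (least squares) to annual mean temperatures and scale
# the slope. The data below are made up for illustration only.
import numpy as np

def trend_f_per_century(years, annual_mean_f):
    slope_per_year = np.polyfit(years, annual_mean_f, 1)[0]
    return slope_per_year * 100.0

years = np.arange(1900, 2011)
rng = np.random.default_rng(1)

# Fabricated "rural" series: small underlying trend plus year-to-year noise.
rural = 55.0 + 0.009 * (years - 1900) + rng.normal(0, 0.8, years.size)
# Fabricated "urban" series: larger apparent trend, as a growing heat island would produce.
urban = 55.0 + 0.026 * (years - 1900) + rng.normal(0, 0.8, years.size)

print(f"rural-style trend: {trend_f_per_century(years, rural):.1f} F/century")
print(f"urban-style trend: {trend_f_per_century(years, urban):.1f} F/century")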
