New Analysis Confirms: The Heat is On

The Economist says a new analysis of the
global average temperature record leaves little room for the doubters. Berkeley
Earth’s results, which were released on 20 October, offer strong support to the
existing temperature compilations. Over the past 50 years the land surface
warmed by 0.911°C. At a time of exaggerated doubts about the instrumental
temperature record, this should help promulgate its main conclusion: that the
existing mean estimates are in the right ballpark. That means the world is
warming fast.

The Economist (22 October 2011):

FOR those who question whether global warming
is really happening, it is necessary to believe that the instrumental
temperature record is wrong. That is a bit easier than you might think.

There are three compilations of mean global
temperatures, each one based on readings from thousands of thermometers, kept
in weather stations and aboard ships, going back over 150 years. Two are
American, provided by NASA and the National Oceanic and Atmospheric
Administration (NOAA), one is a collaboration between Britain’s Met Office and
the University of East Anglia’s Climatic Research Unit (known as Hadley CRU).
And all suggest a similar pattern of warming, amounting to about 0.9°C over
land in the past half century.

To most scientists, that is consistent with
the manifold other indicators of warming—rising sea-levels, melting glaciers,
warmer ocean depths and so forth—and convincing. Yet the consistency among the
three compilations masks large uncertainties in the raw data on which they are
based. Hence the doubts, husbanded by many eager sceptics, about their
accuracy. A new study, however, provides further evidence that the numbers are
probably about right.

The uncertainty arises mainly because weather
stations were never intended to provide a climatic record. The temperature
series they give tend therefore to be patchy, and even where the stations are
relatively abundant, as in western Europe and America, they often contain
inconsistencies. They may have gaps, or readings taken at different times of
day, or with different kinds of thermometer. The local environment may have
changed. Extrapolating a global average from such data involves a certain amount
of tinkering—or homogenisation.

It might involve omitting especially awkward
readings; or where, for example, a heat source like an airport has sprung up
alongside a weather station, inputting a lower temperature than the data show.
As such cases are mostly in the earlier portions of the records, this will
exaggerate the long-term warming trend. That is at best imperfect. And for
those—including Rick Perry, the Republican governor of Texas and would-be
president—who claim to see global warming as a hoax by grant-hungry
scientists, it may look like a smoking gun.
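
As a rough illustration of the kind of adjustment described above, the hypothetical sketch below shifts readings from a station flagged as sitting next to a new heat source down by an assumed bias before averaging. The station records, the flag and the bias value are invented for the example and are not any agency’s actual procedure or data.

```python
# Illustrative only: a made-up "homogenisation" step that removes an assumed
# heat-source bias from flagged station readings. Station records, the flag
# and the bias value are invented; this is not any agency's actual procedure.

readings = [
    {"station": "A", "year": 1995, "temp_c": 14.2, "near_heat_source": False},
    {"station": "B", "year": 1995, "temp_c": 15.1, "near_heat_source": True},
    {"station": "B", "year": 2005, "temp_c": 15.6, "near_heat_source": True},
]

ESTIMATED_BIAS_C = 0.4  # assumed warming contributed by the nearby heat source


def homogenise(record):
    """Return the reading with the assumed heat-source bias subtracted."""
    if record["near_heat_source"]:
        return record["temp_c"] - ESTIMATED_BIAS_C
    return record["temp_c"]


adjusted = [homogenise(r) for r in readings]
print(adjusted)  # the flagged readings come out 0.4°C lower
```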

To build confidence in their methodologies,
NASA and NOAA already publish their data and algorithms. Hadley CRU is now
doing so. A grander solution, outlined in a forthcoming Bulletin of the
American Meteorological Society, would be to provide a single online databank
of all temperature data and analysis. Part of the point would be to encourage
more scientists and statisticians to test the existing analyses—and a group
backed by Novim, a research outfit in Santa Barbara, California, has recently
done just that.

Inconvenient data

Marshalled by an astrophysicist, Richard
Muller, this group, which calls itself the Berkeley Earth Surface Temperature project,
is notable in several ways. When embarking on the project 18 months ago, its
members (including Saul Perlmutter, who won the Nobel prize for physics this
month for his work on dark energy) were mostly new to climate science. And Dr
Muller, for one, was mildly sceptical of its findings. This was partly, he
says, because of “climategate”: the 2009 revelation of e-mails from scientists
at CRU which suggested they had sometimes taken steps to disguise their
adjustments of inconvenient palaeo-data. With this reputation, the Berkeley
Earth team found it unusually easy to attract sponsors, including a donation of
$150,000 from the Koch Foundation.

Yet Berkeley Earth’s results, as described in
four papers currently undergoing peer review, but which were nonetheless
released on October 20th, offer strong support to the existing temperature
compilations. The group estimates that over the past 50 years the land surface
warmed by 0.911°C: a mere 2% less than NOAA’s estimate. That is despite its use
of a novel methodology—designed, at least in part, to address the concerns of
what Dr Muller terms “legitimate sceptics”.

Most important, Berkeley Earth sought an
alternative way to deal with awkward data. Its algorithm attaches an automatic
weighting to every data point, according to its consistency with comparable
readings. That should allow for the inclusion of outlandish readings without distorting
the result. (Except where there seems to be straightforward confusion between
Celsius and Fahrenheit, which is corrected.) By avoiding traditional procedures
that require long, continuous data segments, the Berkeley Earth methodology can
also accommodate unusually short sequences: for example, those provided by
temporary weather stations. This is another innovation that allows it to work
with both more and less data than the existing compilations, with varying
degrees of certainty. It is therefore able to compile an earlier record than
its predecessors, starting from 1800. (As there were only two weather stations
in America, a handful in Europe and one in Asia for some of that time, it has a
high degree of uncertainty.) To test the new technique, however, much of the
analysis uses the same data as NOAA and NASA.
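
To make the weighting idea concrete, here is a toy Python sketch of one plausible approach: each reading is down-weighted according to how far it sits from the median of comparable readings, so an outlandish value (such as a Celsius/Fahrenheit mix-up) is included but barely moves the average. This is not Berkeley Earth’s published algorithm; the weight function and sample values are assumptions made purely for illustration.

```python
# Toy illustration of consistency weighting: each reading's weight shrinks
# with its distance from the median of comparable readings, so an outlier is
# kept but contributes little. This is NOT Berkeley Earth's published
# algorithm; the weight function and sample values are assumptions.

import statistics


def consistency_weighted_mean(readings_c):
    """Average readings, down-weighting values far from the peer median."""
    med = statistics.median(readings_c)
    # Robust spread estimate (median absolute deviation); guard against zero.
    spread = statistics.median(abs(r - med) for r in readings_c) or 1.0
    weights = [1.0 / (1.0 + ((r - med) / spread) ** 2) for r in readings_c]
    return sum(w * r for w, r in zip(weights, readings_c)) / sum(weights)


# Four consistent readings plus one value that looks like a Celsius/Fahrenheit
# mix-up; the weighted mean stays close to 12°C, whereas a plain mean would not.
sample = [12.1, 12.4, 11.9, 12.3, 54.0]
print(round(consistency_weighted_mean(sample), 2))
```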

Heat maps

In another apparent innovation, the Berkeley
team has written into its analysis a geospatial technique, known as kriging,
which uses the basic spatial correlations in weather to estimate the
temperature at points between weather stations. This promises to provide a more
nuanced heat map than that presented in the existing compilations, which either
consign an average temperature to an area defined by a grid square or, in the
case of NASA, attempt a less ambitious interpolation.
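
For readers unfamiliar with kriging, the minimal sketch below conveys the flavour: the anomaly at an unobserved point is estimated as a combination of nearby station values, with weights that decay with distance. Real kriging fits a spatial correlation model (a variogram) and solves for optimal weights; the decay scale and station values here are invented for illustration.

```python
# Minimal sketch in the spirit of kriging: estimate the anomaly at a point
# from nearby station values with weights that decay with distance. Real
# kriging fits a variogram and solves a linear system for the weights; the
# correlation length and station values here are invented for illustration.

import math

stations = [
    # (latitude, longitude, temperature anomaly in °C) -- made-up values
    (51.5, -0.1, 0.8),
    (48.9, 2.3, 0.6),
    (52.5, 13.4, 1.1),
]

CORRELATION_LENGTH_DEG = 5.0  # assumed decay scale of the spatial correlation


def estimate_anomaly(lat, lon):
    """Combine station anomalies with weights that decay with distance."""
    weights, values = [], []
    for s_lat, s_lon, s_anom in stations:
        dist = math.hypot(lat - s_lat, lon - s_lon)
        weights.append(math.exp(-((dist / CORRELATION_LENGTH_DEG) ** 2)))
        values.append(s_anom)
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)


print(round(estimate_anomaly(50.0, 5.0), 2))  # anomaly at a point between stations
```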

It will be interesting to see whether this
makes it past the review process. Peter Thorne, a climatologist at the
Co-operative Institute for Climate and Satellites, in North Carolina, describes
it as “quite a hard sell in periods that are data sparse”. He adds: “That
doesn’t mean you can’t do it. It means you’ve got to prove it works.”

Two of the Berkeley Earth papers address
narrower concerns. One is the poor location of many weather stations. A
crowd-sourcing campaign by a meteorologist and blogger, Anthony Watts,
established that most of America’s stations are close enough to asphalt,
buildings or other heat sources to give artificially high readings. The other
is the additional warming seen in built-up areas, known as the “urban
heat-island effect”. Many sceptics fear that, because roughly half of all
weather stations are in built-up areas, this may have inflated estimates of a
temperature rise.

The Berkeley Earth papers suggest their
analysis is able to accommodate these biases. That is a notable, though not
original, achievement. Previous peer-reviewed studies—including one on the
location of weather stations co-authored by Mr Watts—have suggested the mean
surface temperatures provided by NOAA, NASA and Hadley CRU are also not
significantly affected by them.

Yet the Berkeley Earth study promises to be
valuable. It is due to be published online with a vast trove of supporting
data, merged from 15 separate sources, with duplications and other errors
clearly signalled. At a time of exaggerated doubts about the instrumental
temperature record, this should help promulgate its main conclusion: that the
existing mean estimates are in the right ballpark. That means the world is
warming fast.

Source: www.economist.com
