NOAA’s Climate Scientists Predict the Future by Looking at the Past

Raymond McIntyre

NOAA is a scientific agency of the United States federal government responsible for monitoring our climate and environment and for taking steps to preserve them.

Climate scientists at the National Oceanic and Atmospheric Administration (NOAA) are going back over the records of meteorologists from the 19th and 20th centuries. The intent is to feed the information gathered by these unsung climate heroes into a pair of supercomputers and then to compare yesterday’s climate with today’s.

They hope to find out whether there are more extreme weather events now than there were in the late 19th and early 20th centuries, and whether today’s climate models can reproduce the meteorological events of the past. If the models can, scientists can trust them to project global warming accurately.
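To illustrate the kind of comparison the scientists have in mind, here is a minimal sketch in Python. The daily wind-speed series and the “extreme” threshold are invented for illustration; the real project would draw both eras from the reanalysis itself.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # Hypothetical daily wind-speed series (m/s) for two 30-year eras.
    # In the real project these would come from the reanalysis, not a
    # random-number generator.
    early_era = rng.gamma(shape=2.0, scale=5.0, size=365 * 30)
    late_era = rng.gamma(shape=2.0, scale=5.5, size=365 * 30)

    THRESHOLD = 25.0  # arbitrary "extreme wind" cutoff, in m/s

    def extreme_days_per_year(daily, threshold):
        """Average number of days per year exceeding the threshold."""
        years = len(daily) / 365
        return np.count_nonzero(daily > threshold) / years

    print("early era:", extreme_days_per_year(early_era, THRESHOLD))
    print("late era: ", extreme_days_per_year(late_era, THRESHOLD))

If the late era shows markedly more threshold-crossing days than the early one, that is the signal of a shift toward more extreme events.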

According to Gil Compo, who heads the project, “It’s that connection to our scientific forebears that makes this project so exciting. These guys went out at tremendous risk to life and limb to collect the data. Now, in ways they never could have imagined, we are going to use data from those dedicated meteorologists to figure out how much weather and climate have changed over the past 150 years.”

The project has pulled together thousands of disparate weather records: readings from weather balloons and aircraft, along with surface temperatures, wind speeds, and barometric pressures. This data was first analyzed in the late 1930s, when records covering around 40 years were studied retrospectively with a view to helping the Allied war effort.
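Merging such disparate sources means converting every record to a common schema and to common units. A minimal sketch of that normalization step might look like the following; the field names and the ship-log layout are hypothetical, though the unit conversions are standard.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Observation:
        """One reading in a common schema, whatever instrument it came from."""
        when: datetime
        lat: float
        lon: float
        pressure_hpa: float = float("nan")
        temperature_c: float = float("nan")
        source: str = "unknown"  # e.g. "ship", "station", "balloon"

    def from_ship_log(entry):
        """Convert a hypothetical ship-log entry (pressure in inches of
        mercury, temperature in Fahrenheit) to the common schema."""
        return Observation(
            when=datetime.fromisoformat(entry["utc"]),
            lat=entry["lat"],
            lon=entry["lon"],
            pressure_hpa=entry["pressure_inhg"] * 33.8639,  # inHg -> hPa
            temperature_c=(entry["temp_f"] - 32) * 5 / 9,   # F -> C
            source="ship",
        )

    print(from_ship_log({"utc": "1884-06-01T12:00:00", "lat": 51.5,
                         "lon": -30.2, "pressure_inhg": 29.92,
                         "temp_f": 58.0}))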

Version one of the project covers the years 1908 to 1958. It takes the equivalent of 33,000 processors running non-stop for three months, Compo says. The model calculates four readings a day at 200-kilometer intervals around the world. The computing power needed to calculate each year of the project is phenomenal: it consumes the equivalent of 224 processors on Bassi, an IBM Power5 system, and Seaborg, an IBM Power3 system, at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC).
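A back-of-envelope calculation hints at why the computing demands are so heavy. The figures below are rough assumptions for illustration, not NOAA’s actual grid layout.

    # Rough size of a 200-km global analysis grid, four times a day,
    # over version one's 1908-1958 span. Illustrative arithmetic only.
    import math

    EARTH_RADIUS_KM = 6371
    surface_km2 = 4 * math.pi * EARTH_RADIUS_KM ** 2  # ~5.1e8 km^2

    cells = surface_km2 / 200 ** 2    # grid cells at 200-km spacing
    analyses_per_year = 4 * 365       # four analyses per day
    years = 1958 - 1908 + 1

    total = cells * analyses_per_year * years
    print(f"{cells:,.0f} cells x {analyses_per_year} analyses/yr x "
          f"{years} yr = {total:,.0f} gridded analyses")

Even at this coarse spacing, the model must produce on the order of a billion grid-point analyses, each involving the full physics of the atmosphere.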

Version two is going to be even more challenging. It will cover the years 1871 to 2009 and will run on Franklin at NERSC and on Jaguar, the Cray XT5 at Oak Ridge. This version will draw on additional data recorded by the explorers who searched for the lost Franklin expedition. “They took pressure and temperature readings,” Compo says. “We’ll take their pressure observations from wherever they reported themselves as being. We’ll combine those with all the others taken that day within six hours and we’ll get the model.” It will also take extra variables into account, such as solar activity, carbon dioxide levels, and volcanic aerosols, to help reconstruct the most accurate historical weather maps.
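One plausible reading of the six-hour window Compo describes is that each historical reading gets matched to the nearest of the four daily analysis times. A minimal sketch of that binning step, with invented timestamps:

    from datetime import datetime, timedelta

    def nearest_analysis_time(obs_time):
        """Snap an observation to the nearest 6-hourly analysis time
        (00, 06, 12 or 18 UTC)."""
        midnight = obs_time.replace(hour=0, minute=0,
                                    second=0, microsecond=0)
        slot = round((obs_time.hour + obs_time.minute / 60) / 6) * 6
        return midnight + timedelta(hours=slot)

    for stamp in ("1850-04-03T14:47", "1850-04-03T21:30"):
        obs = datetime.fromisoformat(stamp)
        print(obs, "->", nearest_analysis_time(obs))

A reading logged at 14:47 UTC lands in the 12 UTC window, while one logged at 21:30 UTC rolls forward to 00 UTC the next day.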

“This will give us the ability to compare how our state-of-the-art model correlates with historical weather data,” Compo says. “It should give us confidence in our understanding of how our weather will change in future decades.”

IEDRO wonders how much more powerful this model would be if more than 100 years of weather data, collected worldwide, were used.

Reference:

http://www.aoml.noaa.gov/hrd/data_sub/re_anal.html
