Climate scientists, weather forecasters and policy makers are gathered in Geneva this week to discuss the need for reliable climate predictions to help society adapt to climate change.
The third World Climate Conference of the World Meteorological Organization, which runs from 31 August to 4 September 2009, aims to produce a new global framework for delivering climate information to end users.
Scientists at the conference are hopeful that, with sustained support for climate research and improved computing capabilities, they could reliably predict climate impacts at much higher resolution – perhaps down to several tens of kilometres – over the coming decades. The ultimate goal, one that was voiced at last year’s World Climate Modelling Summit in Reading (which we covered here), is to produce climate predictions that are as reliable and useable as weather forecasts.
That would be a vast improvement on the projections available from today’s global climate models. Most of these provide estimates of how temperature and other climate variables, such as rainfall, will change over areas several hundred kilometres across, up to the end of the century and beyond.
While a large focus of the conference is on improving climate modelling in order to make reliable predictions, delegates in Geneva are also discussing the need to tailor information to the needs of specific end-users.
“A forecast is not enough; our challenge is to communicate what we know about the future in a manner that can allow people to make decisions”, said Gro Harlem Brundtland, special envoy of the UN secretary-general on climate change, at the opening session on Monday. At the end of the five-day conference, delegates will issue a declaration of their intent to establish a new global framework to meet this challenge.
But major advances will be needed in the science of prediction before climate information is of real service to society. “In 10–15 years we may have climate forecasts like we now have weather forecasts”, Guy Brasseur of the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, told delegates here on Monday. “We have the international vision, expertise and scientific commitment to deliver climate services”, said Brasseur, but he also warned of the difficulties in developing climate models of sufficiently high spatial resolution and reliability.
One hurdle is the massive investment needed to fund one or more supercomputers; others include accessing data and sustaining long-term observations. Despite these obstacles, several attempts to improve predictive capability are underway worldwide. One of these, being championed by Tim Palmer of the European Centre for Medium-Range Weather Forecasts, is the concept of ‘seamless prediction’, in which one modelling system is used to predict atmospheric conditions on time scales varying from hours to decades. Other efforts, headed by scientists from the UK Met Office and elsewhere, focus on how climate change will pan out in the coming decades, and combine aspects of seasonal forecasting with centennial prediction (which I won’t go into now, but hope to come back to). Still another approach is Earth System Modelling, which attempts to model the whole Earth system – including feedbacks – more comprehensively than climate models, at spatial scales of around 150 km.
Scientists at the conference are excited at the possibility that such efforts will lead to greater predictability, but are also concerned that end-users could have unrealistic expectations of what that means. “We’ll never be able to produce absolute predictions of what will happen in the future”, says Vicky Pope of the UK Met Office. She says that scientists must work within a risk-management framework so that people don’t misuse the data. “We are nervous about the uncertainties and errors associated with the models we are using”, says Jerry Meehl of NCAR, adding, “That needs to be part of the message that gets out with climate services”.