
The location, variance and shape of statistical climatology

As the curtain falls on the 12th International Meeting on Statistical Climatology (IMSC), it is time to spend a few moments to reflect on the past week.

Location

Every three years, the world’s statisticians and climate scientists get together to share results, methodologies and insights in a joint effort to improve the rigour of climate research and develop novel approaches to applied statistics. This year the meeting took place on the beautiful, if a little cloudy and humid, island of Jeju in South Korea. Aside from the multitude of Korean and Chinese tourists, the island also attracts scientists with its excellent conference facilities. The International Conference Centre (shown below when the sun was shining) was our venue for the week.

The conference centre was part of a larger area known as the Jungmun Resort, which contained a number of hotels, eateries and even a teddy bear museum to entertain the discerning traveller. The CSAG contingent, and indeed many other participants, stayed in Hotel Hana, which was more than adequate, living up to its online promise: “We will try to become a hotel that does its best, even though we are not the top”.

Variance

The IMSC meeting provides a forum for established and early-career scientists to learn about a range of topics and methods being applied across the world. From the investigation of “medicanes” (Mediterranean hurricanes – yes, they do exist) to the analysis of Greenland’s tip-jets and results of the latest CMIP5 simulations, this year’s IMSC didn’t disappoint in providing a wide-ranging programme. As with any conference, there are always highlights and inevitably a few opportunities to take a much-needed 15-minute snooze, but in the spirit of positive biases I will focus my attention on what I found to be the more fascinating and provocative presentations.

One of the first talks to spark interest amongst the audience was by Kate Willett from the UK Met Office. She invited climate scientists and statisticians alike to the “playground” of the International Surface Temperature Initiative. The project aims to provide benchmarked climate data and has designed a novel methodology to uncover and assess biases in the surface temperature record, assigning scientists to three teams called team creation, team corruption and team validation.

Much of my work involves analysing different sources of climate change uncertainty, so I was glad to have the opportunity to chat to Ed Hawkins from the University of Reading; his highly cited 2009 paper with Rowan Sutton made several appearances at this year’s conference. In his talk, he commented on the potential causes of the global mean temperature “hiatus” since 1998. This led to a lively discussion with Hans von Storch which certainly made for an entertaining watch; those who were there will no doubt have an opinion about who came out on top! For me, this type of encounter illustrates the value of the IMSC meeting. The statistics community has a hugely important role in holding climate scientists to account and ensuring that scientists don’t oversell their results. However, climate scientists also have a responsibility to inform statisticians about the value of particular statistical approaches, since knowledge of the relevant physics often renders certain statistical methods inappropriate. It is in this context that I particularly liked a comment in one of Douglas Maraun’s talks where he said “significance is not relevance”.
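
To see what that quote gets at, here is a quick toy sketch of my own (nothing presented at the meeting, and the numbers are entirely made up): with a long enough record, even a physically negligible trend will sail through a standard significance test, which is precisely why significance alone says nothing about relevance.

    # Toy sketch: statistical significance without practical relevance.
    # A synthetic "temperature" series with a physically negligible trend
    # still gives a vanishingly small p-value once the record is long enough.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    n = 50000                              # a very long synthetic record
    tiny_trend = 1e-5                      # negligible change per time step
    time = np.arange(n)
    series = tiny_trend * time + rng.normal(0.0, 1.0, size=n)

    fit = stats.linregress(time, series)
    print(f"slope = {fit.slope:.1e} per step, p-value = {fit.pvalue:.1e}")
    # The p-value is essentially zero ("highly significant"), yet the trend
    # is dwarfed by the step-to-step noise: significant, but hardly relevant.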

Each morning, the conference started with a series of invited plenary talks and then proceeded with parallel sessions. With so many relevant talks running in parallel, I inevitably missed some interesting presentations. The topic of detection and attribution received a large amount of attention, and unfortunately I missed two apparently very interesting talks on this topic by Daithi Stone, who needs no introduction to CSAG. A neat talk by a scientist from MeteoSwiss, which I fortunately did attend while other CSAG members were busy elsewhere, addressed model “genealogy”, commenting on the ancestry of climate models. His conclusion was that models which are closer in their ancestral history (e.g. built using shared code) do in fact produce more similar results, and this might have implications for how we weight models or assess model spread.

On the final day, a talk which particularly captured my imagination came from Chris Ferro of the University of Exeter. He presented results from a paper currently in press, titled “On judging the credibility of climate projections”. The talk involved audience participation: he asked people to make predictions about the relative errors of models and observations. The context was decadal forecasting, but the results have more general implications for how we might judge the predictive capabilities of climate models on a number of different time scales.

Shape

It doesn’t take long for conference fatigue to set in, and by mid-week I was beginning to wonder whether I was really getting all that much from the conference. However, in the last couple of days, and notably in Beijing airport on the way home, I had several very interesting conversations and debates with CSAG colleagues and other meeting participants. Many of the results and insights from the conference became focal points in these discussions, and I realised that I had learned a great deal. The talks and posters all seemed interesting enough at the time, but it is only when you discuss the results with colleagues that you begin to appreciate how different concepts and methodologies affect your thinking. Indeed, it may be a cliché, but the most memorable and rewarding elements of attending conferences are not the presentations themselves but the lively debates that follow. This was certainly the case for IMSC 2013, and hopefully this will remain the case for IMSC meetings in the years to come. Incidentally, the 2016 meeting will be in Vancouver… these conference organising committees sure do know how to pick a good location!

So is the world of statistical climatology in better shape owing to the last few years of methodological advances and improvements in climate analysis? Yes. Yes it is… well, at least according to my analysis it is, to the 90% confidence level.

3 Responses to “CSAG at IMSC 2013”

  1. Stefaan Conradie

    I also find the statement “significance is not relevance” rather “significant”. I remember one of our Statistics lecturers regularly reminding us that “statistical significance does not equate to practical significance” – which appears to me to be a statement in a similar vein. As a result I was quite surprised, when I started exploring literature in various fields where statistical significance tests are conducted, to find that it appears to be very common to equate these two ideas, at least to some extent. Is this the case in other people’s experience as well?

  2. Joseph

    I think this is because confidence is a scarce commodity in climate analysis! I agree with your implied sentiment that climatologists should strive to use higher confidence levels. Indeed, as a community, we need to better recognise what levels of confidence might be required to support statements about climate variability and climate change.

  3. Stefaan Conradie

    “to the 90% confidence level” – I find it interesting that, while, as far as I can tell, in most applications of statistics 95% or 99% confidence intervals are used, in climatology we generally use the 90% confidence level as a threshold. Is there any particular reason for this?