The big data and climate FRONTIER: making sense of the explosive increase in climate data through smart designs and big data methods

The next frontier of regional climate change is not in producing more data, but in producing more information.

Climate information is becoming increasingly important for effective planning, adaptation and mitigation of the future costs and disruptions arising from climate variability and change. However, this growing demand for climate information is driving an explosive increase in the volume of climate data, which creates obstacles for the users who need to access it. Working with such large volumes of climate data requires highly specialised skills and tools that are unaffordable for many users. An emerging challenge in climate change research is therefore to ensure that climate information remains accessible to all users and stakeholders. FRONTIER aims to address this challenge through two key issues: data production and data analysis.

The primary objective of FRONTIER is to mitigate the challenges associated with the exponential increase in climate data expected over the next decade, using smart design processes and big data methods.

Achieving this goal requires an interdisciplinary approach that draws on expertise and knowledge from climate science and mathematics. The methodology therefore sits at the frontier between these disciplines and is based on:

  1. Developing new process-based model analysis metrics for societally relevant processes to improve data analysis and minimise data production;
  2. Developing a new reduced set of performance metrics to simplify and improve efficiency in data analysis;
  3. Exploring a new approach to multi-model ensembles to minimise data production.
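To make point 2 concrete, a performance metric of the kind referred to above typically condenses a model field into a single summary number that can be used to compare ensemble members. The following is a minimal illustrative sketch, not FRONTIER's actual metric: it assumes model output and observations on a common latitude–longitude grid, and the function name, grid, and synthetic data are all invented for illustration. It computes an area-weighted RMSE and uses it to rank a toy multi-model ensemble.

```python
import numpy as np

def area_weighted_rmse(model, obs, lats):
    """Area-weighted RMSE between a model field and observations.

    model, obs: 2-D arrays (lat, lon) on a common grid (an assumption
    made for this sketch); lats: 1-D array of latitudes in degrees.
    """
    # Weight each grid cell by cos(latitude), a standard proxy for cell area
    weights = np.cos(np.deg2rad(lats))[:, np.newaxis]
    weights = np.broadcast_to(weights, model.shape)
    sq_err = weights * (model - obs) ** 2
    return np.sqrt(sq_err.sum() / weights.sum())

# Rank a small synthetic ensemble by this single summary metric
rng = np.random.default_rng(0)
lats = np.linspace(-90, 90, 19)
obs = rng.normal(size=(19, 36))                      # stand-in observations
models = {f"model_{i}": obs + rng.normal(scale=0.5 + 0.2 * i, size=obs.shape)
          for i in range(3)}                          # increasingly noisy models
scores = {name: area_weighted_rmse(field, obs, lats)
          for name, field in models.items()}
ranking = sorted(scores, key=scores.get)              # best (lowest RMSE) first
```

Reducing each simulation to a handful of such numbers is what allows large ensembles to be compared without moving or storing the full fields, which is the efficiency gain the reduced metric set targets.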

Recent advances in big data analysis and the design of experiments, combined with the latest developments in international climate modelling initiatives, mean that now is the opportune time for a breakthrough in the way we produce and analyse climate model data.