Climate change is often communicated by looking at the global average temperature. But a global average might not mean much to the average person. How the climate is likely to change specifically where people live is, in most cases, a much more important consideration.
Carbon Brief has combined observed temperature changes with future climate model projections to show both how the climate has changed up to the present day and how it might change in the future in every part of the world.
To do this, the world has been broken up into “grid cells” of one degree latitude by one degree longitude. This results in 64,800 grid cells, which are typically about 100 kilometers wide. (In reality, they are a bit larger at the equator and smaller close to the poles.)
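This latitude dependence is simple spherical geometry: a degree of longitude spans roughly 111km multiplied by the cosine of the latitude. A minimal sketch of the arithmetic (the function name and the 111.32km figure are illustrative, not taken from Carbon Brief's code):

```python
import math

N_CELLS = 360 * 180  # one cell per degree of longitude and latitude = 64,800

# Approximate km per degree of latitude (and per degree of longitude at the equator).
DEG_KM = 111.32

def cell_width_km(latitude_deg: float) -> float:
    """Approximate east-west width of a one-degree grid cell at a given latitude."""
    return DEG_KM * math.cos(math.radians(latitude_deg))
```

For example, a cell is about 111km wide at the equator, roughly 69km wide at London's latitude (51.5 degrees north) and under 20km wide in the high Arctic.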
The map overlay on the interactive above shows the amount of warming to expect in each grid cell under the Representative Concentration Pathway (RCP) scenarios developed by climate scientists. These four scenarios represent different possible future emission trajectories. They range from the low-warming RCP2.6 scenario, which keeps global warming since the pre-industrial era below 2C, up to the high-warming RCP8.5 scenario, under which global temperatures would likely rise to above 4C.
Methodology and data sources
Temperatures based on land and ocean observations were obtained from the Berkeley Earth Surface Temperature Project’s one-degree latitude by one-degree longitude gridded monthly average temperature fields (note: large file download). These were converted into annual average temperature anomalies relative to a 1951-1980 baseline period.
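The anomaly calculation itself is straightforward: average each year's 12 monthly values, then subtract the 1951-1980 mean for the same grid cell. A sketch of that step (function names are illustrative; Berkeley Earth's actual processing includes further steps, such as handling the seasonal cycle):

```python
def annual_means(monthly: dict[int, list[float]]) -> dict[int, float]:
    """Collapse 12 monthly temperatures per year into one annual mean."""
    return {year: sum(temps) / len(temps) for year, temps in monthly.items()}

def to_anomalies(annual: dict[int, float],
                 base_start: int = 1951, base_end: int = 1980) -> dict[int, float]:
    """Express each annual mean relative to the 1951-1980 baseline average."""
    base = [t for year, t in annual.items() if base_start <= year <= base_end]
    baseline = sum(base) / len(base)
    return {year: t - baseline for year, t in annual.items()}
```

By construction, anomalies over the baseline period average to zero, so a value of +1.5 means a year was 1.5C warmer than that grid cell's 1951-1980 average.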
These temperature estimates use observations from around 30,000 land monitoring stations, as well as thousands of ships, buoys and other monitoring systems over the ocean. Berkeley Earth uses the UK Met Office’s HadSST3 ocean temperature record as the basis for its ocean temperatures.
Observational data is available back to 1850, though for any given location it may not extend that far. Most locations have data from at least 1900; the main exception is Antarctica, where measurements only began in 1950.
Berkeley Earth land data is homogenised – adjusted to correct for station moves, instrument changes, time of observation changes and other disruptions that stations have experienced over the past 150 years. Ocean temperature records are similarly adjusted to account for changes in the way ocean temperatures are measured, from buckets thrown over the side of ships through to engine-room intake valves and automated buoys in modern times.
These adjustments have a relatively small impact on temperatures after 1950, as discussed in the Carbon Brief explainer on temperature adjustments. The overall effect of adjustments is to increase temperatures globally prior to 1950, reducing the amount of long-term warming in the record compared to the raw readings.
Future temperature projections are taken from the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model average surface air temperature for each RCP scenario. CMIP5 features around 38 different climate models, though some of these are variations of the same underlying model with different components included. One run from each model was used in calculating the multi-model average, with the model temperature fields obtained from KNMI Climate Explorer.
These multi-model average values are downscaled – increased in spatial resolution – to a one-degree latitude by one-degree longitude resolution to be comparable to the observations. They are converted into anomalies with respect to a 1951-1980 baseline, then aligned to the observations over the 20-year period from 1999-2018 to show the changes expected from present. Model data is shown between 2000 and 2100 in the sidebar for each grid cell.
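The alignment step amounts to shifting each model series so that its 1999-2018 mean matches the observed 1999-2018 mean for the same grid cell. A minimal sketch, with illustrative names (this shows the general technique, not Carbon Brief's exact code):

```python
def align_to_observations(model: dict[int, float], obs: dict[int, float],
                          start: int = 1999, end: int = 2018) -> dict[int, float]:
    """Shift model anomalies so their 1999-2018 mean matches the observations."""
    years = [y for y in range(start, end + 1) if y in model and y in obs]
    offset = (sum(obs[y] for y in years) - sum(model[y] for y in years)) / len(years)
    return {year: temp + offset for year, temp in model.items()}
```

Aligning over a 20-year mean, rather than a single year, avoids anchoring the projections to an unusually warm or cool individual year.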
Both observational temperature estimates and future projected temperature changes are subject to uncertainty. Observational uncertainties in historical temperature records from Berkeley Earth are shown in the sidebar.
Observational uncertainties can arise from a number of different factors. Incomplete coverage of observations across the Earth’s surface means that sometimes temperature anomalies in a location have to be estimated from nearby land stations or ocean measurements. The Berkeley Earth dataset uses a technique called “kriging” to create globally complete estimates of both temperature and uncertainty from observations at specific locations.
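In essence, kriging estimates the value at an unsampled location as a weighted average of nearby observations, choosing the weights to minimise the expected error given how correlation decays with distance. Berkeley Earth fits its own covariance model to the data; the sketch below uses a simple linear variogram purely to show the mechanics of ordinary kriging:

```python
import numpy as np

def ordinary_kriging(points, values, target, variogram=lambda h: h):
    """Estimate the value at `target` from scattered observations.

    Solves the ordinary-kriging system: weights minimise the expected
    estimation error, subject to summing to one (keeping the estimate unbiased).
    """
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    # Left-hand side: variogram between every pair of observation points,
    # bordered by ones (and a zero corner) for the unbiasedness constraint.
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = variogram(np.linalg.norm(pts[:, None] - pts[None, :], axis=-1))
    K[n, n] = 0.0
    # Right-hand side: variogram between each observation and the target.
    rhs = np.ones(n + 1)
    rhs[:n] = variogram(np.linalg.norm(pts - np.asarray(target, dtype=float), axis=-1))
    weights = np.linalg.solve(K, rhs)[:n]
    return float(weights @ np.asarray(values, dtype=float))
```

One useful property follows directly from the system: at a location where an observation already exists, kriging returns that observation exactly, so the interpolated field honours the station data.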
Future climate model projections also include significant uncertainties, chief among them the sensitivity of the climate to increased CO2. The CMIP5 models featured in the most recent IPCC report put climate sensitivity at between 2.1C and 4.7C per doubling of atmospheric CO2 levels, with an average sensitivity of 3.1C. The multi-model average projections shown in the sidebar only reflect this 3.1C value; users interested in the results of individual models with higher or lower sensitivity will have to use a tool such as KNMI Climate Explorer to view those results.
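Because CO2's radiative forcing scales roughly with the logarithm of its concentration, a given sensitivity translates into warming via a simple relation: warming equals sensitivity multiplied by log2 of the CO2 ratio. A sketch using the CMIP5 average of 3.1C and an assumed pre-industrial level of 280ppm (this is equilibrium warming, ignoring ocean lag and non-CO2 forcings, so it is illustrative only):

```python
import math

def equilibrium_warming(co2_ppm: float, sensitivity_c: float = 3.1,
                        preindustrial_ppm: float = 280.0) -> float:
    """Equilibrium warming for a given CO2 level, assuming `sensitivity_c`
    degrees C per doubling of CO2 (forcing scales with log concentration)."""
    return sensitivity_c * math.log2(co2_ppm / preindustrial_ppm)
```

A doubling to 560ppm gives exactly the sensitivity, 3.1C; swapping in the 2.1C or 4.7C end of the CMIP5 range scales the answer proportionally.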
Individual models also show much more year-to-year variability than the multi-model average shown in the sidebar. Each model has short-term variability driven by factors such as El Niño and La Niña events, which make some years warmer or cooler than others. However, this short-term variability occurs at different times in different models and is largely averaged out in the multi-model average.
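The effect is easy to reproduce with toy data: give several synthetic "model runs" the same long-term trend but out-of-phase year-to-year wiggles, and the average of the runs tracks the trend far more closely than any single run does (everything below is synthetic and illustrative):

```python
import math

YEARS = list(range(2000, 2101))
TREND = 0.02   # degrees C of warming per year
AMP = 0.15     # amplitude of year-to-year variability

def model_run(phase: float) -> list[float]:
    """Toy model run: linear warming trend plus out-of-phase short-term variability."""
    return [TREND * (y - YEARS[0]) + AMP * math.sin(0.9 * y + phase) for y in YEARS]

runs = [model_run(phase) for phase in (0.0, 1.3, 2.6, 3.9, 5.2)]
multi_model_mean = [sum(vals) / len(vals) for vals in zip(*runs)]

def wiggle(series: list[float]) -> float:
    """Standard deviation of a series' departures from the underlying trend."""
    resid = [s - TREND * (y - YEARS[0]) for s, y in zip(series, YEARS)]
    mean = sum(resid) / len(resid)
    return (sum((r - mean) ** 2 for r in resid) / len(resid)) ** 0.5
```

Because the sinusoidal wiggles peak in different years in different runs, they largely cancel in `multi_model_mean`, while each individual run retains its full variability.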