Why did Australia warm more than the rest of the world in 2013?
With the Glasgow climate summit seeing many countries, including Australia, promise to slash their CO2 emissions over the next 10 to 50 years to curtail climate warming, it's not a bad time to investigate what happened to Australian temperatures in 2013.
Let's start with a couple of charts showing the frequency of rounded .0F/C observations and annual temperatures at the 58 long-term weather stations that were operating in 1910 within the BoM's Australian Climate Observations Reference Network (ACORN).
The charts demonstrate temperature shifts coinciding with 1972 metrication and AWS installation mostly in the 1990s, as well as a clear and significant shift since 2013.
That shift is evident if you compare ACORN 2.1 land temperature measurements with the lower troposphere satellite readings for Australia compiled by the University of Alabama in Huntsville (UAH).
ACORN 2.1 and UAH temperatures parted company in 2013.
Furthermore, ACORN 2.1 parted company with 10 global temperature datasets published by UAH, the Hadley Centre, Berkeley University and NASA, including sea surface temperatures (with detailed analysis available on a web page titled Is Australia warming more rapidly than the rest of the world?).
In the first chart comparing ACORN and UAH, you might notice a separation just before 2013. This is confirmed if you look at the data itself during 2012.
In Jan-Jun 2012 there was a 0.24C difference between ACORN 2.1 and the satellites. In Jul-Dec 2012 that difference jumped to 0.89C, and it's more or less stayed there ever since.
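As a rough sketch of that half-year comparison (the monthly anomaly figures below are invented placeholders, not the real ACORN or UAH values), the gap between the two datasets could be computed like this:

```python
# Hypothetical monthly anomalies (degrees C) for a single year;
# the real figures would come from ACORN 2.1 and UAH data.
acorn = [0.5, 0.3, 0.6, 0.4, 0.2, 0.4, 1.1, 1.3, 1.0, 1.2, 1.4, 1.2]
uah   = [0.3, 0.1, 0.4, 0.1, 0.0, 0.2, 0.3, 0.4, 0.1, 0.3, 0.5, 0.3]

def half_year_gap(a, b, months):
    """Mean ACORN-minus-UAH difference over the given month indices."""
    return sum(a[m] - b[m] for m in months) / len(months)

jan_jun = half_year_gap(acorn, uah, range(0, 6))   # months 1-6
jul_dec = half_year_gap(acorn, uah, range(6, 12))  # months 7-12
print(f"Jan-Jun gap: {jan_jun:.2f}C, Jul-Dec gap: {jul_dec:.2f}C")
```

With these placeholder anomalies the gap steps up sharply between the two half-years, which is the pattern the 2012 data shows.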
Okay, so unless we assume that all the other temperature datasets on earth went off the rails in 2012/13, there's cause to be suspicious about the accuracy of Australia's readings under ACORN 2.1 (which are the ones supplied to all the other global temperature indexes including the Intergovernmental Panel on Climate Change).
Maybe we can follow the breadcrumbs by comparing the distribution of daily temperature decimals logged at the 58 long-term ACORN stations in 2013-2019 with the preceding seven years of 2006-2012.
Do you notice that the frequency of .0 maximum recordings declined by 10% and the frequency of .0 minimum recordings declined by 11.5%, but the frequency of all other decimals was more or less unchanged except for a slight increase to soak up the reduction in .0 recordings?
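One way to reproduce that decimal count — sketched here with a handful of made-up readings rather than the actual ACORN daily data — is to tally the trailing digit of each recorded temperature:

```python
from collections import Counter

def decimal_distribution(temps):
    """Percentage share of each decimal digit (.0 to .9) in a list
    of temperatures recorded to one decimal place."""
    digits = [round(t * 10) % 10 for t in temps]
    counts = Counter(digits)
    return {d: counts.get(d, 0) / len(temps) * 100 for d in range(10)}

# Illustrative only: in the real analysis these would be daily maxima
# from the 58 ACORN stations for 2006-2012 versus 2013-2019.
before = [25.0, 26.0, 24.3, 25.0, 27.1, 23.0, 25.5, 24.0]
after  = [25.1, 26.2, 24.3, 25.0, 27.4, 23.6, 25.5, 24.8]

dist_before = decimal_distribution(before)
dist_after = decimal_distribution(after)
print(f".0 share before: {dist_before[0]:.1f}%, after: {dist_after[0]:.1f}%")
```

Applied to the full station records, the same tally would show whether the .0 share fell between the two periods while the other nine decimals held steady.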
There may well have been a near-permanent drought across Australia in 2013-2019 that caused temperatures to soar, but drought has nothing to do with decimal distributions or a reduction of close to 11% in the occurrence of .0 recordings.
Just because it's hotter than 90F, 100F or 110F, that doesn't mean an AWS will for some reason begin logging .0 less often than the other nine decimals.
It suggests a system change of some sort, maybe in the calibration of AWS instrumentation or in how one-second readings are averaged to calculate the maxima and minima that are logged and electronically relayed to the BoM from ACORN stations.
However, there should be caution because .0C frequency has been progressively declining since 1973 - albeit with a BoM-acknowledged AWS error that inflated .0 decimal numbers from 1997 to 2004. You can see the significant bump caused by those errors in the rounded decimal trend lines of the two graphics at the beginning of this post.
The .0C frequency has declined from 22.1% of all decimals in 1973 to 12.3% in 2019. That in itself should raise questions about the decimal distribution influence on temperature trends since 1973.
Was it rainfall?
It's important to note here that although Australia didn't have a permanent drought, there was a reduction in 2013-2019 rainfall, culminating in Australia's driest year on record in 2019.
Average annual rainfall at the 58 long-term ACORN weather stations was 725.3mm in 2006-2012 and 619.5mm in 2013-2019, a 14.6% reduction.
Again, rainfall levels should have nothing to do with decimal distribution but it's worth comparing a few individual years in the two different timeframes.
Average rainfall in 2007 at the 58 stations was 674.7mm and in 2013 it was very similar at 679.4mm. Their average maximum in 2007 was 25.75C and in 2013 it was 26.31C, up 0.56C.
The average minimum in 2007 at the 58 stations was 14.17C and in 2013 it was 14.30C, up 0.13C.
Minima increased because the slight rise in rainfall from 2007 to 2013 brought a bit more overnight cloud cover to trap heat, but maxima soared despite the small increase in rainclouds.
As another comparison, 2006 had an average 550.8mm of rainfall at the 58 stations, which averaged 25.67C for maxima and 13.70C for minima. This can be compared with 2017, when an average 646.2mm of rainfall saw the stations average 26.27C for maxima and 14.20C for minima.
So despite 17.3% more rainfall, average maxima at the stations increased 0.6C and average minima increased 0.5C.
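Using the post's own figures, the 2006 versus 2017 comparison works out as:

```python
# 58-station averages quoted above for 2006 and 2017.
rain_2006, rain_2017 = 550.8, 646.2   # mm
max_2006, max_2017 = 25.67, 26.27     # degrees C
min_2006, min_2017 = 13.70, 14.20     # degrees C

rain_change_pct = (rain_2017 - rain_2006) / rain_2006 * 100
print(f"Rainfall change: +{rain_change_pct:.1f}%")        # +17.3%
print(f"Maxima change:  +{max_2017 - max_2006:.2f}C")     # +0.60C
print(f"Minima change:  +{min_2017 - min_2006:.2f}C")     # +0.50C
```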
The 14.6% rainfall decline at these stations between 2006-12 and 2013-19 might explain the average 0.66C maximum warming, but why have minima also increased (by 0.38C) when there should have been more clear night skies without rainclouds trapping heat?
Diurnal temperature range
Indications of an equipment or maximum/minimum systemic averaging change at automatic weather stations since 2013 are also visible in the diurnal temperature range (DTR, the difference between average maximum and average minimum) of the 104 non-urban ACORN stations used to calculate Australian average temperatures.
The chart above, using unadjusted RAW max/min temperatures, shows DTR breakpoints from 1940 onwards, including 1996, coinciding with the advent of AWS, and 2012, coinciding with the July 2012 increase in Australian mean temperatures compared to UAH.
The chart below uses the BoM's DTR calculations based on adjusted anomalies (see time series data at BoM site), with significant historic differences to unadjusted RAW min/max but a similar 0.52C increase comparing 1996-2011 with 2012-2019.
The chart and table below compare DTR based on unadjusted RAW min/max with the average annual number of rainfall days at the 104 ACORN stations from 1979 to 2019, with rainfall data inverted to enable easier comparison.
The chart and table above suggest a 0.20C DTR increase from 1979-1995 to 1996-2011 with increased rainfall, and a 0.47C DTR increase from 1996-2011 to 2012-2019 with decreased rainfall.
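The DTR arithmetic itself is simple; as a sketch with hypothetical period averages (not the actual ACORN figures):

```python
def diurnal_temperature_range(avg_max, avg_min):
    """DTR: the difference between average maximum and average minimum."""
    return avg_max - avg_min

# Hypothetical (max, min) period averages in degrees C across
# the 104 non-urban ACORN stations.
periods = {
    "1996-2011": (25.9, 13.8),
    "2012-2019": (26.6, 14.0),
}
dtr = {p: diurnal_temperature_range(mx, mn) for p, (mx, mn) in periods.items()}
shift = dtr["2012-2019"] - dtr["1996-2011"]
print(f"DTR shift between periods: {shift:.2f}C")
```

A step change in DTR means maxima and minima moved by different amounts between the periods, which is what the breakpoints above indicate.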
All of this can be related to a letter sent by climate researcher Jennifer Marohasy to Australia's Chief Scientist Alan Finkel in 2018 in which she wrote:
The original IT system for averaging the one-second readings from the electronic probes was put in place by Almos Pty Ltd, who had done similar work for the Indian, Kuwaiti, Swiss and other meteorological offices. The software in the Almos setup (running on the computer within the on-site shelter) computed the one-minute average (together with other statistics). This data was then sent to what was known as a MetConsole (the computer server software), which then displayed the data, and further processed the data into ‘Synop’, ‘Metar’, ‘Climat’ formats. This system was compliant with World Meteorological Organisation (WMO) and the International Civil Aviation Organisation (ICAO) standards. The maximum daily temperature for each location was recorded as the highest one-minute average for that day.
This was the situation until at least 2011 – I have this on good advice from a previous Bureau employee. It is likely to have been the situation through until perhaps February 2013 when —— from the Bureau wrote to a colleague of mine, ——, explaining that the one-second readings from the automatic weather station at Sydney Botanical Gardens were numerically-averaged. At some point over the last five years, however, this system has been disbanded. All, or most, of the automatic weather stations, now stream data from the electronic probes directly to the Bureau’s own software. This could be an acceptable situation, except that the Bureau no-longer averages the one-second readings over a one-minute period.
Indeed, it could be concluded that the current system is likely to generate new record hot days for the same weather – because of the increased sensitivity of the measuring equipment and the absence of any averaging/smoothing. To be clear, the highest one-second spot reading is now recorded as the maximum temperature for that day at the 563 automatic weather stations across Australia that are measuring surface air temperatures.
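To illustrate the effect the letter describes — this is a toy simulation with made-up noise levels, not BoM data — compare the one-minute average of a probe's readings against the highest one-second spot reading over the same minute:

```python
import random

random.seed(1)  # reproducible toy example

# 60 one-second probe readings over one minute: a steady 30.0C
# air temperature plus short-lived sensor/turbulence noise.
readings = [30.0 + random.gauss(0, 0.2) for _ in range(60)]

# Older method (per the letter): average the one-second readings
# over the minute; the day's maximum is the highest such average.
one_minute_avg = sum(readings) / len(readings)

# Newer method (as claimed): record the highest one-second reading.
spot_max = max(readings)

print(f"one-minute average:       {one_minute_avg:.2f}C")
print(f"highest one-second value: {spot_max:.2f}C")
print(f"difference:               {spot_max - one_minute_avg:.2f}C")
```

Under these assumptions the spot maximum will typically sit a few tenths of a degree above the one-minute average for identical weather, which is the direction of bias the letter argues for.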
The comparison of ACORN 2.1 with UAH and other international datasets, as well as the decimal distributions from 2006-2012 to 2013-2019, suggests that the timing of the change described as "until at least 2011" was probably from July 2012.
Suspicious step changes
Having shone a spotlight on the instrument and observer influence of 1972 metrication, AWS installations in the 1990s and something that happened in 2013, consider the Australian temperature step shifts in the two charts at the top of this post.
Climate change and temperature trends, with or without CO2 increases, should be gradual; they don't just suddenly happen like that. The immediacy of the shifts suggests changes to equipment, observations, calibrations or AWS averaging systems, not the climate.
Australia committed to 2050 net zero emissions at Glasgow because our temperatures are believed to have soared since 1910, so the next and final article will be the most comprehensive analysis yet of average temperatures and rainfall since the 1800s.
Here's a preview … the BoM claims Australia's mean adjusted temperature at 104 non-urban ACORN stations has increased by 1.44C since 1910, but historic datasets suggest the mean unadjusted temperature at 225 weather stations has increased by 0.6C since before 1950.
Note : Annual Fahrenheit and Celsius decimal counts for minima and maxima from 1910 to 2019 at the 58 long-term ACORN stations are contained in an Excel file that can be downloaded here.
Note : The accuracy of pre-metric .0F decimal estimations in this post is validated through comparison with a 2001 PhD thesis titled Extreme Temperature Events in Australia by the BoM's Blair Trewin, in which he calculates the .0F proportion of all observations at 94 ACORN weather stations from 1957 to 1971, presumably using original digitised Fahrenheit observations held by the bureau. A study extract of his calculations can be viewed here, showing that 51.5% of the observations were .0F. The decimal calculation formulas used in this post can be compared here, averaging .0F proportions at the 94 stations from 1957 to 1971 at 51.3%, against 51.5% in the Trewin thesis.