It is “abundantly clear” that the Met Office cannot scientifically claim to know the current average temperature of the U.K. to a hundredth of a degree centigrade, given that it is using data with an average margin of error of 2.5°C, notes the climate journalist Paul Homewood. His comments follow recent disclosures in the Daily Sceptic that nearly eight out of ten of the Met Office’s 380 measuring stations come with official ‘uncertainties’ of between 2°C and 5°C. In addition, given the poor siting of the stations now, and possibly in the past, the Met Office has no means of knowing whether it is comparing like with like when it publishes temperature trends going back to 1884.
There are five classes of measuring station identified by the World Meteorological Organisation (WMO). Classes 4 and 5 come with uncertainties of 2°C and 5°C respectively and account for an astonishing 77% of the Met Office station total. Class 3 has an uncertainty rating of 1°C and accounts for another 8.4% of the total. The class ratings identify potential corruption of recordings by both human and natural influences, such as nearby buildings, paved surfaces and vegetation. Homewood calculates that the average uncertainty across the entire database is 2.5°C. In the graph below, he then plots the range of annual U.K. temperatures going back to 2010, incorporating these margins of error.
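As a rough illustration of how such a class-weighted average is reached, here is a minimal sketch in Python. Only the combined 77% share for Classes 4 and 5 and the 8.4% Class 3 share are given above, so the individual class shares below are assumed for illustration, not taken from the Met Office.

```python
# Sketch: class-weighted average uncertainty across the station network.
# The Class 4/5 split and the Class 1/2 remainder are assumed for this
# illustration; only the 77% combined total and 8.4% Class 3 share are
# given in the article.
stations = [
    # (WMO siting class, assumed share of stations, uncertainty in °C)
    (5, 0.290, 5.0),  # assumed share
    (4, 0.480, 2.0),  # assumed share; Classes 4 + 5 = 77% per the article
    (3, 0.084, 1.0),  # share given above
    (2, 0.073, 0.0),  # assumed remainder; Classes 1-2 carry no extra rating
    (1, 0.073, 0.0),  # assumed remainder
]

average_uncertainty = sum(share * err for _, share, err in stations)
print(f"Class-weighted average uncertainty: {average_uncertainty:.1f}°C")
# With these assumed shares the figure lands at Homewood's 2.5°C.
```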
The blue blocks show the annual temperature announced by the Met Office, while the red bars take account of the WMO uncertainties. It is highly unlikely that the red bars show the more accurate temperature, and there is much evidence to suggest temperatures are nearer the blue trend. But the point of the exercise is to note that the Met Office, in the interests of scientific exactitude, should disclose what could be large measurement inaccuracies. This is particularly important when it is making highly politicised statements using rising temperatures to promote the Net Zero fantasy. As Homewood observes, the Met Office “cannot say with any degree of scientific certainty that the last two years were the warmest on record, nor quantify how much, if any, the climate has warmed since 1884”.
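The arithmetic behind those red bars is simple, and a short sketch makes the point. The annual means below are hypothetical stand-ins, not the actual Met Office series:

```python
# Hypothetical annual means (°C); the real values come from the Met Office
# series plotted above. The ±2.5°C band is the class-weighted average
# uncertainty discussed earlier.
annual_means = {2010: 8.0, 2016: 9.3, 2022: 10.0}
UNCERTAINTY = 2.5  # °C

for year, t in sorted(annual_means.items()):
    low, high = t - UNCERTAINTY, t + UNCERTAINTY
    print(f"{year}: {t:.1f}°C, plausible range {low:.1f}-{high:.1f}°C")

# A warming trend of a few tenths of a degree per decade sits comfortably
# inside a band that is 5°C wide.
```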
The U.K. figures are of course an important component of the Met Office’s global temperature dataset known as HadCRUT. As we noted recently, there is ongoing concern about the accuracy of HadCRUT, with large retrospective adjustments adding warming in recent times and cooling further back in the record. In fact, this concern has been building for some time. The late Christopher Booker was a great champion of climate scepticism, and in February 2015 he suggested that the “fiddling” with temperature data “is the biggest science scandal ever”. Writing in the Telegraph, he noted: “When future generations look back on the global warming scare of the past 30 years, nothing will shock them more than the extent to which official temperature records – on which the entire panic rested – were systematically ‘adjusted’ to show the Earth as having warmed more than the actual data justified.”
Since that time, the Met Office has made further adjustments to its HadCRUT database and the effect can be seen in the graph below showing all the retrospective changes made since 2008.
HadCRUT is not the only global database to add warming and cooling in ways that conveniently emphasise the ‘hockey stick’ nature of recent temperature trends. The excellent climate4you site provides the graph above as well as the illustration below showing what is going on at NASA’s GISS database.
Even more than with HadCRUT, large amounts of cooling have been added in the first 100 years of the record. The net effect of the adjustments made since 2008 is to generate a more smoothly increasing global temperature record since 1880. The compiler, Emeritus Professor Ole Humlum, concludes that “a temperature record that keeps on changing the past hardly can qualify as being correct”. Booker was more direct in his criticism, charging that the “wholesale manipulation” of the official temperature record “has become the real elephant in the room of the greatest and most costly scare the world has known”.
The Met Office does a good job in its core business of providing weather forecasting services. Helped by modern satellites and computers, it provides vital and improving information for the general public and for specific groups such as aviators, farmers, event organisers and the military. But its self-ordained Net Zero political role does it few favours. Using data accurate enough to plan the sowing of wheat to warn of a climate crisis measured down to one hundredth of a degree centigrade is ridiculous, as our and other recent investigations have shown.
It recently proposed to ditch the scientific method of calculating temperature trends over at least 30 years in favour of just a decade of past data merged with 10-year computer model projections of the future. The reason for this was to enable it to point quickly to the passing of the 1.5°C target. Pure politics lies behind this move, since 1.5°C of warming from the pre-industrial age is a political fear-mongering mark used to focus global efforts behind the Net Zero collectivisation. Professor Richard Betts, the Head of Climate Impacts at the Met Office, admitted as much when he said breaching 1.5°C would “trigger questions” about what needed to be done to meet the goals of the 2015 Paris climate agreement. Professor Betts later took exception to our coverage of his novel idea, which was published three weeks after it was announced. “Are they just slow readers?” he asked. “I suppose our paper does use big words like ‘temperature’ so maybe they had to get grown-ups to help,” he added.
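Setting the sniping aside, the proposed metric itself is simple enough to sketch. A minimal illustration follows, assuming placeholder values throughout rather than actual Met Office observations or projections:

```python
# Sketch of the proposed "current warming level" indicator as described:
# the mean of the last 10 observed years and the next 10 projected years,
# in °C above the pre-industrial baseline. All values are placeholders.
observed_last10 = [1.1, 1.2, 1.1, 1.3, 1.2, 1.3, 1.4, 1.3, 1.5, 1.4]
projected_next10 = [1.4, 1.5, 1.5, 1.6, 1.5, 1.6, 1.7, 1.6, 1.7, 1.8]

blended = (sum(observed_last10) + sum(projected_next10)) / 20
print(f"Blended warming level: {blended:.2f}°C above pre-industrial")

# The conventional yardstick, by contrast, is a trailing 30-year average
# of observations only, which responds far more slowly to one warm decade.
```

Note that half of the inputs to the blended figure are model output rather than measurements, which is precisely the objection.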
In reply to Professor Betts, it is accepted that the Met Office knows what an air temperature is. It just ought to get better at taking accurate readings of it.
Chris Morrison is the Daily Sceptic’s Environment Editor