“FREEZING TROPICAL ISLANDS AND BOILING TOWNS”
The Audit could scarcely have come at a more embarrassing time for the IPCC, which has just released its 2018 Summary for Policymakers claiming that the global warming crisis is more urgent than ever. McLean’s audit strongly suggests that these claims are based on data that simply cannot be trusted. HadCRUT4 is the primary dataset used by the Intergovernmental Panel on Climate Change (IPCC) to make its dramatic claims about “man-made global warming”, to justify its demands for trillions of dollars to be spent on “combating climate change”, and as the basis for the Paris Climate Accord.
This report makes more than 70 findings about areas of concern with the HadCRUT4 temperature dataset. These cover the entire process, from the measurement of temperatures to the derivation of the HadCRUT4 global average temperature anomalies. They relate to the inclusion of data that is obviously in error, inappropriate procedures, poor data processing, and significant assumptions about a range of matters, including conclusions based on very little data. Most of the findings increase the uncertainty in the data and therefore widen the error margin. One finding, however, shows that a common but flawed method of data adjustment creates a false warming trend from the adjustments alone. Another points out that when stations were closed rather than relocated, any distortion in the data remains in the record. Errors are also identified in sea surface temperatures, including some created by a member of the team responsible for that data.
First ever audit of global temperature data finds freezing tropical islands, boiling towns, boats on land, by Joanne Nova.
The fate of the planet is at stake, but the key temperature data set used by climate models contains more than 70 different sorts of problems. Trillions of dollars have been spent because of predictions based on this data — yet even the most baby-basic quality control checks have not been done.
There are cases of tropical islands recording a monthly average of zero degrees — this is the mean of the daily highs and lows for the month. One site in Colombia recorded three months of over 80 degrees C. That is so incredibly hot that even the minimums there were probably hotter than the hottest day on Earth. In some cases boats on dry land seemingly recorded ocean temperatures from as far as 100 km inland. A spot in Romania spent one whole month averaging minus 45 degrees. The only explanation that could make sense is that Fahrenheit temperatures were mistaken for Celsius, and for the next seventy years at the CRU no one noticed.
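The absurd values described above are exactly what an elementary range check would catch. As a minimal sketch of such a check (the thresholds and record format here are illustrative assumptions, not the actual HadCRUT4 station file layout or the auditor's code):

```python
# Minimal sanity checks of the kind the audit says are missing.
# Thresholds and the latitude cutoff are illustrative assumptions.

def flag_absurd_monthly_mean(temp_c, latitude):
    """Return a list of quality-control flags for one monthly mean (deg C)."""
    flags = []
    # No location on Earth has a monthly mean anywhere near these extremes.
    if temp_c < -90 or temp_c > 60:
        flags.append("outside physically plausible range")
    # A tropical site (within ~23.5 deg of the equator) averaging 0 C
    # or below almost certainly indicates a data error.
    if abs(latitude) < 23.5 and temp_c <= 0:
        flags.append("freezing mean at tropical latitude")
    # A value like 80 "C" may really be Fahrenheit: converting it back
    # yields a plausible reading, a strong hint of a unit mix-up.
    if temp_c > 60:
        as_celsius = (temp_c - 32) * 5 / 9
        if -90 <= as_celsius <= 60:
            flags.append(f"possible Fahrenheit value (= {as_celsius:.1f} C)")
    return flags

# Examples matching the article's reported errors (latitudes illustrative):
print(flag_absurd_monthly_mean(0.0, 4.2))    # tropical island averaging 0 C
print(flag_absurd_monthly_mean(80.0, 6.3))   # Colombian site over 80 "C"
```

A check this simple, run once over the whole dataset, would flag every example quoted above; the audit's point is that nothing of the sort appears to have been done.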
Dr McLean audited the HadCRUT4 global data from 1850 onwards for his PhD thesis, and then continued the work afterwards until it was complete:
“I was aghast to find that nothing was done to remove absurd values… the whole approach to the dataset’s creation is careless and amateur, about the standard of a first-year university student.”
His supervisor was Peter Ridd, famously sacked for saying that “the science was not being checked, tested or replicated” and for suggesting we might not be able to trust our institutions …
McLean’s findings show there is almost no quality control on this crucial data. The Hadley Met Centre team have not even analyzed this data with a tool as serious as a spell checker. Countries include “Venezuala”, “Hawaai”, and the “Republic of K” (also known as South Korea). One country is “Unknown”, while other entries are not even countries – like “Alaska”. …
Adjustments Make the Past Cooler so the Warming Trend Appears Stronger:
In probably the worst systematic error, the past is rewritten in an attempt to correct for site moves. While some corrections are necessary, these adjustments are brutally sweeping. Thermometers do need to move, but corrections don’t have to treat old sites as if they were always surrounded by concrete and bricks.
New sites are usually placed in good open locations. As the site “ages”, buildings and roads appear nearby, and sometimes air conditioners, all artificially warming the site. So a replacement thermometer is installed in an open location nearby. Usually each national meteorology centre compares both sites for a while and works out the temperature difference between them. Then it adjusts the readings from the old location down to match the new one. The problem is that the algorithms also slice right back through the decades, cooling all the older original readings — even readings that were probably taken when the site was just a paddock. In this way the historic past is rewritten to be colder than it really was, making recent warming look faster than it really was. Thousands of men and women trudged through snow, rain and mud to take temperatures that a computer “corrected” a century later. …
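The adjustment procedure described above can be sketched in a few lines. This is a deliberately simplified illustration of a step-change correction applied backward through a whole record — not the Met Office's actual algorithm, and the offset and years are invented for the example:

```python
# Simplified illustration (not the actual HadCRUT4 processing) of how a
# site-move adjustment, applied to every reading before the move, cools
# the entire early record, including years before any urban warming
# existed at the old site.

def adjust_for_site_move(series, move_year, offset):
    """Subtract `offset` (deg C) from every reading before `move_year`.

    `series` maps year -> annual mean temperature at the station.
    `offset` is the old-minus-new difference measured during the
    side-by-side comparison period.
    """
    return {year: (temp - offset if year < move_year else temp)
            for year, temp in series.items()}

# A flat 15.0 C record. Suppose the old, built-up site read 0.5 C warmer
# than the new open site during the overlap: that 0.5 C is subtracted all
# the way back to 1900, even though the 1900 site was an open paddock.
record = {year: 15.0 for year in range(1900, 2021)}
adjusted = adjust_for_site_move(record, move_year=1990, offset=0.5)

print(adjusted[1900])  # 14.5 - the past is rewritten colder
print(adjusted[2020])  # 15.0 - the present is untouched
```

The design choice under criticism is visible in the condition `year < move_year`: the correction is uniform over all earlier years, rather than tapering off as the urban influence shrinks toward the station's early, undeveloped history.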
The first audit. Seriously, at this late stage?
As far as we can tell this key data has never been audited before. (What kind of audit would leave in these blatant errors?) Company finances get audited regularly but when global projections and billions of dollars are on the table climate scientists don’t care whether the data has undergone basic quality-control checks, or is consistent or even makes sense. …
UPDATE: Climate Bombshell: Global Warming Scare Is Based on ‘Careless and Amateur’ Data, Finds Audit, by James Delingpole.
Joanne Nova and I helped John McLean set up Robert Boyle Publishing so we could publicize this important work. The major political problem with the whole carbon imbroglio has been a lack of due diligence: no audits. Well, finally someone has audited the main temperature dataset — and found it laughably poor and systematically biased. Oh dear.
The audit found the temperature data set to be so riddled with errors that it is effectively useless.
But according to a groundbreaking analysis by Australian researcher John McLean it’s far too sloppy to be taken seriously even by climate scientists, let alone a body as influential as the IPCC or by the governments of the world. …
As McLean says:
“Governments have had 25 years to check the data on which they’ve been spending billions of dollars. And they haven’t done so once.”
McLean is the Australian IT analyst who broke another scandal about the global warming scare: that it was effectively the creation of just 53 people. (Climategate)
The above link is to the IPCC report on limiting global warming to 1.5 °C above pre-industrial levels.
Stanford University physicist Dr. Robert B. Laughlin, who won the Nobel Prize in Physics in 1998 and was formerly a research scientist at Lawrence Livermore National Laboratory:
“Please remain calm: The Earth will heal itself — climate is beyond our power to control… Earth doesn’t care about governments or their legislation. You can’t find much actual global warming in present-day weather observations. Climate change is a matter of geologic time, something that the Earth routinely does on its own without asking anyone’s permission or explaining itself.”
SEA LEVEL RISE?
Item from Tony Heller at https://realclimatescience.com/
Saudi Arabia $10 billion investment in Maldives