From our friends at climateprediction.net, climate disaster has struck:
I regret to announce that we’ve recently discovered a major error in one of the files used by the climate model. The file in question specifies levels of man-made sulphate emissions, but due to a problem with the file specification, models have been inputting greatly reduced levels throughout their runtime. The consequence of this is that aerosols responsible for "global dimming" (cooling) are not present in sufficient amounts, and models have tended to warm up too quickly. The file specification error is also responsible for causing models to crash in 2013, which is how we originally came across the problem.
Unfortunately, all the data returned to us so far has been affected by this problem. While the data is scientifically very useful, and will certainly form the basis of future research (it allows us to investigate the full effect of greenhouse gas emissions without global dimming), it doesn’t enable us to compare the models’ performance against real-world observations of the 20th century, since such an important component is missing. In order to do the experiment we intended, we unfortunately have no choice but to start the models again from the beginning.
Now I know what you’re thinking – how could they possibly not have put some limit in the software to catch the Earth heating up too much? But these are the same people who think that the Earth heating up by 11°C in 40 years is a reasonable result. I’d love to know what’s "scientifically useful" about bad data from a known unphysical model.
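For the sake of argument, here is a minimal sketch of what such a limit might look like: a range check on the emissions forcing file plus a drift check on the model output. Every name, unit, and threshold below is an assumption made up for illustration; none of it is taken from the real climateprediction.net or Met Office model code.

```python
# Hypothetical sanity checks of the kind the paragraph above is asking for.
# Names, units, and thresholds are illustrative assumptions only.

PLAUSIBLE_SULPHATE_RANGE = (0.0, 200.0)  # assumed bounds, e.g. Tg SO2 per year
MAX_CHANGE_PER_DECADE = 1.5              # assumed plausibility limit, degrees C


def check_forcing_file(emissions_by_year):
    """Reject an emissions input file whose values fall outside expected bounds."""
    lo, hi = PLAUSIBLE_SULPHATE_RANGE
    for year, value in sorted(emissions_by_year.items()):
        if not lo <= value <= hi:
            raise ValueError(
                f"Sulphate emissions for {year} are {value}, "
                f"outside the plausible range {lo}..{hi}"
            )


def check_drift(monthly_global_mean_temps):
    """Halt a run whose global-mean warming trend is obviously unphysical."""
    if len(monthly_global_mean_temps) < 120:  # need a full decade of monthly means
        return
    decade_change = monthly_global_mean_temps[-1] - monthly_global_mean_temps[-120]
    if abs(decade_change) > MAX_CHANGE_PER_DECADE:
        raise RuntimeError(
            f"Global-mean temperature changed by {decade_change:+.2f} C in a decade; "
            "stopping the run for inspection"
        )
```

A check like the first one would have flagged the under-specified sulphate file before any runs started; the second would have stopped a runaway run long before 2013.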
So all of the time and energy that people donated to running these models has been utterly wasted. We could have told them that.
In the words of The Inquirer:
With around 200,000 PCs running the experiment non-stop for two months, it looks very much as if the BBC experiment is making more of a contribution to global warming than scientific knowledge.