In the last few days, NASA has been forced to withdraw erroneous October temperature data. The NASA GISS site is down, but NASA spokesman Gavin Schmidt said at their blog outlet that “The processing algorithm worked fine.”
Schmidt blamed the failure on defects in a product from a NASA supplier and expressed irritation that NASA should bear any responsibility for defects attributable to a supplier:
I’m finding this continued tone of mock outrage a little tiresome. The errors are in the file ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2/v2.mean.Z, not in the GISTEMP code (and by the way, the GISTEMP effort has nothing to do with me personally). The processing algorithm worked fine.
Although NASA blamed the error on their supplier (GHCN), in previous publications by Hansen et al, NASA had asserted that their supplier carried out “extensive quality control”:
The GHCN data have undergone extensive quality control, as described by Peterson et al. [1998c].
and that NASA (GISS) carried out their own quality control and verification of near real-time data:
Our analysis programs that ingest GHCN data include data quality checks that were developed for our earlier analysis of MCDW data. Retention of our own quality control checks is useful to guard against inadvertent errors in data transfer and processing, verification of any added near-real-time data, and testing of that portion of the GHCN data (specifically the United States Historical Climatology Network data) that was not screened by Peterson et al. [1998c].
Schmidt said that no one at NASA was even employed on a full-time basis to carry out quality control for the widely used GISS temperature estimates:
Current staffing from the GISTEMP analysis is about 0.25 FTE on an annualised basis (i’d estimate – it is not a specifically funded GISS activity).
Schmidt said that independent quality control would require a budget increase of about $500,000. NASA supporters called on critics to send personal checks to NASA to help them improve their quality.
At Verhojansk station, which I selected at random from the problem Russian stations, the average October 2008 temperature was reported by NASA as 0.0 degrees. This was nearly 8 deg C higher than the previous October record (-7.9 deg). Contrary to the NASA spokesman’s claims, their quality control algorithm did not work “fine”.
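By way of illustration, even a crude quality-control check would have caught this: flag any incoming monthly value that beats the station’s historical record for that calendar month by a wide margin. Here is a minimal sketch of that idea; the function name, the margin, and the Verhojansk history shown are hypothetical illustrations, not GISS’s actual algorithm or data.

```python
def flag_record_breaker(history, new_value, margin=2.0):
    """Flag a new monthly value (deg C) that exceeds the station's
    historical record for that calendar month by more than `margin`.
    `history` is a list of past values for the same calendar month."""
    record_high = max(history)
    record_low = min(history)
    # A value far outside the observed range is suspicious and
    # should be held for manual review rather than published.
    return new_value > record_high + margin or new_value < record_low - margin

# Hypothetical Verhojansk October history (deg C); -7.9 stands in
# for the previous October record high mentioned above.
october_history = [-14.2, -11.5, -7.9, -10.3, -12.8]
print(flag_record_breaker(october_history, 0.0))   # 0.0 beats -7.9 by ~8 deg
print(flag_record_breaker(october_history, -9.0))  # within historical range
```

A check this simple would have stopped the Verhojansk value at the door; the point is not the particular threshold but that some automated sanity screen runs before publication.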
What is more worrying is that no one seems to be minding the store. Schmidt says that the entire effort only takes about 1/4 of a man-year annually. (They are pretty busy at conferences, I guess.) CA readers know that the GISTEMP program is a complete mess and needs to be re-written from scratch. Schmidt seems not to even want to bother doing the work at NASA, saying that he’d prefer to hire ice sheet modelers and cloud parameterizers. He called on NOAA to do the job properly:
Those jobs are better done at NOAA who have a specific mandate from Congress to do these things.
On this point, I agree with NASA spokesman Schmidt. If NASA is not going to do the job properly, then it shouldn’t do the job at all. NASA should not be depending on the kindness of strangers for its quality control. Ross McKitrick has long observed that the collection of temperature data is a job rather like compiling a Consumer Price Index and should be done by professionals of the same sort. It doesn’t make any sense for people like James Hansen and Phil Jones to be trying to do this on a part-time basis. As long as it’s being done on such a haphazard basis, there’s really no way to prevent incidents like this one (or last year’s “Y2K” problem).