The US CRN (Climate Reference Network) appears to be a generally well-designed network for measuring 21st century temperatures. Its mission statement includes an undertaking to make its results available online. Here, as with GISS and the metadata, the system designers have provided webpages in a format that may be interesting for very casual users, but a nightmare for serious data processing. People interested in monthly data are apparently expected to cut and paste innumerable monthly data sets.
I’ve written a short read function to scrape monthly CRN results by individual station at http://data.climateaudit.org/scripts/station/read.uscrn.txt . This uses the style of Nicholas’ function to scrape information from GISS.
You can look up station IDs either at http://mi3.ncdc.noaa.gov/mi3report/MISC/CRN-STATIONS.TXT (outdated but ASCII-readable) or http://www.ncdc.noaa.gov/app/isis/stationlist?networkid=1 (up-to-date, but inconvenient for searching; I’ve scraped a more usable form of this as well).
Because of the crummy CRN interface, you have to download every month individually to extract the monthly data, so retrieving even the tiny amount of data of interest ends up taking a fairly long time. (I’ve written to Karl suggesting that they provide a sensible ASCII-based archive.)
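To see why this is slow, here is a hedged sketch of the month-by-month work that read.uscrn has to do. The URL template below is hypothetical — the real pattern is inside read.uscrn — but the point is that one request is needed per station-month:

```r
# Sketch only: one page request per station-month.
# The query-string format here is hypothetical, for illustration.
months <- seq(as.Date("2001-01-01"), as.Date("2007-12-01"), by = "month")
urls <- sprintf("http://www.ncdc.noaa.gov/crn/monthly?station=%d&yyyymm=%s",
                1039, format(months, "%Y%m"))
length(urls)  # 84 separate requests for 2001-2007 alone
```

Each of those pages then has to be scraped for a handful of numbers — hence the suggestion of a plain ASCII archive.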
Here’s an example comparing the Goodridge MN CRN site and the Detroit Lakes MN USHCN site (also showing a download of GHCN daily data for Detroit Lakes for comparison), yielding the figure below:
goodridge=read.uscrn(id=1039) #Goodridge MN
detroit=read.ghcnd(usid=212142) #Detroit Lakes
plot( c(time(combine)),combine[,6]-combine[,10],ylim=c(0,3.75),xlim=c(2002,2008),type="l",ylab="Deg C",xlab="")
legend(2007.2,-10,fill=1:2,legend=c("Goodridge CRN","Detroit L USHCN"),cex=.7)
title(main="Comparing CRN and USHCN in Minnesota")
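The snippet above assumes a combine object that has already been built from the two station series. A minimal sketch of one way to construct it, using ts.union to align the series on a common monthly time axis — the column positions (6 and 10) depend on the layout that read.uscrn and read.ghcnd actually return, so synthetic two-column series are used here purely for illustration:

```r
# Sketch: align two multi-column monthly station series on one time axis.
# The data are synthetic; in the real example a and b would be the
# goodridge and detroit objects returned by the read functions.
set.seed(1)
a <- ts(matrix(rnorm(24), ncol = 2), start = c(2002, 1), frequency = 12)
b <- ts(matrix(rnorm(24), ncol = 2), start = c(2002, 1), frequency = 12)
combine <- ts.union(a, b)  # columns of a, then columns of b, NA-padded
```

With real data, ts.union pads non-overlapping months with NA, so differences like combine[,6]-combine[,10] are only defined where both stations report.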
BTW the NOAA robots.txt file – not that I agree that this is relevant to acquiring temperature data from a site that is supposed to provide temperature data – does not disallow access to the crn/ directory and subdirectories.