The UK Met Office has made a proposal for collation of station data, reported here by Fox News, proposal here. Many, if not most, aspects of the proposal are obviously ones that have been advocated here for a long time. Indeed, last summer, well before the Climategate Letters became public, I’d even suggested that the Met Office take over responsibility for this data set from CRU.
76 Comments
I am excited by the promise of a “common trove of global temperature data that is open to public scrutiny and ‘rigorous’ peer review.”
One hopes.
Will the data be raw or ‘adjusted’? Ideally both will be published, together with the actual programs used to make the adjustments.
That’s a great question and one that I want to know the answer to.
And I really hope they have the original unadjusted data available.
Looks like re-arranging the deckchairs on the Titanic to me – it’s still GIGO…
Step in the right direction I guess, but it seems to be another piece in the choreographed “new dance of the 21st Century Climate Science”. Oh how times have changed etc etc. The Exec. Summary reads like a PR exercise, all about robust this, that and the other.
There isn’t any mention of what they are doing about actual stations, yet they are talking of finer-than-daily resolution (I presume that is what they mean by “sub-daily”?). Isn’t that going to rule out a chunk of the existing network?
From MR workplan:
“In making its analysis and conclusions, the Team will test the relevant work against pertinent standards at the time it was done, recognizing that such standards will have changed.”
Fits and starts are still a generic step upwards from stonewalling. So yay.
This seems a very good document from the Met Office, and the concept, interestingly suggested a while back by Steve, is very encouraging.
It does seem, in the current climate, so to speak, to put strong emphasis on unhindered accessibility of the data and, importantly I think, on the homogenization routines. Hopefully a more organized and available record than Professor Jones’s might be established!
Could there be a potential for subversion by Government of such a highly centralized and effectively government-bought scheme? I do see obvious advantages in the centralization, but I also see risks. Maybe just paranoia!
I would suggest that the subject is now too high profile for the black arts to be successfully practiced. For a start, deviations between data from the Met Office scheme and global satellite readings which are publicly available would have to be explained in detail. I also suggest that governments will be far more involved with standard setting and evaluation of data, apropos the Pharma Industry after Thalidomide and sundry other debacles.
I read this story at Bishop’s place and I will add the same comment here: the proposed network will obviously require a large amount of funding and therefore to my mind is just another pitch for even more of my taxes. Obviously I would be interested to hear why they think this is needed, given the absolute certainty with which they’ve promoted and sold their existing data.
If this is really re-done from the raw data and not from ‘value added’ data then I think it is well worth the expense.
But what’s the point of it, apart from adding further carriages to the already oversized AGW gravy train? I would be very wary of the same people, the same institutions, with the same political masters, the same agendas and the same opinions, generating another dataset.
I have to say that that was my initial impression. Nothing has yet changed my mind.
One might imagine that the train would be detained at a stop light for a while, funding for progress deferred. What better way to unlock the cash flow once again, with minimal new effort, than suggesting a repeat of what had gone before but with added eye candy?
You make it sound as if it could be organised like an open source software project. Hang on, that’s not a bad idea …
The obvious thing (to me at least) is to house the data and code in an online revision control system with automated QC tests.
This is how Firefox, Linux, etc. are developed. The code is browse-able by anyone. All changes are tracked. Code changes are tested against regression tests.
Of course, if you want to work with the data and code locally, you could simply pull everything to your own machine and work with it there. If you find a bug or improve an algorithm, you would get approval (from the community) and integrate it with the public version.
This works very well in the software world – and it is not expensive.
See for example, the way that Firefox is developed (http://hg.mozilla.org/mozilla-central/pushloghtml) and built (http://tinderbox.mozilla.org/showbuilds.cgi?tree=Firefox).
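To make the idea concrete, here is a hypothetical example of the kind of automated QC check such a repository could run on every submitted change; the thresholds are illustrative, not taken from any official standard:

    # Hypothetical QC check for a version-controlled temperature archive.
    # Thresholds are illustrative, not taken from any official standard.
    def qc_flags(daily_temps_c):
        """Return (index, reason) pairs for suspect daily values."""
        flags = []
        previous = None
        for i, t in enumerate(daily_temps_c):
            if t is None:
                flags.append((i, "missing value"))
            elif not -90.0 <= t <= 60.0:
                # outside the plausible Earth-surface air temperature range
                flags.append((i, "out of physical range"))
            elif previous is not None and abs(t - previous) > 30.0:
                # implausibly large day-to-day jump
                flags.append((i, "step change"))
            if t is not None:
                previous = t
        return flags

    print(qc_flags([12.1, 13.4, None, 13.0, 99.9]))
    # [(2, 'missing value'), (4, 'out of physical range')]

A regression suite would simply assert that known-good stations come back with no flags, so any change that corrupts the archive fails the build.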
Re: ZT (Feb 23 19:15),
From here we can move toward Reproducible Research.
(Tech folk: “make clean; make”)
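In Python terms, the same test might look like the sketch below; the build step is a stand-in for the real processing chain, not anyone’s actual code:

    # Minimal sketch of the "make clean; make" test in Python: rebuild an
    # output from the raw input and verify it is byte-identical to the
    # published version. The build step is a placeholder, not real code.
    import hashlib

    def build(raw_text):
        # stand-in for the full processing chain (parsing, adjustments)
        return "\n".join(sorted(raw_text.splitlines()))

    def reproducible(raw_text, published_text):
        """True if rerunning the pipeline reproduces the published output."""
        rebuilt_hash = hashlib.sha256(build(raw_text).encode()).hexdigest()
        published_hash = hashlib.sha256(published_text.encode()).hexdigest()
        return rebuilt_hash == published_hash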
I am very much in favour of this Open Source approach. However, with Linux and Firefox, there is a common goal, namely to create good code.
With climate science, I am not so sure. Look at Wikipedia, e.g. the page on the MWP.
Sorry to all three that I only read this little section after my latest comment about open source. Five days behind the pace, that’s me! The Reproducible Research link is very helpful thanks, MrPete. I don’t at all think the problems are insurmountable, Andy. Among the things required are
1. A repository that is very easily ‘forked’ to give an alternative point of view (Git has emphasized this within the open source world recently but I’m talking more broadly). Then you’d be free in research to refer to the branch that you prefer – of course giving reasons.
2. Identities of contributors known and discoverable by anyone, for all changes.
The last one isn’t always true of open source projects – especially for someone new to the project (core committers are almost always known by each other).
Well, what else can they do NOW? Their hide-and-seek plan was a dismal failure.
In the absence of official attribution, one can sense the emerging official public explanation that will be the final official statement concerning the official reconstitution of the global temperature record; to wit: ‘This is something we at the Met Office had always wanted to do, and had always intended to do when the resources became available. Now that the government has generously provided the means, we can proceed to the open-format data set that we had always wanted to provide.‘
Expect that no official acknowledgment will ever be forthcoming that none of this would ever have happened except for Steve’s invariably calm, polite, relentlessly scientific, constructively critical, high-integrity, and entirely altruistic effort in examining the record.
Steve’s public effort, by way of Climateaudit, ultimately created the context where an email leak at CRU became an obvious moral good. This in turn caused the Met Office and CRU to open up and play fair. None of this would have taken place without Steve, or someone like him (in an alternative universe).
The Met Office owe him big time, and will never have the courage or integrity to admit it (IMO).
Agreed.
“It is amazing what you can accomplish, if you don’t care who gets the credit.” –Harry S Truman (tho there are a lot of variations of that thought attributed to others)
It was somebody else who said it first, obviously.
Sorry to be a skeptic, but I’m confident that the recipe that “will assure verifiability” will get perverted and suborned somewhere in the process. Trust me – it will.
This “contamination of science” battle is going to have to be won on a daily basis.
Maybe not, or at least not so completely. There’s a critical mass of people (pun intended) that visibly will hold the data compilers to more rigorous standards. But yes, it will be a continuing effort.
I’ll draw your collective attention to the treasure trove of “official” U.S. location/temp data from 1861–1942 at:
http://docs.lib.noaa.gov/rescue/cso/data_rescue_signal_corps_annual_reports.html
Unless the bound hardcopy reports have been purged, the volumes with these reports issued on an annual basis should be in official “depository” libraries. They are usually tucked away in the Government Documents sections. I know they used to be there at the Kent State University library (a designated “Depository” library) when I was chasing weather data back in the 1970s.
It should be possible, given the manpower and the storage space, to fill in “cells” to generate a large geographic urban & rural grid relying exclusively on these reports.
Sadly, GISS has never acknowledged doing anything with this.
I doubt the Met will bother with this mass of professionally reported and recorded information… since they’ll rely on GISS-spawned reports.
Re: R.S.Brown (Feb 23 20:05), That’s great! Do the hardcopy files have more detail than the PDF reports you can pull up from that page? (the PDF reports appear to show annual temps for a variety of locations, but no monthly or daily record.)
Glenra,
Try jumping ahead to the pdf-formatted report for 1977/78. There are a number of monthly mean temps for the 200+ observation stations they had reporting (203 as reported on pdf page 13; there are more after July 1878 due to more stations being created).
My semi-eidetic memory from about 35 years ago tells me that a number of daily station reports were attached to the main annual reports as addenda. Many of these followed the 7:00am, 2:00pm, and 9:00pm temperature reports on pdf page 429 (report page 424) required for filling in their Form 22. I’m not sure where the 7:30am and midnight wet/dry bulb measurements were reported.
As you’ve probably noticed, a lot of the stuff in the various monthly reports is Army in nature, and deadly dull unless you’re into genealogy, accounting, or information transmission via the earliest form of electronic (telegraphic) web. As you might expect, everything was backed up on paper in duplicate.
It will be a lot of fun running the crop reports in parallel to the monthly U.S. reported sunspot group/spot counts they included.
DUH!
That should be the 1877/1878 report.
My proofreading skills have declined the past couple of years, along with my memory.
Sigh.
Glenra,
In the 1877/1878 report, start on pdf page 233 (report page 228) for the daily max, min, and mean temps for Albany, N.Y., followed by Alpena, Mich., going along to Yuma, Arizona (Territory) on pdf page 339 (report page 394).
As one who is skeptical of many claims made by climate skeptics, I venture a guess that the improved data-sets will confirm what IPCC said in 2007: 20th century global warming is “unequivocal.” Also, I venture to predict that the current 21st century lull in global warming will come to an end, with temperatures increasing again, within the next few years.
That said, I would like to join with Steve et al. in celebrating an unambiguously positive development. Let the data take us where it may.
Curt, are you skeptical of the fact that climate models have no predictive value?
I thought that Gavin Schmidt had indicated that the recent global cooling was consistent with the model predictions.
Or in other words, the models are not wrong; they are useless.
Re: TAG (Feb 23 21:49), Although I don’t know, could these be new predictions adjusted after the cooling? That would say quite a lot about the whole methodology. Something awkward happens and it’s back to the drawing board to take account of the new circumstances. Useless, as you say.
I don’t agree that “climate models have no predictive value.” Actually several of the models (or their close relatives) are used routinely for weather forecasting. Of course weather forecasts are far from perfect, but they are better than the alternative of guesswork — even with experts doing the guessing. I suspect that a similar assessment will apply to the climate models when we are able to check their predictions a few decades ahead.
Meanwhile, I will try to avoid any final conclusions and focus on improving the data stream.
Curt, you know very well that weather forecasting models are updated regularly (about every hour), and couldn’t predict weather even a week in advance without updating.
With regard to GCMs, if they’re reliable, how was it that the HadCM3, for example, couldn’t pass a perfect model test?
By the way, Demetris Koutsoyiannis has already tested the predictions of 6 GCMs (3 TAR and 3 AR4 models) at about 55 locations worldwide, based on their simulations of the global 20th century climate.
They failed.
It’s quite true we can’t predict the details of weather more than a week or so in advance. But can we predict the generalities of climate change years to decades in advance? That to me is the big question, not completely answered yet.
Thanks for the pointer to K’s work. I notice that their model evaluation criteria are based on fairly small areas — usually individual weather stations, and never bigger than “sub-continental.” Poor climate model performance at these spatial scales is an old story, and quite consistent with IPCC reports if you read the fine print (a bit of which I have written myself).
I must bow out of this discussion now and get back to work on a project with Dick Lindzen. Steve knows how to reach me if truly pressing inquiries arise.
Let the data take us where it may, and let us hope that no one will be trying to get rid of the MWP.
Dear Dr. Covey,
I welcome your support for open data and methods, and hope you can encourage your colleagues in this approach.
Like you, I expect that a new data set will support “unequivocal” warming in the 20th Century, and probably cooling as well. In fact, I would not be surprised to find a new dataset would support the analysis Dr. Phil Jones made recently, that the warming in the period 1975-2009 was very similar to the warming in 1860-1880 and from 1910-1940.
meh, There’s nothing wrong with do-overs. Start here and move forward.
Remember Antarctica last winter. They only need a few stations on each continent. Then they can infill using that remarkable peer-reviewed Reggae program and we will have a grid full of data.
The Met Office proposal is for the UN to convene a workshop in which the key players would divide up the tasks to create an agreed-upon ensemble of data sets. Isn’t this quite distinct from the way that the IPCC operates in summarizing existing academic papers? One could well see this as a rejection of the IPCC and a call for the creation of a new organization that would not act as a passive synthesizer but as an active director of a worldwide effort for the study of climate.
This could be a call for the end of the IPCC and the directing role of academics in climate research. This would be a very good thing to happen.
I would not bet any money on the IPCC going anywhere just yet. It has far too much money and far too many resources tied up in it for them to simply replace it now, no matter how flawed it is.
I do not see anything which clearly addresses the station siting problems disclosed by surfacestations.org, nor do I see anything about standardization and calibration of equipment and recording methods/procedures.
Also, the entire focus is temperature. Why not record other variables that might have something to do with “climate change” such as relative humidity, dew point, atmospheric pressure, wind measurements, proximity to bodies of water, station altitude, etc.
Re: Dave L. (Feb 23 22:58),
I agree.
I reckon a lot could be achieved by publishing all the data and metadata, and letting the public in the localities give facts about stations to rule out useless and misleading ones. A lot is already available at WUWT etc.
It is unclear how far they are thinking of going, but this is a very good opportunity to upgrade the whole system with greater forethought. Other data (humidity, atmos pressure, wind speed/dir, dew point) would certainly give their monster computers something to chew on.
Coherent proper siting of long term rural stations could probably significantly reduce the total number of stations needed to give a more even global spread. Cost should be manageable and justifiable given what we are told is at stake.
You probably all know of the records posted by http://www.wunderground.com/wundermap/ (I can’t seem to make this an easy link – sorry; google them!). There are a lot of interested people out there, and one feels they might be quite interested in local scrutiny and checking what goes on at local ‘official’ stations.
Re: Dr Iain McQueen (Feb 23 23:49),
This suggestion has something of the open source component someone else mentioned purely on the data collection side. The homogenising routines would have to be separately controlled on the standard open source model. The advantage would be bringing bigger numbers of interested intelligent people into play, with their enthusiasm and considerable programming skills.
I am getting a bit fanciful, and I know little about it!
I, for one, would rather see an Open Source Temp Record developed by people and not a government agency. I do not believe the Met Office will construct a computer interface to analyze the data in all the ways people might like to analyze the data.
That said, the fact the Met Office is taking this step is a good sign because it is a direct attack on the lack of openness practiced by the pseudoscientists in climate science. This is a stinging rebuke of Phil Jones.
The Data should be collected and held by:
The UK Statistics Authority is an independent body operating at arm’s length from Government as a non-ministerial department, directly accountable to Parliament. It was established on 1 April 2008 by the Statistics and Registration Service Act 2007.
http://www.statisticsauthority.gov.uk/
Re: MarkR (Feb 24 02:08), you cannot access the raw data held by the UK Office for National Statistics, for reasons of commercial and personal confidentiality. All published data has been heavily processed.
The basis for the program should be the capture and storage of ALL meteorological data in a database open to public use for analysis. Note that all of the available data should be recorded in the public record.
This record should have a Quality Assurance program and Quality Control processes to verify the validity and calibration of the data. The total program of data collection should be subject to international audit.
All processes used to correct faulty data should be open and transparent and available for review.
If we do this much, the data will be available for analysis by anyone who wants to cherry-pick his own data. The analysis of the data is an entirely different program from the collection and maintenance of the database. This should be clearly stated, and it is not in the metproposal.pdf.
By the way, the word “robust” should be avoided in all these documents, since by now it is a special word for abused climate data.
My thoughts.
John Andrews, Knoxville, Tennessee
The Data should be collected and held by The UK Statistics Authority.
“The Data should be collected and held by The UK Statistics Authority”
Precisely: although I don’t think that the Office for National Statistics currently maintains any extra-UK data.
There is ample evidence that the CRU and the UK Met Office have been responsible for abysmal standards of data care in the past.
Presumably both realised this long ago and yet made no effort to remedy that situation.
Without new leadership and a total reorganisation neither can be trusted in the future.
Cheers and congratulations, SM, you have prevailed! A true Audit, indeed!!! 🙂
Steve, it seems that they have been listening to what you have to say. Congratulations, Ray
A few points at this time:
1. How long will it take to get the data out?
2. Will both raw and normalized data be made available?
3. Re: 2, will normalization algorithms be made public, “for rigorous peer review”?
I’m looking forward to taking a peek at the data myself.
-Sale
…but what value do you put on openness and transparency when the process is unlikely to be inclusive?
As we have seen in this debate, exclusivity, as practiced by the experts, has destroyed public confidence in science.
We are all stakeholders in this process and no one has the right to determine how we should think and behave on the value of science.
If all the data is made available, then there is nothing to stop the open source community from creating and maintaining an open source alternative, as with browsers and operating systems. In the future, governments might even abandon their versions for the public version.
Given all of the issues with the data – basic accuracy, TOBS corrections, and just plain missing data, it will be interesting to see the magnitude of the “OS community reviewed” error bars.
I’m also pleased to see that all the NOAA/GISS/CRU instrumented temperature data auditing efforts of Steve Mc, Anthony Watts, ChiefIO, Jeff ID, hpx83, Verity Jones, NicL, Lucia, Tamino, Chad, Nick Stokes, and Giorgio Gilestro, to name only a few, have finally paid off.
In particular I hope that once the global raw instrumented temperature dataset has been re-established, that full acknowledgement and appreciation is given to the significant warming and cooling trends of the 19th and 20th century.
I also hope that the current obsession with ‘anomalising’ and ‘gridding’ the data is dropped, as IMO it is not necessary to anomalise and grid the raw data to see the clear cyclic warming and cooling trends of the past two centuries.
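For anyone unfamiliar with the jargon, here is a rough Python sketch of the standard ‘anomalise then grid’ procedure I mean (illustrative only; not NOAA/GISS/CRU’s actual code):

    # Rough sketch of the standard "anomalise then grid" procedure.
    # Illustrative only; not NOAA/GISS/CRU's actual code.
    def station_anomalies(monthly, baseline_years):
        """monthly maps (year, month) -> deg C; subtract each calendar
        month's mean over the baseline period from every reading.
        Assumes every baseline year reports for the month in question."""
        anoms = {}
        for (year, month), temp in monthly.items():
            base = [monthly[(y, month)] for y in baseline_years
                    if (y, month) in monthly]
            anoms[(year, month)] = temp - sum(base) / len(base)
        return anoms

    def cell_average(anomaly_values):
        """Average the anomalies of all stations in one grid cell; cell
        averages are then area-weighted into a hemispheric or global mean."""
        return sum(anomaly_values) / len(anomaly_values)

Every reading is turned into a departure from that station’s own baseline before being averaged into cells; it is that intermediate step I would rather see set aside when simply looking for the warming and cooling cycles.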
I also hope that something is done about the post-1990 ‘station drop out’ issue, i.e. that the data is brought up to date, and that the ‘missing months’ issue is also dealt with in a much more appropriate manner. I also hope that all available temperature data for the rest of the world is found and added to the dataset, not just data post-1950 as at present in GHCN. I don’t believe for one minute that the sudden increase in reporting stations post-1950 was due to the construction of airports worldwide due to expansion in the aviation industry.
It’s only after the latter three issues highlighted above are dealt with properly that IMO we will then be able to properly assess whether or not the late (1970 to 2000) 20th century warming trend was any more significant than the 1910 to 1940 warming trend.
In doing so (i.e. ensuring that we can compare the two warming periods correctly), due allowance must be made for the effects of station moves, instrument changes, land use changes and urbanisation (UHI effect). It is not IMO sufficient to apply an ‘algorithm’ (as done at present by NOAA/GISS/CRU) to attempt to allow for these changes over time (i.e. to ‘homogenise’ the data); rather it is vital that sufficient meta-data be collected for each and every individual station (as done in the surfacestations.org project), which can then be used to make due allowance for the above-highlighted issues on an INDIVIDUAL station basis.
As a UK taxpayer I’ll be happy to fund and even participate in this process free of charge, provided it adheres to the conditions I’ve expressed above. Regardless of the costs of this project, it will be small beer in comparison to what we are currently spending on funding organisations like the Tyndall Centre to look at measures to mitigate and adapt to supposed man-caused climate change, when in fact at this point we are far from certain whether man has actually caused the problem, and for that matter whether it is even a problem at all.
Pay no attention to the man behind the curtain. Ignore the old data, missing software, and old reports. Look at this shiny new vapourware, which will be wonderful although it has not been created and no papers depend upon it.
Call me a skeptic, but I think that some of the above celebrations and congratulations are premature.
In the document there is no mention of raw data.
And there is no mention of ensuring station quality.
The two main issues have been ignored.
Furthermore, the people writing this document have already decided in advance what the results will be (“it is important to emphasize that we do not anticipate any substantial changes in the resulting global and continental-scale multi-decadal trends”).
No mention in the proposal of station meta-data on siting, moves, modifications, maintenance, etc. Also no mention of archiving original data collection forms and historical photographic evidence of microsite biases. And finally no mention of testing of temperature stations for microsite and UHI biases. Without these the quality of the new database will be in as much doubt as the current one(s).
If I remember correctly, there is a private weather organization in the UK that has a very good prediction record compared to the Met Office or CRU. They should be given the contract to build and prepare the database, since I would not trust either of the others.
Lofty goals. Here’s a nice summary of how some code rewrites have gone in the past. Given how many people/organizations are involved, I’d say get ready for this about the time the Himalayan Glaciers melt.
http://www.joelonsoftware.com/articles/fog0000000069.html
I think this whole business of trying to determine a “global temperature” is unattainable. We ought to focus on a global temperature index in the future. By that I mean selecting 1000 or 2000 points scattered about the globe and calculating their mean each month. I think one or two thousand measurement points would be representative, and would give us a good indication of what the globe is doing.
Instead of saying the global temperature went up last month, we ought to say that the global temperature index indicated an increase.
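A back-of-envelope sketch of such an index, with placeholder station IDs and readings:

    # Back-of-envelope sketch of a fixed-network temperature index.
    # Station IDs and readings are placeholders.
    FIXED_NETWORK = ["stn_0001", "stn_0002", "stn_0003"]  # in practice, 1000-2000 sites

    def monthly_index(readings):
        """readings maps station id -> monthly mean temperature (deg C).
        The index is the plain mean over the fixed network, and is left
        undefined if any station fails to report; no infilling."""
        missing = [s for s in FIXED_NETWORK if s not in readings]
        if missing:
            raise ValueError("index undefined; missing: " + ", ".join(missing))
        return sum(readings[s] for s in FIXED_NETWORK) / len(FIXED_NETWORK)

    print(monthly_index({"stn_0001": 14.2, "stn_0002": 9.8, "stn_0003": 21.5}))
    # 15.166666666666666

The virtue is that the index is only ever compared with itself, so no gridding or infilling games are needed.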
Clearly these games currently played by some institutes have only created a big mess thus far.
While we can all see some positive initiatives in the proposal, at this point in time it is merely a proposal.
In the preface of the remarks it is stated that little change is expected in the data and in the conclusions drawn from those data. The authors’ need for that prefacing is a bit disconcerting.
Also the reference to the three major temperature data sets and the apparent agreement bothers me when no reference is made to the fact that they all use the same raw data inputs.
My major issue with these data set owners is that they are not motivated to show the differences (and statistically significant ones) that exist in regions of the globe and over various time periods.
What is required is for all the data and methods used to be made public and then let the public/interested parties do what they may about them. We are much more likely to obtain reasonable assessments of the data from independent sources – and that is, of course, the primary argument all along.
I personally would like to see analyses that give a better estimate of the overall uncertainties (CIs) of the trends that are calculated from these temperature data sets.
Steve:
You should write up a short press release, immediately, and send it to the various media outlets. Your PR campaign is weak. You are continually libelled, maligned, and misrepresented. Every time the climate science community retreats to one of your positions, it’s an opportunity for you to say:
“I’m glad that, after six years, they finally agreed to accept my position. Ensuring that science critical to public policy is reproducible has been the focus of my work. Our understanding of climate history would certainly have been on a much firmer foundation, today, had climate scientists taken my proposals seriously. Instead of agreeing to constructive change, a number of scientists decided to waste much of their time misdirecting the debate by intentionally misrepresenting my actions and positions to their scientific colleagues and the press. I welcome the change in direction, but note that this mechanism will work better if [Policy X] is put in place.”
Nobody else is going to rehabilitate your reputation. The Team could adopt twenty of your positions, but nobody in the press knows the back story. Their take will be: “Unlike politicians, these wonderful scientists address their problems when they come to their attention. They probably could have done this years ago, if nutjob-deniers like Steve McIntyre hadn’t been wasting their time with countless FOI requests.”
This is a slow pitch across the plate; like a pitch in baseball, you only get to swing at the ball while it’s over the plate […that is, while the story is in the news cycle.]
From Fox:-
“150 climate scientists in the quiet Turkish seaside resort of Antalya, representatives of the weather office (known in Britain as the Met Office)”
I hear Blackpool is a bit of a bummer at this time of year, but this is what I call a jolly! I wonder who’s paying the carbon credits.
Curt, you wrote, “It’s quite true we can’t predict the details of weather more than a week or so in advance. But can we predict the generalities of climate change years to decades in advance? That to me is the big question, not yet answered definitively.”
If it’s true that weather models can’t predict more than a week in advance without updating, then what was the point of you citing them as examples of GCM-like models that work?
Clearly if weather models work only because of updating, then their success provides no reassuring analogy for climate models that cannot be updated.
Likewise, if GCMs are like weather models, and if weather models need constant updating to predict future weather a week in advance, then why isn’t it a good inference that similarly constructed climate models, unable to be updated, will not be reliable in predicting future climate?
Demetris’ second work also looked at continental scale predictions across the US. The models yielded poor results there, too.
I have looked into some of the details of the IPCC reports. The simulation errors of GCMs, as discussed in the Supplementary Chapters of the reports, clearly show that GCMs cannot accurately reproduce the energy fluxes of the climate.
You know those error magnitudes. As a good scientist, I can’t imagine how you can possibly conceive that those errors all cancel out, or that the models can be accurate despite energy flux errors far larger than the energetic consequences of increased CO2.
Finally, if it has not yet been “answered definitively” that GCMs can predict future climate, then what possible rational or ethical basis is there to stoke hysterical worry about future warming?
Scientists such as you should be stepping forward to repudiate the future climate certainties claimed by the IPCC and others.
Honestly, I plain don’t understand how you can possibly rationalize your silence in terms of professional ethics.
Not sure the Met Office can do this. Here is an extract from their “Values and Principles”.
(Link: values_principles.pdf)
• Complying with applicable environmental legislation and other requirements as appropriate to our business.
• Committing to the prevention of pollution and reduction of like-for-like CO2 emissions (including travel).
• Committing to continual improvement by minimising the environmental aspects associated with all activities, products and services of the business. As part of this the Met Office shall implement a comprehensive sustainable purchasing policy.
• Encouraging environmental awareness among those working for, or on behalf of, the Met Office, through effective training and communication.
• Promoting the effective use of resources by encouraging recycling and the re-use of materials.
• Adopting environmental best practice relevant to our business.
This shows the culture of the Met Office will not be open to anything that challenges these values.
The next extract is a little ironic!
And:
• Respect our customers and treat them honestly and with integrity. We will give full information about products and services and honour our promises. We will respond effectively to complaints.
• Respect and protect customer information, in line with the Data Protection Act, and respond effectively to requests in accordance with the Freedom of Information Act.
Cherries, anyone? Another data “trick” in Australia
Ken Stewart has done some great work with the raw data from Australia – yet another ‘smoking gun’…
http://kenskingdom.wordpress.com/category/uncategorized/
Why not make a suitable start and ensure that at EACH temperature measuring location, the temperature is recorded by TWO entirely different and independent methods situated in separate housings. That should get rid of “dodgy” data from any assigned location.
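A sketch of the cross-check this would allow; the 0.5 degree tolerance is purely illustrative:

    # Cross-check two independently housed instruments at the same site.
    # The 0.5 deg C tolerance is purely illustrative.
    def cross_check(sensor_a, sensor_b, tolerance_c=0.5):
        """Return the indices of days where the two instruments disagree
        by more than the tolerance; those readings get flagged as dodgy."""
        return [i for i, (a, b) in enumerate(zip(sensor_a, sensor_b))
                if abs(a - b) > tolerance_c]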
Then, because we are interested from the viewpoint of real science, Dave L. and Dr Iain McQueen have really hit the nail on the head by suggesting that just a temperature reading is, in scientific terms, totally inadequate since by the laws of physics, as we understand them, other attributes conspire to affect temperature.
Should we not also record and correlate whether and/or how long the particular sensing point has been subject to cloud cover, inclement weather or unadulterated sunshine? There are so many physical phenomena which affect temperature.
If we do the job properly, there will be no need to ‘frig’ any analysing computer programme (Processing “tricks” are out-and-out fiddled frigging!!).
But after all of that, does it really matter?
The planet’s climate has gone through cyclic changes for millions of years and I have no doubt it will continue long into the future when the present human race becomes extinct. Are we so convinced of our own immortality that we believe that we can stop the tides? Just because we think we have gained a few more (disputed) drops of scientific knowledge.
You may disagree with my premise, but please tell me if you do think that ensconcing people in sponsored organisations to find out whether it is going to snow or be sunny is a greater contribution to mankind than developing methods for feeding and quenching the thirst of a starving world – and I am far from being a paragon of virtue.
From the UK “Daily Telegraph”
http://www.telegraph.co.uk/earth/earthnews/7309688/Met-Office-to-look-again-at-global-warming-records.html
“The new analysis, that will take three years, will not only provide a more detailed picture of global warming but boost public confidence in the science of climate change.”
“Vicky Pope, Head of Climate Change Advice, at the Met Office, said the new global temperature analyses would not change the trend of global warming.”
So they have decided the result of this “new analysis” before it has even started.
No wonder public confidence in climate “science” is at an all-time low.
This is good, because it is well known among climate scientists that the average world temperature rise, based on the data used up until now, is an UNDERESTIMATE.
http://climateprogress.org/2010/02/25/met-office-re-examine-of-climate-data-temperature-record/#more-19990
Despite lip service to data transparency, the Met’s uncritical endorsement of “homogenization” – i.e., the transformation of measured values into manufactured ones – troubles me greatly. As long as the liberties taken with raw data are not curtailed, we will not know what the instruments are actually telling us. I propose that this whole process be called “pasteurization,” because that’s where they’ve been cooking the books via patent trend-management techniques.
5 Trackbacks
[…] CA covers it here. […]
[…] https://climateaudit.org/2010/02/23/met-office-proposes-verifiable-temperature-data-set/ […]
[…] brighter news Met Office proposes verifiable data set. Sounds good but we still have climate crooks running the show so of course I am skeptical. Theres […]
[…] another stunning vindication of Steve McIntyre, the Met Dept are proposing to take over global temperature data from the CRU. Steve has of course been railing for years about the sloppy, good old boys science in Jones’ […]
[…] Office want to put in place a collation of station data that is publicly accessible. As reported in CA and FoxNews, with the proposal here it looks like the Met Office wants to make a central repository […]