Underground Problems with Mann-Holes

Willis writes in:

While researching ocean drill cores at the WCDC, I stumbled across Mann’s borehole data. One of the proxies used for historical temperature reconstruction is "borehole temperature", the temperature down in the ground. In 2002, Michael Mann et al. published a study called Optimal surface temperature reconstructions using terrestrial borehole data. It is available here, for $9.00. In it they use all of the available weapons to construct the temperature proxy: EOFs, PCA, and of course that perennial favorite, "optimal fingerprinting", viz:

We employ a spatial signal detection approach that bears a loose relationship with "optimal detection" approaches used in anthropogenic climate signal fingerprinting [Mitchell et al., 2001]. In such "optimal detection" approaches, one seeks to identify, through generalized linear regression, the estimate of a target signal (as predicted by a model) in empirical data. Detection is accomplished through rotation of the empirical data, in EOF state-space, away from the direction of maximal noise (as estimated from, e.g., a control model simulation).

In our approach, an independent estimate of noise is not available. Rather, we employ an EOF rotation of the information in the borehole dataset toward an independent estimate of the target spatial SAT signal from the instrumental record, based on ordinary (potentially weighted) least squares spatial regression. Once an optimal rotation is found that provides maximal (and statistically significant) agreement between the spatial information in the borehole and instrumental record during the 20th century, the associated eigenvector rotation is used to project the estimated borehole SAT signal back in time.

So I decided to see how well this "optimal detection" works compared to plain old linear regression. I regressed several of their results against the HadCRUT3 temperature dataset. I compared my own area-weighted (cos(latitude)) average of their raw gridded data, their calculated area-weighted average of their "optimal" gridded data, and their final "optimal reconstruction". Here are the results:
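
For anyone who wants to follow along, the computation is roughly this, sketched in Python with made-up stand-in data (the variable names and numbers are placeholders, not my actual code or the actual datasets):

```python
import numpy as np

def area_weighted_mean(grid, lats_deg):
    """cos(latitude)-weighted average over gridcells.
    grid: (n_cells, n_years) array, one row per gridcell.
    lats_deg: (n_cells,) gridcell centre latitudes in degrees."""
    w = np.cos(np.radians(lats_deg))
    return (w[:, None] * grid).sum(axis=0) / w.sum()

# Stand-in data (the real inputs are the gridded borehole SATs and HadCRUT3):
rng = np.random.default_rng(0)
lats = np.array([2.5, 22.5, 42.5, 62.5])        # hypothetical cell centres
grid = rng.normal(size=(4, 81))                 # hypothetical 1900-1980 series
hadcrut = rng.normal(size=81)                   # hypothetical instrumental series

avg = area_weighted_mean(grid, lats)
slope, intercept = np.polyfit(hadcrut, avg, 1)  # plain old linear regression
```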

A couple of things to note. First, the difference between just regressing the plain old raw average and their "optimal" result amounts to 0.06 degrees five hundred years ago … seems like they wasted a lot of time checking for fingerprints.

Second, this is as boring a paleo record as I could imagine; very little detail.

Finally, the lack of detail points to a very interesting sleight-of-hand manoeuvre … notice the kinks in the borehole reconstructions every hundred years in the graph above? Those kinks are present in Mann’s paper as well, only he’s hidden them by squishing the graphs down flat. But once you know where to look, you can still see them. Take a look … the black arrows show the big ones. The 1900 kink is the most prominent one, but they’ve made it so flat that even that one is hard to see.

Why are there kinks in the reconstruction? Because they’ve infilled the data. In each grid cell, there are actually only six data points, one each for the years 1500, 1600, 1700, 1800, 1900, and 1980. The other 474 data points are just linearly filled in between those six points.
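
Here is a little sketch of that infilling (the six temperatures are invented, but the mechanics are the same), which also shows where the kinks come from:

```python
import numpy as np

# Six real data points per gridcell (the temperatures here are invented):
anchor_years = np.array([1500, 1600, 1700, 1800, 1900, 1980])
anchor_temps = np.array([-0.40, -0.45, -0.30, -0.20, 0.00, 0.50])

# Linear infilling to an annual series: everything between anchors is interpolated.
years = np.arange(1500, 1981)
annual = np.interp(years, anchor_years, anchor_temps)

# The year-to-year slope is piecewise constant, changing abruptly at each
# anchor year -- those jumps are the "kinks" visible in the graphs.
print(np.unique(np.round(np.diff(annual), 4)))
```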

Now, consider the true statistical significance of their data. Any average temperature will only have six data points … do they have a significant trend? No way: once you account for autocorrelation you’re down to only about three effective data points, and that’s one degree of freedom …
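
One common rule of thumb for this, sketched below, is the AR(1) adjustment n_eff = n(1 - r1)/(1 + r1); the numbers are invented, but persistence like this cuts six points down to roughly three:

```python
import numpy as np

def effective_n(x):
    """Effective sample size under an AR(1) assumption:
    n_eff = n * (1 - r1) / (1 + r1), with r1 the lag-1 autocorrelation."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    r1 = (x[:-1] * x[1:]).sum() / (x * x).sum()
    return (len(x) + 1) * 0 + len(x) * (1.0 - r1) / (1.0 + r1)

# Six centennial values (invented): persistence cuts them to roughly three.
print(effective_n([-0.40, -0.45, -0.30, -0.20, 0.00, 0.50]))
```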

None of their results have any statistical significance whatsoever. For the period 1900-1980, for example, they are regressing two data points (or EOFing and PCAing two data points) in each gridcell against the actual temperature.

Grrrr … the Mannomatic truly can chop, slice, and dice anything. Most likely, they’re now using the data from this "reconstruction" in their latest hockeystick emulation …

Questions? The data is here, read’em and weep …

Update:
Since there are only six data points per gridcell, we really don’t have enough data to say if there is a temperature signal or not. Not one of them has a statistically significant trend, either up or down; there’s simply not enough data.

Here’s a sample of a few of the boreholes …

These just happen to be the first ten gridcells in the dataset. Notice that each one only has six data points. Now, do you see a temperature signal in there?

I don’t think there’s much of one, in part because they are all so radically different. Look at gridcell 3: it has cooled 2 degrees over the period of the record … is that a temperature signal?

I also don’t think there’s a temperature signal there because of the monotonic nature of the signal, increasing every century. No other proxy dataset (including Mann’s ill-fated "hockeystick") shows a rise every century. All of them are coldest about 1700. So no, I don’t think there’s a discernible temperature signal there.

My graphs are just graphs, not sleight of hand. Mann never mentions in his paper that there are only six temperatures per gridcell, one per century, and he squeezed his graphs down flat so that nobody notices. That’s sleight of hand …

79 Comments

  1. Dave Dardinger
    Posted Oct 4, 2006 at 10:18 AM | Permalink

    Well, boreholes are a rather coarse proxy anyway. Any number of things (like flowing water, nearby intrusive magma, cracks, etc.) can cause non-signal info to be imprinted on the temperatures in the hole. Therefore, if I recall correctly, the data is rather massaged to produce a temperature profile. I’m not sure, from a statistical viewpoint, whether this makes the final century-averaged data more or less accurate than just calling it one point would indicate.

    By that I mean that if you have taken temperatures at many, many points down the borehole and then applied analysis as to what the likely temperatures are that would have produced the few reported readings, does the mere fact that so many observations went into the analysis give these centennial readings a higher likelihood of being correct than just one random real-world temperature reading would? (You might have set your thermometer out on a frosty winter morn in 1600 or a sunny summer afternoon in 1700.) I know that it’s the case in, say, coin flips, that the long-term average is superior, but is it the case in borehole temperature reconstructions? OTOH, if there were really only 6 readings for each hole and they were then analyzed to figure out what surface temperature was at a given time, then I’d agree that the conclusions are questionable.

    Getting back to my initial point. If researchers were to calculate separate values for each year based on the temperature gradient in the borehole it would be even more misleading than what they did because there simply isn’t that much resolution in the data. But they should make it clear that the calculations are only for an average for each century and why.

  2. Jean S
    Posted Oct 4, 2006 at 10:37 AM | Permalink

    You can download the article for free from here (Mann) or from here (Bradley). There is also an associated Correction about the wrong areal weighting. Isn’t it strange that certain people have made no fuss about this mistake… See also Pollack, H. N. and J. E. Smerdon, 2004: Borehole climate reconstructions: Spatial structure and hemispheric averages. Journal of Geophysical Research-Atmospheres, 109(D11), available from here.

  3. Steve McIntyre
    Posted Oct 4, 2006 at 11:31 AM | Permalink

    “Bore holes” are mostly mineral exploration drill holes. The calculations supposedly adjust for underground water, but having been underground in mines in the area of some of the drill holes, I question how valid the water adjustments are. It would make some sense to have a mining engineer familiar with the underground terrain opine on these reconstructions before reliance is placed on them.

  4. bender
    Posted Oct 4, 2006 at 11:33 AM | Permalink

    Re #2
    From the Pollack & Smerdon abstract:

    In the 5-degree grid employed for optimal detection, we find that the majority of grid element means are determined [in the Mann paper] from three or fewer boreholes, a number that is insufficient to suppress site-specific noise via ensemble averaging.

    Another example of uncertainty ignored, suppressed, or mishandled?

  5. bender
    Posted Oct 4, 2006 at 11:41 AM | Permalink

    From Mann et al.’s Figure 4, the boreholes in Finland and northwestern Canada seem not to be functioning as advertised. The 20th century trends actually oppose each other! I wonder why.

    Dendrochronologists take two cores per tree and dozens of trees per stand because they’ve studied ring-width variation at multiple spatial scales and know what it takes to squeeze out the random uncertainty.

  6. Posted Oct 4, 2006 at 12:52 PM | Permalink

    A borehole is a deep and narrow shaft in the ground used for abstraction of fluid or gas reserves below the earth’s surface. If the fluid reserve is under pressure, such as oil or gas, then extra machinery may be required. For water a special type of submersible pump is used to pump water up the rising main.

    A borehole is also a commonly used term in the environmental consulting and engineering industries, where it refers to a small-diameter hole drilled from the land surface to collect soil samples. Soil samples are then often tested in a laboratory to determine geotechnical properties or to assess levels of various chemical constituents. If the boreholes are installed for the purpose of assessing environmental conditions, they may be completed as a monitoring well to collect water samples.
    Jenn
    http://www.minus417.com

  7. Steve McIntyre
    Posted Oct 4, 2006 at 12:55 PM | Permalink

    #6. That’s not correct. Forget what you’ve cut and pasted. Most of the boreholes used for temperature estimation are from hard-rock mineral exploration.

  8. Pat Frank
    Posted Oct 4, 2006 at 12:59 PM | Permalink

    #4 — “suppress site-specific noise via ensemble averaging.”

    I wonder whether noise is the least of their problems. These sites, trees and bore-holes both, should be subject to locale-dependent phenomena. Some phenomena will be periodic, some episodic/sporadic, and some may be solitonic (one-time-happenstance). These will not cancel by ensemble-averaging, but will add and subtract in unpredictable ways. The composite “signal” will reflect all of these accidental juxtapositions of positives and negatives.

    I could be wrong here, but it seems to me that there is a peculiar tendency to treat these data as though they were some time-varying low-intensity electronic signal that includes lots of thermal noise. Average the signal over time and the noise averages out. But tree ring and borehole data are not like that at all. They’re not a mixture of cosines plus white noise. They’re a deterministic chaotic walk plus measurement errors.

    Any ‘low-frequency’ pseudo-period extracted from an ensemble average will just be an artifact of whatever positives and negatives that remain after averaging a bunch of uncorrelated data sets.
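
    A quick simulation makes the point; this is just an illustration with generic random walks, not any particular proxy:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_series, n_steps = 20, 500

    white = rng.normal(size=(n_series, n_steps))  # white noise: averages out
    walks = white.cumsum(axis=1)                  # random walks: they don't

    print(white.mean(axis=0).std())  # small, ~1/sqrt(20) of a single series
    print(walks.mean(axis=0).std())  # large: the ensemble mean still wanders
    ```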

  9. Ken Fritsch
    Posted Oct 4, 2006 at 1:20 PM | Permalink

    Re: #6

    The link included in your comment would indicate that it is not meant to be serious?

  10. Steve Sadlov
    Posted Oct 4, 2006 at 1:28 PM | Permalink

    There seem to be groupings of data. My question would be, what do the members of each set that seem to roughly trend together have in common? Maybe the areal distribution is poor and the holes are in areal clusters.

  11. Posted Oct 4, 2006 at 5:45 PM | Permalink

    In addition to #4, Pollack and Smerdon had a similar comment at a meeting in Nice:

    However, the borehole reconstructions use ensemble averaging (“stacking”) as an important filter for removing site-specific effects, and the number and geographic distribution of the boreholes are insufficient to determine robust averages in most five-degree grid elements. In a five-degree grid, 20% of the occupied grid elements contain only one borehole, and more than half contain three or fewer borehole sites.

  12. bender
    Posted Oct 4, 2006 at 5:51 PM | Permalink

    Re #8

    I wonder whether noise is the least of their problems.

    Why do you distinguish these effects from “noise”? Noise can be spatial or temporal.

  13. Pat Frank
    Posted Oct 4, 2006 at 6:31 PM | Permalink

    #12 — right; you’d know better than I. But to suppose that noise averages out on adding individual signals assumes some sort of Gaussian intensity distribution, allowing the noise to average to zero. If the chaotic excursions are non-random, they won’t average to zero.

  14. bender
    Posted Oct 4, 2006 at 6:59 PM | Permalink

    Re #13
    For a small sample I agree completely; and this looks to be a small sample. (With the usual caveats: I’m no expert on boreholes, hurricanes, pine trees, etc.)

  15. Posted Oct 4, 2006 at 7:20 PM | Permalink

    comment #6 was probably an automated link-spam.

  16. Willis Eschenbach
    Posted Oct 5, 2006 at 12:58 AM | Permalink

    Well, I decided to do my own borehole study. I put together a database of all the European borehole data, available here. It allows you to put in a location (latitude & longitude) and then you can sort the boreholes by distance from that point. For a point, I chose Oxford, UK, since it’s at the center of the CET. I figured that I’d average the boreholes, and compare the result to the CET. In the event, however, it wasn’t necessary. Here’s every borehole in the database within 400 km of Oxford …

    As you can see, the comparison with the CET wouldn’t be too meaningful. The average shows a total change of -0.11°C over four centuries, and rockets up to +0.08° by 1950 …

    Now, this shows that sites only a couple hundred kilometres apart, according to the borehole data, have diverged in average temperature by 3.5°C over 450 years … does anyone believe that?

    The more I look at the proxy data … the more aspirin I need …

    w.

  17. bender
    Posted Oct 5, 2006 at 1:04 AM | Permalink

    You’ve got to be kidding. Willis, would it be too much to also see the mean and standard error for the composite curve?

  18. Willis Eschenbach
    Posted Oct 5, 2006 at 3:47 AM | Permalink

    Sure, bender, here it is. The mean of the composite is -0.01, and the error bars show the 95% confidence interval.
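
    For reference, the composite and its interval were computed along these lines; this sketch uses stand-in numbers, not the actual borehole data:

    ```python
    import numpy as np

    def composite_with_ci(series, z=1.96):
        """Per-year mean across boreholes with a normal-theory 95% interval.
        series: (n_boreholes, n_years) array of temperature changes."""
        m = series.mean(axis=0)
        se = series.std(axis=0, ddof=1) / np.sqrt(series.shape[0])
        return m, m - z * se, m + z * se

    # Invented stand-in for the 13 boreholes within 400 km of Oxford:
    rng = np.random.default_rng(2)
    boreholes = rng.normal(0.0, 1.0, size=(13, 5))
    mean, lo, hi = composite_with_ci(boreholes)
    ```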

    w.

  19. Willis Eschenbach
    Posted Oct 5, 2006 at 4:34 AM | Permalink

    Well, building on my success with my last study, I decided to extend it to include all of the UK data … here it is, hot off the presses …

    Instead of 13 boreholes, as in the previous study, this one uses 22 boreholes without making much difference. Actually, there is a difference. The study with 13 boreholes showed a 450 year change of 0.08°C, while the new study shows only half of that, a 0.04°C change since 1500 …

    However, this study is preferable to the old one, because the confidence intervals are much narrower …

    w.

  20. BradH
    Posted Oct 5, 2006 at 4:39 AM | Permalink

    Just being a dumb lawyer, I wonder if someone can clear up a couple of points for me.

    First, I can’t for the life of me see how one can estimate past temperature from rock samples (unless it’s done by measuring isotope ratios of certain elements?). What is their rationale for suggesting that borehole measurements can be temperature proxies in the first place?

    Second, 500-600 years doesn’t seem very long on the geological scale of things. How deep do you bore to get a 500 year old sample?

    I must admit, this entire concept seems really left-field to me, but many of you seem to be discussing it as if it has at least some credibility (ie. you’re discussing its potential for accuracy, rather than veracity).

  21. Willis Eschenbach
    Posted Oct 5, 2006 at 4:56 AM | Permalink

    BradH, there’s a good description of the theory here.

    I started out to look at the accuracy … but having done my study of the UK boreholes, I’m starting to question the credibility. I can’t see it working at all … I mean, the borehole records just don’t seem reasonable, and some that are quite close to each other show totally different trends.

    w.

  22. BradH
    Posted Oct 5, 2006 at 5:05 AM | Permalink

    Cheers, Willis. I’ll take a look.

  23. Steve McIntyre
    Posted Oct 5, 2006 at 7:18 AM | Permalink

    Willis, the deconstruction of boreholes is long overdue. It would be worthwhile reporting on how these studies adjust for underground water. As I understand the methodology, underground water strata screw up the results. However there is lots and lots of underground water – pumping is a big expense at mines. The articles say that they adjust for major water, but there will be minor water as well. I can’t imagine how you can extract usable information out of this.

  24. Paul Linsay
    Posted Oct 5, 2006 at 7:22 AM | Permalink

    The other worry about borehole data not mentioned so far is that this is a diffusion process. As such, the signal decays exponentially with depth. In the famous case of an annual square wave temperature drive, the signal is attenuated by a factor of 16 at a depth of 4 m for typical soil properties. So at 100 m the signal is essentially attenuated to zero. This leaves me doubting that they are measuring anything to do with the surface temperature except near the top layer of rock.
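
    To put rough numbers on that attenuation, here is a sketch assuming a typical rock thermal diffusivity of 1e-6 m^2/s and a sinusoidal (rather than square wave) surface drive:

    ```python
    import numpy as np

    KAPPA = 1e-6      # thermal diffusivity of rock, m^2/s (typical textbook value)
    YEAR = 3.156e7    # seconds per year

    def attenuation(depth_m, period_yr):
        """Amplitude ratio exp(-z/delta) for a sinusoidal surface temperature;
        delta = sqrt(2*kappa/omega) is the thermal skin depth."""
        omega = 2 * np.pi / (period_yr * YEAR)
        delta = np.sqrt(2 * KAPPA / omega)
        return np.exp(-depth_m / delta)

    print(attenuation(4, 1))      # annual cycle at 4 m: heavily damped
    print(attenuation(100, 1))    # annual cycle at 100 m: effectively zero
    print(attenuation(100, 100))  # century-scale swing at 100 m: a few per cent left
    ```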

  25. BradH
    Posted Oct 5, 2006 at 7:30 AM | Permalink

    OK, to my lazy eye, there are some things I just can’t figure with this theory.

    First, the theory appears to be that there is a calculable and expected change in temperature with depth. The starting point for measuring departures from what would otherwise be expected is an assumption that, if surface temperatures were constant, and the undisturbed layers below a certain depth (right through the crust, mantle and to the core) are in a steady state and therefore radiating heat upward through the earth at a constant rate, we could calculate a gradient at which the temperature of the ground should change as depth increases (presumably, it first cools for a short distance, then starts to warm up, as internal conduction becomes predominant).

    The next step in the analysis is to assume that surface temperatures change (as we know they do, minute by minute), but that the Earth’s internal radiation remains in a steady state. Now, surely the Earth’s internal heat is not just variable under volcanoes and geysers. Is there any rationale for assuming that subsurface radiation is in a steady state? It obviously is far from a steady state close to volcanoes and geysers, but it beggars belief that it is constant elsewhere. [Especially since we’re trying to calculate average temperatures to a sensitivity of less than 1°C per century.]

    Now, if we’re looking at geological scales of 500-odd years, we’re probably not talking about very deep holes. So, the contribution of heat from the core of the Earth is probably not great. However, let’s not forget that we’re also talking about very small variations in “average” temperatures (less than 1°C over the past century vs. greater than 10°C variation from night to day, every day). Depending on the amount of heat contribution from down below, at these tiny centennial temperature changes, it could be significant. So, does anyone know whether someone has actually calculated how much heat does/should come from the core? If not, how can a “steady state” be presumed?

    My question regarding how deep these holes are cries out for an answer. However, I know that it will be, “it depends where you are”. Sometimes sediments are deposited quickly, but other times they are deposited slowly. So, this introduces another area of uncertainty – how do you adjust for different rates of deposition of sediments? You can’t just assume that 5 metres depth in one place represents year x in both location A and location B. Additionally, the sediments will be of different characters in every location (even those mere metres apart), so you can’t assume that each borehole will heat/cool at the same rates – it depends on the colour/density/thermal characteristics of the rock in that particular location.

    Which geological genius out there can determine, with the required accuracy, the heat retention/dissipation rates which are to be expected from, say, a vertical metre of “rock”, which consists of thousands of different minerals, mixed together in different combinations every cubic metre? Every vertical centimetre will have different heat retention/dissipation characteristics, depending on its physical makeup. Therefore, another of this theory’s “steady state” assumptions goes out the window – even if you assume that surface temp’s are the same and core radiation is the same, you can’t assume that heat changes will occur at a predictable rate as time goes on and depth gets greater (thereby giving you an accurate temperature proxy), because the radiative properties of the rock you are sampling are inevitably different, from metre to metre.

    Now, I would love for someone to help me understand this a bit better, because at the moment, it seems to me that this entire theory revolves around the following argument:-

    1. There are three variables in the temperature of the underground: surface radiation; core radiation; and rate of heat loss (initially), then gain (with depth).

    2. We can readily assume that radiation from Earth’s core is in a steady state. We can also assume that, in a given location, energy radiates to the surface at a constant rate over a period of at least 500 years.

    3. If we assume both 1 and 2 (ie. there are only 3 components; two are constant and one is variable), the only thing that can change and influence the temperature of the ground at depth is the surface temperature.

    4. We know the heat retention/dissipation characteristics of each metre of rock with sufficient accuracy to extrapolate climate trends of less than 1°C per century.

    Now, to my simple mind, all of this beggars belief. Would someone please point out what I am missing?

  26. charles
    Posted Oct 5, 2006 at 7:44 AM | Permalink

    #20, #23 borehole credibility

    I cannot understand how boreholes would have any credibility unless one knows the thermal conductivity of all the rock surrounding the borehole – all the way down. One must have to make gross assumptions. I doubt as a practical matter that the thermal conductivity of the surrounding rock can be known sufficiently (e.g. water is just one of the things that will affect the thermal conductivity).

    It seems a big stretch to glean historical temp out of boreholes.

  27. Dave Dardinger
    Posted Oct 5, 2006 at 9:00 AM | Permalink

    Brad, charles, etc.

    I’m going to stick my neck out and try recalling what I read about boreholes a few years ago. Anyone who has the true situation please correct me.

    The thing is that there’s no special “depth” at which the year 1500 temperature can be read. This is, as has been pointed out, a diffusion process, and heat flow moves in all directions, not just straight down. So if there were, for instance, a short very cold period after a very warm period, the upper layer would get cold, but there would be diffusion from the next lower layer which would “warm” it gradually. How quickly a change of the surface temperature could affect the lower layers is a function of the thermal conductivity of the rocks, but even that is, I believe, blurry in that the speed of heat moving down would be a function of the temperature differential. So if we’re talking, say, 10 meters from the surface, it might take 100 years for a constant 1 deg C warmer temperature to start affecting our test spot, but a 10 deg C cooler temperature might become obvious in 25 years. So mathematically something similar to a Fourier transform must be done to extract the signals from the actual temperatures at various depths.

    As for charles’ point, it may well be that in manipulating the data as indicated, the rock type becomes less important, though I’m sure it still has some effect on the results. But being more precise will depend on at least some of us reading articles or books on the subject and reporting back. (This is where the power of many minds can produce real results, as someone can be or become an expert on borehole math and present it to the group, where our stat experts or graph production expert(s) can run off and show us what that means.)

  28. Larry Smith
    Posted Oct 5, 2006 at 12:03 PM | Permalink

    It’s not just Mann doing this research, there’s a lot of information out there.

    Harris, R.N., and D.S. Chapman, Borehole temperatures and tree rings: seasonality and estimates of extratropical Northern Hemisphere warming, J. Geophys. Res., 110, F04003, doi: 10.1029/2005JF000303, 2005.

    Bartlett, M.G., D.S. Chapman, and R.N. Harris, A decade of ground-air temperature tracking at Emigrant Pass Observatory, Utah, J. Climate, 19, no. 15, 3722-3731, 2006.

    Hasterok, D., and D.S. Chapman, Continental thermal isostasy I: methods and sensitivity, J. Geophys. Res. (in review, July, 2006).

    Hasterok, D., and D.S. Chapman, Continental thermal isostasy II: application to North America, J. Geophys. Res. (in review, July, 2006).

  29. jae
    Posted Oct 5, 2006 at 12:22 PM | Permalink

    There’s a basic description of borehole theory in the NAS Panel report, if I remember correctly.

  30. Steve Sadlov
    Posted Oct 5, 2006 at 1:11 PM | Permalink

    RE: #26 – you have raised excellent points. Talk about a heterogenous situation. The thermal resistance is all over the map both in space and time.

  31. bender
    Posted Oct 5, 2006 at 3:01 PM | Permalink

    Re #18, #19
    LOL

  32. Hans Erren
    Posted Oct 5, 2006 at 3:09 PM | Permalink

    Chapter 4-14 of Geodynamics by Turcotte and Schubert is titled “Periodic Heating of a Semi-Infinite Half-Space”. It has an analytical solution: the temperature variation in the half-space is

    T = T0 + ΔT exp(-y √(ω/2κ)) cos(ωt - y √(ω/2κ))

    so a periodic surface signal is damped exponentially with depth y and phase-shifted as it diffuses downward.

  33. IL
    Posted Oct 5, 2006 at 4:21 PM | Permalink

    As a rough rule of thumb, the time taken for heat to be conducted in ‘bog standard’ rock is t ≈ 0.03 L² (t in years, L in metres). That constant of 0.03 is a combination of the heat capacity and thermal conductivity of ‘rock’ (mixing funny units, I know, but this is actually a handy little expression to know).

    Anyway, for a thermal pulse at a rock surface, after 500 years (say, since that’s the sort of timescale we are discussing here), it will have diffused about 100+ metres (this is a rule of thumb – Hans can do the analytical solution in #32!). That also means that the steady state thermal gradient in your 100m deep borehole will not be affected by heat sources or other effects deeper than 100m+ below the bottom of the borehole on time scales less than about 500 years. As was pointed out, heat conduction is a diffusional process, so say 1500-1600 was hotter and 1600-1700 colder than average: there will be a slightly hotter-than-average thermal gradient at the bottom of the borehole, followed by a slightly colder-than-average one above that, etc. The nearer the surface you get, the faster that rock level can respond to current temperature changes. I suppose the presumed beauty of this method is that it averages out temperature changes so that (supposedly) you can see slow century time scale changes that are not swamped by annual variability. That’s the theory.
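
    Plugging numbers into that rule of thumb (a quick sketch):

    ```python
    from math import sqrt

    # t ~ 0.03 * L^2, with t in years and L in metres:
    print(0.03 * 100**2)     # ~300 years for heat to reach 100 m
    print(sqrt(500 / 0.03))  # a 500-year pulse penetrates ~130 m
    ```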

    Like others, I find it not credible that average century differences of about 1°C can be accurately reflected 100m down a borehole without that signal being totally swamped by other factors such as water and variability in thermal conductivities (reflected in changes in that constant of 0.03, so that in some rocks the heat pulse goes faster and in some it goes slower; indeed, the diffusional speed will be variable within the borehole, with different lithologies down the borehole!). (Incidentally, an analogy with ice cores comes up here: once the borehole is drilled, it is open to air and water circulation from the surface, and these must surely introduce more variability in the walls of the borehole, which is where you measure the temperature!?)

    The proof of any hypothesis is in experimental verification. Has this EVER been done? What Willis shows in #18 and #19 looks like a powerful argument that the variables are too large and swamp any surface signal. Is it even possible to test borehole temperature data? As we have seen, it takes ~500 years to diffuse 100m, so to test the borehole thermal gradient experimentally we would have to know surface temperatures accurately over the last 500 years to .. oh dear

    (Anyone want to join me in a round of that circular song, There’s a borehole in the ground, dear Liza dear Liza, there’s a borehole in the ground, dear Liza, a hole.
    Well, measure it dear Michael, dear Michael …..)

  34. Steve McIntyre
    Posted Oct 5, 2006 at 4:29 PM | Permalink

    I can’t believe that any of these guys have ever been down in a mine.

    Also I think that some of these drill-holes will have been open for a considerable period of time before being surveyed by the borehole guys – perhaps even years. It’s not as though they are following the drilling crews around.

  35. TCO
    Posted Oct 5, 2006 at 4:34 PM | Permalink

    We ought to do verification tests by boreholing in the areas with the longest instrumental records. Could see if there is compatibility between early temp measurements and boreholometry. Even if you don’t trust early temp records, the result (agreement or disagreement) would still be an insight. Given the knowledge that this is being done for validating a method, vice for a specific result, some extra effort in statistics and in sampling could be justified, and this might make up for the limited extent of the historical record (300 years vice 500).

  36. BradH
    Posted Oct 5, 2006 at 4:48 PM | Permalink

    Thanks, all. It’s nice to know that others are just as puzzled by this as me. It would be interesting to know what the peer reviewers had to say about it all.

  37. Steve Sadlov
    Posted Oct 5, 2006 at 5:06 PM | Permalink

    RE: #32 – Hans, I fondly (or perhaps not so fondly) recall the problem sets in that book! 🙂

    One of these days I might subject myself to a tune up by redoing some of them ….. (!)

  38. DeWitt Payne
    Posted Oct 5, 2006 at 7:28 PM | Permalink

    Isn’t boring the hole going to disturb the temperature profile? The drill is doing work after all. Don’t you have to have some sort of drilling aid to lubricate and cool the drill as well as to carry off the residue? What’s that going to do?

  39. Joel McDade
    Posted Oct 5, 2006 at 7:30 PM | Permalink

    Well I’m somewhat miffed that we are getting into the periphery of my field of study and yet I still cannot contribute much to CA. As a ground water hydrologist I know all about hydraulic conductivity and nothing about the thermal conductivity.

    If we’re talking about the upper 100m, there is nothing homogeneous about the subsurface. Perhaps it is more homogeneous deeper, where water-bearing fractures (in hard rock) tend to be less frequent. If in a sedimentary environment, well, all bets are off.

    I’m sticking my neck out a bit here because I know next to nothing about borehole temperature reconstruction.

    Steve makes a point that these guys may come in years after the actual drilling. Are these still open hole ‘boreholes’? (Could only be possible in hard rock environs — everywhere else casing and well screen across specific zones would be required.) But if mining in hard rock areas, maybe they are open.

    Whatever the case, the boreholes will be filled with water. IMO the temperature of water at any point in the borehole would be dominated by proximity to a large water-bearing fracture, with its unique temperature. The water in that fracture will have moved up or down in the bedrock aquifer depending on the local or regional vertical hydraulic gradients. So I would favor hydraulic propagation of temperature rather than through thermal conductance of rock.

    Admittedly I don’t know the subject of borehole temps. I have done tons of testing for hydraulic properties of aquifers. They are usually so heterogeneous that I feel lucky if I am within an order of magnitude (and I never really know).

    I could well imagine that the thermal properties are NOT so variable, but I have learned over the years that, when it comes to the subsurface, any time some guy sticks a probe in the ground with a claim to precision, I really wonder about that claim.


  41. Hugo Beltrami
    Posted Oct 5, 2006 at 7:45 PM | Permalink

    Before the discussion goes any further into the void, it may be a good idea to have a look at some of the basic references on borehole climatology. There are a great number of papers on the subject, including some non-mathematical notes. A few of them are listed below. Some are introductory, others deal with some of the problems encountered in borehole climatology. Some of these papers are available online at the websites given below. Good reading.

    Climate Change from Earth Temperatures: The Big Integrator
    Hugo Beltrami and David S. Chapman
    http://esrc.stfx.ca/borehole/borehole.html

    Beltrami, H. and Chapman, D.S. (1994). Drilling for a Past Climate. New Scientist, 142, 36-40, April 23.

    Mareschal, J.C. and Beltrami, H. (1992) Evidence for Recent Warming from Perturbed Geothermal Gradients: Examples from eastern Canada. Climate Dynamics, 6, 135-143

    Shen, P.Y., Wang, K., Beltrami, H. and Mareschal, J.C. (1992). A Comparative Study of Inverse Methods for Estimating Climatic Histories from Borehole Temperature Data. Global and Planetary Change, 98, 113-127.

    Beck, A.E., Shen, P.Y., Beltrami, H., Mareschal, J.C., Safanda, J., Sebagenzi, S., Vasseur, G. and Wang, K. (1992) A Comparison of 5 different Analyses for Interpreting 5 Datasets. Global and Planetary Change, 98, 101-112.

    Beltrami, H. and J.C. Mareschal (1995). Resolution of Ground Temperature Histories Inverted from Borehole Temperature Data, Global and Planetary Change . 11, 57-70.

    A. Hartmann, V. Rath: Uncertainties and shortcomings of ground surface temperature histories derived from inversion of temperature logs, Journal of Geophysics and Engineering, 2, 299-311, doi:10.1088/1742-2140/2/4/S02, 2005

    England, A.W., X. Lin, J.E. Smerdon, and H.N. Pollack (2003), The influence of soil moisture upon the geothermal climate signal, IGARSS Proceedings, IEEE International, 1, 419-421. (pdf)

    Lin, X., J.E. Smerdon, A.W. England, and H.N. Pollack (2003), A model study of the effects of climatic precipitation changes on ground temperatures, Journal of Geophysical Research – Atmospheres, 108(D7), doi:10.1029/2002JD002872. (pdf)

    Beltrami H. and Mareschal, J.C. (1991). Recent Warming in Eastern Canada Inferred from Geothermal Measurements. Geophysical Research Letters, 18, 605-608.

    Beltrami, H. and Taylor, A. (1995). Records of Climatic Change in the Canadian Arctic: Towards Calibrating Oxygen Isotope Data with Geothermal Data. Global and Planetary Change, 11, 127-138.

    D. Mottaghy, V. Rath: Latent heat effects in subsurface heat transport modeling and their impact on paleotemperature reconstructions, Geophysical Journal International, 164(1), 236-245, doi: 10.1111/j.1365-246X.2005.02843.x, 2006.

    “Borehole Climatology websites with references”

    http://www.ldeo.columbia.edu/~jsmerdon/publ_jes.html

    http://esrc.stfx.ca/publicat.html

    http://www-users.rwth-aachen.de/volker.rath/publications.html

    http://www.coas.oregonstate.edu/index.cfm?fuseaction=faculty.detail&id=700&show=publications

  42. jae
    Posted Oct 5, 2006 at 7:57 PM | Permalink

    Better read that stuff Hugo posted, before commenting on this subject. Nice to have an expert visit!

  43. ET SidViscous
    Posted Oct 5, 2006 at 8:08 PM | Permalink

    “everywhere else casing and well screen across specific zones would be required.) But if mining in hard rock areas, maybe they are open. ”

    All open holes have to be capped and marked.

    No open hole can be left in the ground, it needs to be backfilled.

    Well, in the developed world at least; I don’t know about other places, but it is unlikely. Even a small diameter hole is dangerous.

  44. jae
    Posted Oct 5, 2006 at 8:10 PM | Permalink

    42: Yes, most are filled with bentonite.

  45. Joel McDade
    Posted Oct 5, 2006 at 8:57 PM | Permalink

    #42, #43 The upper part of a borehole can be cased off and the bedrock section can be left open indefinitely, provided of course it is not a flowing artesian well. My 80 year old mom has one in her backyard in North Carolina.

    #41 I am reading Hugo’s “Perturbation of ground surface temperatures reconstructions by ground water flow”. That seems most pertinent to my interests. Thanks for posting, Hugo.

  46. john lichtenstein
    Posted Oct 5, 2006 at 9:45 PM | Permalink

    Willis, I think the graphs would make more sense if the lines all hit 0 at present. Otherwise the impression is that there was some convergence in the past. Also, the error bars must be incomplete. It doesn’t make sense that the uncertainty goes down as we look back into the past. Is there any error estimate from the original data? I saw you post a defensible variance-adding formula the other day, so it would be great if you could apply that here.

  47. Steve McIntyre
    Posted Oct 5, 2006 at 10:40 PM | Permalink

    #41. Hugo Beltrami is a smart, serious and very charming guy who believes in boreholes. I don’t get it, but I haven’t spent much time at it. It’s nice that he’s dropped in; please pay attention to what he says.

  48. Willis Eschenbach
    Posted Oct 6, 2006 at 12:37 AM | Permalink

    John, the error estimate goes down in the past because there is less variance between the estimates in the past. I have presented these as they are typically presented, with the zero point in the past. However, you are likely right that I should put the current value at 0. That way, the error estimates would increase as we go back into the past.

    I am overjoyed that Hugo Beltrami has come to discuss this matter, as I have some questions about his text. First, thank you for the text, “Climate Changes from Earth Temperatures: The Big Integrator” (hereinafter “CCET”), it is quite fascinating.

    You present the following graph in your study:

    (ORIGINAL CAPTION) Figure 2. Borehole depth-temperature profiles for three regions of North America. (a) Alaska from the work of the United States Geological Survey.

    I looked at these and thought, “huh?”

    For one thing, all of the records go to a depth of zero, which I haven’t seen in other data. Also, all of them were very smooth, which is unusual for boreholes. So I went to the USGS, where the data for Alaska is kept. I could not locate the AWU data (nearest was “AWUN”) or the SB3 data (nearest was “SBE”), but I found the records for the other three. Let me look at them one by one, but first, note that at a depth of 60 metres, all of the records in CCET show a warming of about 1°C. Here are the records:

    CTD Site This is the only site where the data goes to the surface. The data collection method is also different, being listed as “thermistor string (frozen in place)”. Note how the surface is much colder in May than in July.

    Note also that this looks nothing like the CTD site data shown in CCET. There is no 1°C rise at 60 metres depth. The maximum rise is at 20 metres depth, and decreases above that. None of these features are visible in CCET.

    KAG Site This site is of interest because it was sampled five separate times over five years. Once again, none of these samples show a 1°C warming at 60 metres depth. Nor do any of them look like the drawing in CCET.

    Note that the borehole has cooled from the time of the first sample, presumably due to the exposure to the frigid air.

    WDS Site This site was also measured several times. Unfortunately, the first three measurements did not extend up near the surface. In any case, none of them show either the overall shape, or the 1°C warming at 60 m depth that is shown in CCET.

    Now, if the earth is in fact the big integrator, it seems impossible that these discrepancies would exist. Should we actually believe that the KAG site has warmed by 3° in recent history, while two other sites just a few miles away have not warmed appreciably?

    And if so, if these wide differences are verifiable results, what does it say about our ability to reconstruct areawide temperatures from these records?

    Dr. Beltrami, perhaps you could explain these discrepancies?

    w.

  49. Willis Eschenbach
    Posted Oct 6, 2006 at 4:12 AM | Permalink

    Well, more peculiarities … first I have to explain T_0. T_0 is the calculated long-term temperature average for the site. It is calculated by extending the trend of the deep temperature-vs-depth line up to the surface. Here is an example. I have added a light blue line to the graph to show T_0.

    Notice that the change in temperature is relative to T_0.

    John in #46 asked that I not start the temperature trend lines at 0 in the year 1500. So the question arises … where should I start them? Since the trends are relative to T_0, it is most reasonable to start them there. Here is the result:

    Now, looking at this, I noticed that the warmer a site started out (higher T_0), the more it seemed to tend to cool. So I looked to see if this was just an illusion …

    In fact, T_0 explains 58% of the variance in the temperature trend (p = .0001), which is very strange. Why should that be? Particularly since it breaks down that if T_0 is greater than 8°C, it’s more likely to cool, and if T_0 is less than 8°C, it’s more likely to cool … what’s up with that?

    w.

  50. Willis Eschenbach
    Posted Oct 6, 2006 at 4:15 AM | Permalink

    … posting late … it should say

    … if T_0 is less than 8°C, it is more likely to warm

  51. Willis Eschenbach
    Posted Oct 6, 2006 at 4:19 AM | Permalink

    … yes, it is late … in fact, it should say

    … if T_0 is greater than 9°C, it’s more likely to cool, and if T_0 is less than 9°C, it is more likely to warm …
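
    For anyone wanting to reproduce the T_0 in #49: it is just the intercept of a straight-line fit to the deep, quasi-linear part of the log, extrapolated to zero depth. A sketch with invented numbers:

    ```python
    import numpy as np

    # Invented deep section of a borehole log: depth (m) and temperature (deg C).
    depth = np.array([150.0, 200.0, 250.0, 300.0, 350.0, 400.0])
    temp = np.array([9.1, 10.4, 11.6, 12.9, 14.1, 15.4])

    # Fit the deep, quasi-linear part and extrapolate the line to zero depth:
    gradient, T0 = np.polyfit(depth, temp, 1)
    print(T0)        # long-term surface temperature estimate (the light blue line)
    print(gradient)  # geothermal gradient, deg C per metre
    ```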

  52. Dave Dardinger
    Posted Oct 6, 2006 at 9:19 AM | Permalink

    Interesting stuff, Willis. A guess on your last graph. Could we be seeing some sort of regression to a mean? That is, there’s some “ideal” temperature the boreholes are working towards, so if they’d been hot it was more likely they’d cool, and contrariwise if they were cool. I.e. it’s more a statistical effect than anything else.

  53. Steve Sadlov
    Posted Oct 6, 2006 at 9:26 AM | Permalink

    RE: “They are usually so heterogeneous that I feel lucky if I am within an order of magnitude (and I never really know).”

    Indeed.

  54. IL
    Posted Oct 6, 2006 at 1:08 PM | Permalink

    Thanks to Hugo for giving those references; I hope to comment more when I have more time. In the ‘Big Integrator’ article referred to by Willis in #48, it looks like the data for the graph referred to as Figure 2 in #48 comes from Lachenbruch and Marshall, Science, Vol 234, 1986, p690-696 (no references are given in the ‘Big Integrator’ pdf, but I found that article and that’s where it is).

    The work cited that is offered as experimental verification of the theory of borehole palaeotemperatures in the ‘Big Integrator’ article is the Chisholm work in Utah, which as far as I can tell must be JGR-Solid Earth Vol 97(B10) 14155 (1992) and Global and Planetary Change Vol 98, 1992, p269. Unfortunately electronic holdings of the latter two journals only go back to 1994, so it looks like actually going to the library! I can comment more when I have done so.

    The quick point I was going to make to Willis is that the Lachenbruch and Marshall data long precedes the data he has posted in #48, so does this mean that the early measurements are shown to be somewhat idealised and inaccurate compared to more modern data?
    (Willis, if you wish to have these articles and don’t otherwise have access, Steve has my private contact)

  55. Earle Williams
    Posted Oct 6, 2006 at 7:24 PM | Permalink

    Re #48

    A little background info on the USGS data from Alaska …

    I don’t know about all the Alaska sites identified in the Big Integrator, but the holes north of the Brooks Range are mostly from oil exploration in the 1960s and 1970s. The USGS site at http://esp.cr.usgs.gov/data/bht/alaska/ provides the data that Willis is referencing. The upper portion of these North Slope holes has been filled with diesel fuel to prevent icing. The J. W. Dalton #1 well (east of the WDS, West Dease #1) was recently plugged and abandoned, as it was at risk of breach due to coastal erosion. If my rememberer is functional, several hundred gallons of diesel were pumped out of the hole.

    A USGS publication analyzing much of this data can be found at:

    Deming, David, Sass, J.H., and Lachenbruch, A.H., 1996, Heat flow and subsurface temperature, North Slope of Alaska, in Johnsson, M.J., and Howell, D.G., eds., Thermal evolution of sedimentary basins in Alaska: U.S. Geological Survey Bulletin 2142, p. 21-44.

    http://www.dggs.dnr.state.ak.us/scan2/b/text/B2142.PDF (be warned, 8 MB file)

  56. Pat Frank
    Posted Oct 6, 2006 at 9:52 PM | Permalink

    This may be relevant to the borehole evaluations discussed here. This site: http://www.smu.edu/geothermal/BHT/BHT.htm is to an extended abstract of a poster presented at the AAPG Meeting, Dallas, Texas, April 2004, poster title: “Calibration of the AAPG Geothermal Survey of North America BHT Data Base,” presented by David Blackwell and Maria Richards, SMU Dept of Geological Sciences, Dallas, TX.

    The poster itself is available as a pdf at the site. Blackwell and Richards show that a regional geothermal gradient, not a general geothermal gradient, must be corrected out to remove a bias in error with temperature gradient from the data. The corrected errors ranged from ±5°C out to ±20°C, but showed no bias with gradient.

    Comparing those errors with Willis’ first three plots in this thread shows us why the divergence among measurements is so high. As depth = years in Willis’ plots, I wonder how or whether the geothermal gradient bias was evaluated and removed from those data.

    I went to the American Association of Petroleum Geologists (AAPG) web-site to look at the basic reference for the work, but the AAPG Bulletin requires a member log-in. However, I did discover a wonderful opportunity for you, Willis. The AAPG is encouraging us to “Be a Citizen Scientist”. They want to “engage students and the public in conducting real ‘citizen science’ research” so that we can “discover the Earth sciences and promote responsible stewardship of the Earth.”

    How much more encouragement and evidence of openness to outsider input does anyone need than that? I think you should submit your work as a letter to the AAPG Bulletin, Willis, citing their encouragement.

  57. john lichtenstein
    Posted Oct 6, 2006 at 10:04 PM | Permalink

    Bender do you see now why people leave error bars off of charts?

    Willis, if it’s SOP to use T_0 then go with that. But we know the temp now; it’s the past where we are estimating.
    And did the datasets come with any guidance on the precision of the estimate? The folks who do these studies must report error, right?

  58. bender
    Posted Oct 6, 2006 at 10:20 PM | Permalink

    Re #57 Yes. They are only trying to cut down on the use of toxic inks in order to save the planet.

  59. Willis Eschenbach
    Posted Oct 6, 2006 at 11:25 PM | Permalink

    A Tale of Two Boreholes

    Well, in my further investigations of the borehole mysteries, I chanced across an interesting paper about boreholes in Utah by Robert N. Harris and David S. Chapman, entitled Borehole Temperatures and a Baseline for 20th-Century Global Warming Estimates. Their abstract says:

    Lack of a 19th-century baseline temperature against which 20th-century warming can be referenced constitutes a deficiency in understanding recent climate change. Combination of borehole temperature profiles, which contain a memory of surface temperature changes in previous centuries, with the meteorological archive of surface air temperatures can provide a 19th-century baseline temperature tied to the current observational record. A test case in Utah, where boreholes are interspersed with meteorological stations belonging to the Historical Climatological Network, yields a noise reduction in estimates of 20th-century warming and a baseline temperature that is 0.6° ± 0.1°C below the 1951 to 1970 mean temperature for the region.

    OK, sounds good. Looking into their boreholes, however, reveals an interesting story.

    First, the data. Most of the datasets are located here. Here are a couple of the records, as analyzed by the University of Michigan.

    When I first saw these two, I thought “huh?” I thought maybe there was some mistake, that maybe they’d analyzed one set of records twice with different assumptions or something. But looking further, I found that these boreholes are only three miles (5 km) apart, so then it made sense.

    So I looked at another pair of borehole records.

    Now, you may ask, what’s odd about these two borehole reconstructions? What’s odd is that these two are only five miles (8 km) apart. Eight km, and one says almost no warming 1900-1950, while the other has warmed a full degree in 50 years. They start and end together, and in between they diverge by a full degree?

    Does that make sense? What kind of change in the climate could cause that?

    Then, they compare these borehole records to local surface air temperatures (SAT) from the US Historical Climate Network (USHCN). They identify the six local temperature stations that they say are comparable to the boreholes. So I got those records as well. Most of them agree quite closely with the figures reported in their paper, except one. That one is way off.

    They show data from the 1890s for Bluff, while the USHCN records only start in 1911. And more to the point, while all of the other stations warmed over the period of record, Bluff actually cooled. Nor is this a trivial error, as the actual Bluff records are very different from both the “Bluff” record they quote, and from the other SAT records they use. The main difference is that they adjust the borehole records based on the modern SAT records, and the modern records they use are incorrect … which brings all of their conclusions into question.

    But the main mystery to me is that two boreholes 5 miles apart can be so different … what’s up with that?

    w.

  60. BradH
    Posted Oct 7, 2006 at 4:52 AM | Permalink

    Thanks for your participation, Hugo. I defer to your expertise and I have read a few of the papers (those I’m able to understand, to some extent, anyway).

    I must admit that they theory has a certain logic and attraction to it. However, there are still two points which occur to me.

    First and foremost is the question of margin of error. When arguing over whether or not temperatures are increasing due to human activity, we are talking about less than 1°C over a century or more. It seems a big call (to me) to assert that this method can resolve such small variations, over such long time-frames.

    Second, most of the graphs seem to show a dramatic increase from around 1900 onwards. Now, when any drillhole is drilled, the drilling equipment creates an enormous amount of heat at the surface. Modern drilling technology has only allowed us to drill deep holes in remote locations from around the start of the 20th century (when we perfected the internal combustion engine). I wonder whether some of the heat signal being detected from recent times is the heat from the boring process itself. If this were the case, I wonder if the same techniques used for boreholes could be applied to some of the old hand-mining shafts.

  61. Eduardo Zorita
    Posted Oct 7, 2006 at 8:28 AM | Permalink

    #60

    The methodology itself is able to retrieve this 1-K temperature change in the past few centuries, at least with “synthetic” boreholes created from simulated surface temperature data.

    F. Gonzalez-Rouco,H. Beltrami, E. Zorita, H. von Storch. Simulation and inversion of borehole temperature profiles in simulated climates: Spatial distribution and surface coupling. Geophys. Res. Letters 33, L01703 (2006).

    Real boreholes, however, contain other sources of noise (vegetation changes, subsurface water, etc.) as has been discussed here previously, and probably the uncertainties are larger than in an ideal exercise.

  62. fFreddy
    Posted Oct 7, 2006 at 9:13 AM | Permalink

    Re #61, Eduardo Zorita

    … and probably the uncertainties are larger than in an ideal exercise.

    Do you believe that the uncertainties are small enough that real world boreholes give useful information about past climate?

  63. TCO
    Posted Oct 7, 2006 at 9:29 AM | Permalink

    Does the simulation use an idealized thermal conductivity? I’m not talking about water, but just about variation in the thermal conductivity of the rock as one goes down, as well as how accurately a given hole’s thermal conductivity can be measured.

    Nevertheless, your post is helpful to at least show why there is some promise in using the big integrator. It boggles the mind a bit to think that rock can show temp changes of a degree when the overall temp is wiggling around from day to day … but when you think about it, it sort of makes sense in the big integrator method.

    P.S. Good job on actually writing and publishing a science PAPER.

    P.P.S. Solid state chem is the true religion. You heretic.

  64. Posted Oct 7, 2006 at 9:35 AM | Permalink

    #61. Eduardo, do you have a free reference to the equations that are used for the inversion of observed borehole temps to surface temps? I am interested in any assessment on the stability of these equations. Is it possible the sensitivity of the equations to parameter values leads to these differences: e.g. the inverse of the difference of large numbers or something similar? Cheers
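
    To make the stability question concrete, here is a minimal sketch (the textbook half-space step-response model, not necessarily the exact equations in the papers, with invented depths and step dates) of how one might check the conditioning of the forward problem:

    import numpy as np
    from scipy.special import erfc

    # Half-space forward model: a unit surface step applied t years ago
    # perturbs the profile at depth z by erfc(z / (2*sqrt(kappa*t))).
    kappa = 1.0e-6 * 3.15e7        # diffusivity: 1e-6 m^2/s ~ 31.5 m^2/yr

    z = np.arange(20.0, 520.0, 20.0)              # measurement depths, m
    steps = np.array([50.0, 150, 250, 350, 450])  # years before logging

    # One column per step date: the profile anomaly from a unit step
    A = np.column_stack([erfc(z / (2.0 * np.sqrt(kappa * t))) for t in steps])

    print("condition number:", np.linalg.cond(A))

    The larger that condition number, the more any noise in the measured profile is amplified into swings in the recovered surface history, which would be one mundane way to get very different inversions from nearby holes.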

  65. TCO
    Posted Oct 7, 2006 at 9:48 AM | Permalink

    Great insight for something to look at, Dave.

  66. Eduardo Zorita
    Posted Oct 7, 2006 at 10:15 AM | Permalink

    #62-63-64

    Yes, I believe that this is the case for large-scale temperatures.

    Yes, the simulation uses an idealized thermal conductivity. The work was devoted to testing the mathematical methodology in an idealized, albeit reasonably complex, world.

    You can find good descriptions of the equations and the inversion methods in the references provided earlier by Hugo, who is the real expert in the field.

  67. Ken Fritsch
    Posted Oct 7, 2006 at 12:21 PM | Permalink

    Re: #49

    In fact, To explains 58% of the variance in the temperature trend (p = .0001), which is very strange.

    Your graphs and criticisms remain clear and easy to understand, Willis E, but to pick a nit, your graph shows an R^2 = 0.3386, which means that approximately 34%, not 58%, of the variation is explained by To. I would think that r = 0.58 in this case.

  68. Willis Eschenbach
    Posted Oct 7, 2006 at 1:28 PM | Permalink

    Re 67, the variance is the square root of the R^2 statistic.

    Thanks,

    w.

  69. Henry
    Posted Oct 7, 2006 at 5:27 PM | Permalink

    On 67 and 68, I am with Ken in his nit-picking: the square root of r^2 is the “correlation coefficient”, but the share of variance explained is r^2 (unless, as Steve McIntyre pointed out in another thread, you do something strange like scaling, when it could be something odd like 2r-1).
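
    For anyone who wants a two-line check (made-up numbers; only the r-versus-r^2 relationship matters):

    import numpy as np

    x = np.array([1.0, 2, 3, 4, 5, 6])
    y = np.array([1.1, 1.9, 3.4, 3.6, 5.2, 5.8])   # invented data

    r = np.corrcoef(x, y)[0, 1]      # correlation coefficient
    print("r   =", round(r, 4))
    print("r^2 =", round(r * r, 4))  # share of variance explained

    With R^2 = 0.3386 as on Willis’s graph, about 34% of the variance is explained, and the correlation coefficient is sqrt(0.3386), roughly 0.58.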

  70. Willis Eschenbach
    Posted Oct 7, 2006 at 7:03 PM | Permalink

    Re 69 and 67, you are correct, my bad.

    w.

  71. Willis Eschenbach
    Posted Oct 8, 2006 at 12:11 AM | Permalink

    MORE TWINS

    Having found the problems with nearby boreholes, I decided to see what others were around. I calculated the distance between all possible European hole pairs and searched for pairs less than 10 km apart (a sketch of the pair-finding step is at the end of this comment). Here’s the first pair I found:

    You can see the problem. Here’s the second pair I encountered:

    Same thing, they diverge badly. But then I hit a cluster of five adjacent boreholes in Czechoslovakia. One of the borehole temperatures dropped so far that I had to increase the scale. Here are the five boreholes:

    Nor are these isolated instances … here are three boreholes from Rumania, all within 1 km of each other:

    Now, I have to confess, I’ve reached the end of my understanding here. Dr. Hugo or somebody knowledgeable from the borehole world has to explain this to me. We have five boreholes in Czechoslovakia. Two of the boreholes, which are within one km of each other, have diverged from each other by more than three and a half degrees, one warming and one cooling. We have three boreholes in Rumania, all within one km. One shows warming, one shows cooling, and one is about level. My questions are:

    1) If we are making a temperature history of one of these spots … which borehole should we believe?

    2) Given these records of nearby sites, what uncertainty should we associate with any single borehole record?

    3) When one borehole warms and an adjacent one cools, do we say the site is warming, or cooling?

    4) Mann’s reconstruction (above) shows a total change in 450 years of 0.6°C, with a 95% confidence interval of 0.2°C. The 95% confidence interval at the one single spot on the earth with the five borehole records is 1.2°C, not counting whether any of these measurements accurately represent reality. Thus, the single borehole record uncertainty must be larger than 1.2°C, perhaps much more. And the uncertainty must increase when we try to estimate the global temperature from very uncertain individual borehole records. So are Mann’s results believable?
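
    For those who want to check my pair-finding, here is that step as a sketch (assuming a list of (name, lat, lon) tuples read from the borehole metadata; this is a cleaned-up illustration, not my working script):

    import numpy as np

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance in km between two points given in degrees
        R = 6371.0  # mean Earth radius, km
        p1, p2 = np.radians(lat1), np.radians(lat2)
        dl = np.radians(lon2 - lon1)
        a = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
        return 2.0 * R * np.arcsin(np.sqrt(a))

    def close_pairs(holes, max_km=10.0):
        # holes: list of (name, lat, lon); returns pairs closer than max_km
        out = []
        for i in range(len(holes)):
            for j in range(i + 1, len(holes)):
                d = haversine_km(holes[i][1], holes[i][2], holes[j][1], holes[j][2])
                if d < max_km:
                    out.append((holes[i][0], holes[j][0], d))
        return out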

    w.

  72. Hans Erren
    Posted Oct 8, 2006 at 4:15 AM | Permalink

    Willis,

    Borehole temperature is strongly dependent on geology and geohydrology. Are you sure the wells you are comparing are comparable?

  73. BradH
    Posted Oct 8, 2006 at 4:40 AM | Permalink

    Re: #61

    Thanks Eduardo,

    Your answer goes directly to my questions about the sensitivity of the method, given the small changes which modellers and reconstructionists are attempting to capture.

    No silver bullet, then.

  74. TCO
    Posted Oct 8, 2006 at 7:21 AM | Permalink

    Willis, I think the basic question comes down to something like inherent variability and how uncertainty is affected by that. If you are really interested in a specific site (and the variability is just noise), then you need to average a lot of holes. Maybe on a larger scale, some of this averaging takes care of itself. (Not sure that you need to subaverage for locations.) The more troubling thing is: does the large variation indicate some form of non-random noise, such as experimental method or water, that throws the whole method into doubt?

    Nevertheless, I think the starting point is to look for the basic analysis of the sample numbers required for a certain accuracy. What has been published on that? I would think there would be something analogous to the papers on tree rings that say how many times you need to core an individual tree or a stand to get a particular accuracy. Those exist, right? Seems basic, no?
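
    Just to make the sample-size question concrete, here is a back-of-envelope sketch (plain t-interval arithmetic with an invented between-hole scatter, nothing borehole-specific):

    from scipy import stats

    def n_required(s, E, conf=0.95):
        # Smallest n with t-based half-width t * s / sqrt(n) <= E
        n = 2
        while True:
            t = stats.t.ppf(1.0 - (1.0 - conf) / 2.0, df=n - 1)
            if t * s / n ** 0.5 <= E:
                return n
            n += 1

    # If nearby holes scatter with s ~ 1.5 C and we want +/- 0.2 C:
    print(n_required(s=1.5, E=0.2))   # on the order of two hundred holes

    If the scatter Willis is showing is anywhere near real, the number of holes needed per site is far beyond what the archive has.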

  75. Steve McIntyre
    Posted Oct 8, 2006 at 8:15 AM | Permalink

    Willis, can you give a URL for the borehole data archive? I’ll insert it in the head post.

  76. Willis Eschenbach
    Posted Oct 8, 2006 at 1:46 PM | Permalink

    Re #75, Steve M., the main borehole data is located at http://www.ncdc.noaa.gov/paleo/borehole/core.html.

    The Alaskan borehole data is located at http://esp.cr.usgs.gov/data/bht/alaska/.

    Mann’s borehole data is in the Mann2003 folder at http://www1.ncdc.noaa.gov/pub/data/paleo/borehole/.

    The Utah boreholes are found in the “North America” section of the main borehole site, at http://www.ncdc.noaa.gov/paleo/borehole/nam.html

    And the !@#$%^&* link button isn’t working for me …

    re #72, Hans, you say:

    Borehole temperature is strongly dependent on geology and geohydrology. Are you sure the wells you are comparing are comparable?

    I would assume that those effects would be considered, and these variations removed, during the analysis of the data. However, the graphs I’ve been showing you are the analysed, 100-year averages of the results …

    Finally, TCO, thanks for your contribution. You say:

    Willis, I think the basic question comes down to something like inherent variability and how uncertainty is affected by that. If you are really interested in a specific site (and the variability is just noise), then you need to average a lot of holes. Maybe on a larger scale, some of this averaging takes care of itself. (Not sure that you need to subaverage for locations.) The more troubling thing is: does the large variation indicate some form of non-random noise, such as experimental method or water, that throws the whole method into doubt?

    Nevertheless, I think the starting point is to look for the basic analysis of the sample numbers required for a certain accuracy. What has been published on that? I would think there would be something analogous to the papers on tree rings that say how many times you need to core an individual tree or a stand to get a particular accuracy. Those exist, right? Seems basic, no?

    Unfortunately, for most specific sites, there is only one borehole record in a 5° x 5° gridcell, so it’s not possible to “average a lot of holes”.

    I have not yet found the paper you described that sets out the accuracy of the record. Dr. Beltrami found good results in Canada, for example … but what we need is not that, but an analysis of bad results to let us know the uncertainties. I haven’t seen that yet.

    My thanks to all,

    w.

  77. Ken Fritsch
    Posted Oct 8, 2006 at 2:20 PM | Permalink

    Re: #74

    Nevertheless, I think the starting point is to look for the basic analysis of sample numbers required for a certain accuracy. What has been published on that? I would think that in analogy to papers on tree rings that say how many times you need to core an individual tree or a stand to get a particular accuracy. Those exist right? Seems basic, no?)

    From the near-proximity twin and multiple-hole graphs Willis E has presented here, I would think that, as he has indicated, one needs more information to explain these differences before doing more statistics on the collection. Perhaps an expert can explain the differences and may have accounted for them a priori, but I doubt that a good statistician or scientist would proceed beyond this point without that.

  78. bender
    Posted Oct 8, 2006 at 2:40 PM | Permalink

    This borehole reconstruction data looks very shaky, perhaps even worse than the tree ring reconstructions. Now you see why I’m anal retentive about accurate estimates of precision on all measurements and the publishing of standard errors on all parameter estimates. When the uncertainty is this large, I have little confidence in the conclusions. But what is even worse is when the uncertainty is not addressed at all. Then I start to become suspicious of the field as a whole. There are only two reasons for not discussing uncertainty: (1) you don’t know how to properly estimate it and you don’t want to expose your ignorance, or (2) you don’t want to admit how large it is because it disfavors your pet hypothesis. Experience has shown me that scientists who measure things precisely are always eager to show off the narrowness of their confidence intervals.
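
    The arithmetic is cheap enough that there is no excuse. A minimal sketch of the standard error and 95% confidence interval for a handful of nearby-hole trends (the five numbers are invented for illustration, not taken from the archive):

    import numpy as np
    from scipy import stats

    trends = np.array([1.8, -1.7, 0.4, 0.9, -0.2])   # degC per hole, made up

    mean = trends.mean()
    se = trends.std(ddof=1) / np.sqrt(len(trends))   # standard error of mean
    t = stats.t.ppf(0.975, df=len(trends) - 1)       # two-sided 95%, 4 dof
    print("mean = %.2f, 95%% CI = +/- %.2f degC" % (mean, t * se))

    With scatter like the twins above, that interval comes out well over a degree, which is exactly the kind of number that should be printed next to every reconstruction.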

  79. Posted Oct 10, 2006 at 9:09 AM | Permalink

    I’m sincerely enjoying the discussion of borehole usage as a proxy for past atmospheric temps. But is the amount of intellectualizing on this very specific subject more a reflection of the fact that we [I mean the USofA] spend way too much money on AGW? Have we not reached a point where we can begin to clean things up a bit and stop spending so much effort and money on what appears to be an all too obvious windmill?

    In addition to the borehole data concerns, there are concerns with how air temperature and proxy data from radiosondes, satellites, surface thermometers, oxygen isotopes, element ratios from tests, etc. are manipulated. And that’s not to mention the GCMs, and how they seemingly overuse CO2 as a general predictor of near-future air temps while leaving out clouds, aerosols, etc.

    I understand that CO2 actually absorbs energy over a fairly limited range of the electromagnetic spectrum. If that’s truly the case, then it seems to me that there should be more focus on just how much we can expect air temps to increase given a certain increase in CO2.
