"Unprecedented" in the past 153 Years

De’ath et al (Science 2009; article here, SI here) received a considerable amount of press at the start of 2009. De’ath et al reported that there was an “unprecedented” decline in Great Barrier Reef coral calcification:

The data suggest that such a severe and sudden decline in calcification is unprecedented in at least the past 400 years

A climate scientist not involved in the study said that the findings were “pretty scary”. And so on. An occasional Australian reader brought the data set to my attention the other day. There are some interesting aspects to the data set (a collated version of which I’ve placed online in tab-separated csv form here). Original data is online at ncdc/paleo.

[Update; see also https://climateaudit.org/2009/06/12/a-small-victory-for-the-r-philosophy/]

Here is an excerpt from De’ath et al Figure 2 showing the “scary” decline in calcification that the scientists have alerted the public to, about which they observed:

since 1990 [calcification] has declined from 1.76 to 1.51 g cm−2 year−1, an overall decline of 14.2% (SE = 2.3%). The rate of this decline has increased from 0.3% per year in 1990 to 1.5% per year in 2005.


Figure 1. Smoothed calcification for the 20th century.


Figure 2. Smoothed calcification 1572-2005.

I thought that it was a little surprising to see the presentation of trends calculated over such short periods, a practice much criticized in the blogosphere, but I guess that the PRL (Peer Reviewed Literature) – or, at any rate, Science – takes these concerns less seriously. The heavy smoothing is also troubling – Matt Briggs won’t be very happy.

To see the impact of unsmoothed data, I did a simple plot of the average calcification by year over the data set. I understand that the coral data spans a considerable length of time and that various sorts of adjustments might be justified, but it’s never a bad idea to plot an average. Here are two plots, showing a simple average, first from 1572-2005 and then in the 20th century. Based only on the first plot, one could not say that even the 2004-2005 results were “unprecedented in at least 400 years” – values in 1852 were lower. So I can confirm that the values before adjustment are unprecedented since at least 1852.
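For anyone who wants to check this at home, the averaging is a couple of lines in R. This is just my quick look using the collated csv linked above; nothing here purports to reproduce De’ath et al’s processing:

Data=read.csv("http://data.climateaudit.org/data/coral/GBR.csv",sep="\t",header=TRUE)
annual= tapply(Data$calcification, Data$year, mean, na.rm=TRUE) #simple average by year, no adjustments
plot(as.numeric(names(annual)), annual, type="l", xlab="Year", ylab="Calcification (g/cm2/yr)") #cf. Figure 3 below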

Visually, this graph looks to me like calcification has been increasing over time, with a downspike in 2004-5, but, as my critics like to observe, I am not a “climate scientist” and therefore presumably unqualified to see the downward trend that was reported by the Wizards of Oz.


Figure 3. Average Calcification by Year.

Here’s a similar plot for the 20th century, also showing the count of sites – which has declined sharply in the past 15 years, with only one site contributing nearly all the 2005 values. Curiously, there is a very high correlation (0.48) between calcification and the number of measurements available in a year. The unsmoothed data gives a very different impression than the Science cartoons. Unsmoothed, years up to 2003 were not particularly low; it’s only two years – 2004 and 2005 – that have anomalously low values. But it seems a little premature to conclude that this is a “wide-scale” trend as opposed to a downspike – downspikes occur on other occasions in the record.
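The count and the correlation come from something along these lines (continuing from the snippet above). I’m counting distinct core ids per year, which is my bookkeeping choice, not necessarily theirs:

count= tapply(Data$id, Data$year, function(x) length(unique(x))) #distinct cores reporting in each year
annual= tapply(Data$calcification, Data$year, mean, na.rm=TRUE)
cor(annual, count, use="complete.obs") #correlation between the yearly average and the coverage
plot(as.numeric(names(count)), count, type="h", xlab="Year", ylab="Cores per year")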


Figure 4. Average Calcification by Year, 1900-2005. Also showing the number of sites (2 in 2005).

Now there may well be sensible adjustments to the average. The cores don’t come from the same site and the average latitude may vary. The average age of a core varies by year. These are the sorts of problems that dendros deal with – an analogy that the authors don’t mention.

The authors do not actually show averages but “partial effects” plots obtained from the lmer program in the R package lme4. As it happens, I’ve used this very package to make tree ring chronologies – I think that I’ve illustrated this in some older posts. So I think that it’s fair to say that there probably aren’t a whole lot of readers who are better placed than I am to try to figure out how their adjustments are done. But as so often in climate science articles, the methodological descriptions don’t permit replication. I’ve spent some time on this and don’t understand how they constructed their model and, if I can’t, I can’t imagine that many readers will be able to. For example, given the many forms of structuring a linear mixed effects model, in my opinion, the following statement is only slightly more informative than saying that they used “statistics”:

The dependencies of calcification, extension, and density of annual growth bands on year (the year the band was laid down), location (the relative distance of the reef across the shelf and along the GBR), and SST were assessed with linear mixed-effects models (16) [supporting online material (SOM)].

Nor does a statement like the following give usable information on how they actually did the calculation:

Fixed effects were represented as smooth splines, with the degree of smoothness being determined by cross-validation (17 – Wahba, 1983).

It would be different if the Supplementary Information laid things out, but it doesn’t. lmer has a huge number of options; without a script, one is just guessing. I can’t imagine that there will be many readers of this article who are both interested in the subject matter and as familiar with the tools as I am. I couldn’t figure out how their model was specified from their material and don’t have enough present interest in the topic to try to reverse engineer it.
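To illustrate the point, here are two structures that are both consistent with the quoted description but are not the same model. These are guesses using variables in my collation; neither is necessarily what they did:

library(lme4)
Data=read.csv("http://data.climateaudit.org/data/coral/GBR.csv",sep="\t",header=TRUE)
Data$yearf=factor(Data$year)
#Guess 1: latitude and an aging term as fixed effects, year as a random intercept
fm.a <- lmer(calcification~ lat+ exp(-age)+(1|yearf), data=Data)
#Guess 2: the same fixed effects, plus a per-core random intercept
fm.b <- lmer(calcification~ lat+ exp(-age)+(1|yearf)+(1|id), data=Data)
anova(fm.a, fm.b) #different structures give different fits; nothing in the article says which (if either) was used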

In any event, regardless of their claim to have “cross-validated” the smoothing, it sure looks to me like a couple of low closing values have been leveraged into a downward trend, when they might simply be a downspike (or even due to limited sampling). I haven’t vetted the calculation of trend standard errors, but they look to me like OLS trends without being adjusted for autocorrelation (but that’s just a guess). The leverage of a couple of closing values on the illustrated trend is reminiscent of Emanuel’s bin-and-pin method (which he recanted), in that a couple of closing values appear to exert an undue amount of influence.
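As a quick illustration of the sort of adjustment I have in mind (not a vetting of their actual calculation), here is the usual effective-sample-size correction applied to an OLS trend on the post-1990 annual averages computed above:

annual= tapply(Data$calcification, Data$year, mean, na.rm=TRUE)
yrs= as.numeric(names(annual)); recent= (yrs>=1990)
fit= lm(annual[recent]~ yrs[recent]) #naive OLS trend over 1990-2005
se.ols= summary(fit)$coef[2,2] #OLS standard error of the slope
rho= acf(resid(fit), plot=FALSE)$acf[2] #lag-1 autocorrelation of the residuals
se.adj= se.ols*sqrt((1+rho)/(1-rho)) #crude AR1 adjustment of the standard error
c(se.ols, se.adj)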

The existence of a positive correlation between calcification and SST resulted in some intriguing contortions. The authors note that “Calcification increases linearly with increasing large-scale SST”, but nonetheless interpret the present results as evidence that “the recent increase in heat stress episode is likely to have contributed to declining coral calcification in the period 1990-2005”. Go figure.

The data seems rather thin as a basis for concluding “unprecedentedness” and surely it would be prudent for worried Australians to take a few more coral samples.

Interestingly, we’ve previously encountered coral data from the Great Barrier Reef, as it was one of the proxies in MBH98, where it was held to be “teleconnected” to weather phenomena throughout the world and used as a predictor of NH temperature. If you don’t believe me, look at the SI here.

For some reason, the authors have failed to address the important issue of teleconnections between their coral calcification data and NH climate. In particular, using the technical language preferred by the Community, the inverse of the 2004-2005 coral calcification spike is “remarkably similar” to the 2004-2005 spike in Emanuel’s Atlantic hurricane PDI. We’ve had difficulty locating proxies for Atlantic hurricane activity and it looks like we may finally have found one. I recommend to the authors that they forthwith join forces with Mann and Rutherford for the purposes of carrying out a RegEM calculation combining Atlantic hurricanes and Great Barrier Reef coral calcification, as it is likely that this will obtain a skilful reconstruction of Atlantic hurricane PDI. (In order to encourage the authors in this long overdue study, I will waive any obligation on their part to acknowledge that they got the idea from Climate Audit.)

UPDATE June 4: As an exercise, I’ve done some mixed effects models of this data using lmer. CA readers, you can do what peer reviewers at leading science tabloids don’t – you can actually do some lmer runs on the data (I don’t for a minute believe that any Science reviewers bothered). Here are 6 lines of code that load the data, do an lmer run on a model with two fixed effects (age and latitude), and extract a random effect for the year.

library(lme4)
Data=read.csv("http://data.climateaudit.org/data/coral/GBR.csv",sep="\t",header=TRUE); dim(Data) # 16472 9
Data=Data[Data$year<2006,] #removes a singleton
Data$yearf=factor(Data$year) #year as a factor, so lmer can use it as a grouping variable
(fm1 <- lmer(calcification~ lat+ exp(-age)+(1|yearf), data=Data)) #fixed effects: latitude and aging; random effect: year
chron1= ts(ranef(fm1)$yearf,start=min(Data$year) ) #the year random effects are the "chronology"
ts.plot(chron1)

I’ve done some experiments and show the effect of the above calculation, as well as the effect of not having an aging factor. In effect, this model “adjusts” for varying contributions from latitude and age. This is not the same model as used by De’ath et al – I don’t know what their model is, but this is a fairly plausible model and one that I’d consider before feeling obliged to use De’ath-type splines.
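For reference, the “no aging factor” variant in the figure below is simply the same call with the exp(-age) term dropped (again, my model, not theirs):

fm2 <- lmer(calcification~ lat+(1|yearf), data=Data) #same as fm1, without the aging term
chron2= ts(ranef(fm2)$yearf, start=min(Data$year))
ts.plot(chron1, chron2, col=1:2) #with (black) and without (red) the aging factor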


Figure 5. Random Effects Model for GBR Calcification.

As one reader observed, there is a very limited population of reefs that have values after 1991 and before 1800. In fact, there are only two such reefs: ABR and MYR. To try to preserve a bit of homogeneity in the data set, I did the same procedures using the restricted data set from only these two sites. This yielded the following random effects chronology, plotted here with counts, showing how limited the data is before 1800 and in 2005. Interestingly, this one looks like a classic Mann hockey stick up to 1980, followed by a sharp decline. Maybe they teleconnect with Graybill strip-bark bristlecones, rather than Atlantic hurricanes.
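For the record, the restricted run is the same script on a subset. One caveat: I’m assuming a site identifier column, which I call “reef” below; substitute whatever the reef/site field is actually called in your copy of the collation:

#CAUTION: "reef" is my placeholder name for the site identifier column
Sub= Data[Data$reef %in% c("ABR","MYR"),]
Sub$yearf= factor(Sub$year) #re-factor to drop unused year levels
fm3 <- lmer(calcification~ lat+ exp(-age)+(1|yearf), data=Sub)
chron3= ts(ranef(fm3)$yearf, start=min(Sub$year))
count3= tapply(Sub$id, Sub$year, function(x) length(unique(x))) #core counts, to show the thin coverage
ts.plot(chron3)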

58 Comments

  1. Posted Jun 3, 2009 at 9:18 PM | Permalink

    Fascinated to find at last a critique of the De’ath, Lough et al paper in Science.

    I had a brief exchange of emails with Janice Lough in February, but for some inexplicable reason she never replied to my last; extracts follow:

    Dear Janice…Here’s your (with De’ath) Abstract in Science:
    “…We investigated 328 colonies of massive Porites corals from 69 reefs of the Great Barrier Reef (GBR) in Australia. Their [sic, as only 12 reefs with data extending beyond 1990 are archived (as of Feb 09)] skeletal records show that throughout [sic] the GBR [not true, as the only 12 reefs with post-1990 data are all between 18 and 22°S.], calcification has declined by 14.2% since 1990 [not true for the 69 reefs, because only 12 reefs’ data extend beyond 1990], predominantly because extension (linear growth) has declined by 13.3%. The data suggest that such a severe and sudden decline in calcification is unprecedented in at least the past 400 years [not true, as only one of the 12 archived reefs with post-1990 data has a life extending for 400 years]”.

    “So yes, your joint paper does state in its Abstract that its results come from 400-year data for 328 colonies in 69 reefs, and it repeats this claim in the text, p.119, final para: ‘..our data show that growth and calcification of massive Porites in the GBR are already declining and are doing so at a rate unprecedented in coral records reaching back 400 years’. When only one of the archived data sets with data post-1990 does reach back 400 years this is a gross exaggeration.”

    “I really do wonder if scientists now have any basis for being considered to have greater integrity than one finds amongst investment bankers, but I do know from experience that your paper would not pass muster with the ASX as a prospectus for a share issue. That is why I hope you and your co-authors will write to Science correcting these misleading statements, as I would prefer not to have to”. Can’t think why I never got a reply!

    It is also worth asking why the archived data sets end in 2001 at the latest when De’ath et al say they sampled the 13 (sic) reefs in 2005. The reason is obvious, as in 2001 the GBR was still recovering from the 1998 El Nino bleaching event, while by 2005 it had almost fully recovered, so by then they would have had no evidence at all. Also to be noted is the paper’s curious method of doing growth rates: they seem to have avoided using log-linear fits, which for the 12 archived data sets extending beyond 1990 show 6 reefs with greater calcification etc and six with less.

    • Posted Jun 3, 2009 at 9:24 PM | Permalink

      Re: Tim Curtin (#1),

      They all go to the same school. Great letter.

      The first graph looks like they don’t know how to handle the endpoint of a heavy filter – or maybe they do.

  2. Steve McIntyre
    Posted Jun 3, 2009 at 9:25 PM | Permalink

    I don’t agree with some of your assertions here. I don’t see any statement that they sampled 13 reefs in 2005. I received data (by email from an Australian) that included 2005 data. So I don’t think that you’re right on this count. (I’ve placed this data online.) If you criticize people, it’s very important not to make mistakes as any mistakes are always used to discredit or ignore correct points – I’d urge you to make sure that each of your points is correct.

  3. Posted Jun 3, 2009 at 9:48 PM | Permalink

    I think the endpoint drop is created by parametric splines. I’ve only worked with them for graphics but the endpoint was always a knot.

    • Steve McIntyre
      Posted Jun 3, 2009 at 9:55 PM | Permalink

      Re: Jeff Id (#4),

      It was created using splines. They say this and the SI has a couple of lines of code that indicate this, but the lines drop out of outer space. Author De’ath has written on “principal curves” – which I take to be principal components on functions, but I haven’t studied the topic (nor do I plan to in the near future).

  4. Andrew
    Posted Jun 3, 2009 at 10:43 PM | Permalink

    Haha! Now this is good for a laugh! Great work dissecting this one – the methodological vacuity seems astonishing, but then again, hey…

    By the way, any indication what those gray/blue bounds are supposed to be? They look suspiciously familiar apart from the obvious reversed nature.

  5. Posted Jun 3, 2009 at 11:20 PM | Permalink

    Steve; I accessed the De’ath-Lough et al archived data sets for their paper on or about 12th January 2009 at:
    http://www.ncdc.noaa.gov/paleo/

    I downloaded it into Excel. All I could find was one sampled reef (out of the claimed 69 sampled in 2005) with a 400-year history to 2001. Only 12 of the 69 as of 12 January 2009 had data beyond 1990, and none beyond 2001. Truly they are of the MBH school!

    Have they archived some more, perhaps in response to my emails to De’ath & Lough? Perish the thought!

    I agree re correct citation etc. You have mine, their paper’s claim that 69 reefs have data that go back 400 years, and the link to their archive showing only one as of January 2009.

  6. JS
    Posted Jun 3, 2009 at 11:57 PM | Permalink

    There may be some confusion about data going on.

    The data that Tim seems to be talking about and seems to be the most relevant data at the ncdc/paleo website is ftp://ftp.ncdc.noaa.gov/pub/data/paleo/coral/west_pacific/great_barrier/aims60core.xls. It ends in 2001. This data is different to that which you have Steve (if only because of the end date). I have not investigated whether the data you have is obtainable through a different link or what the commonality between these two is and so on and so on.

  7. O Weinzierl
    Posted Jun 4, 2009 at 1:57 AM | Permalink

    From my point of view, especially the sharp decline in samples used for interpretation should be criticised. Only a few samples are used to find the recent calcification rate, so that rate might not be representative of the real state of what’s going on in the reefs.

  8. braddles
    Posted Jun 4, 2009 at 4:47 AM | Permalink

    On a point of order, they didn’t claim that the recent levels were the lowest for 400 years, but that the decline (change in levels) was unprecedented.

    • Steve McIntyre
      Posted Jun 4, 2009 at 5:09 AM | Permalink

      Re: braddles (#12), I don’t think that this nuance matters – the downspike in 1852 looks like the same order of magnitude.

  9. Posted Jun 4, 2009 at 5:24 AM | Permalink

    Steve: I am surprised at you. I have checked every site in your link, and they match exactly what I found on or about 13 January 2009, so you are quite wrong when you say at #8 “I don’t think that there is currently an issue on data”. You have NOT checked. In fact only 12 of your listed sites have data beyond 1991, and only one of those goes back 400 years, contrary to the 69 sites claimed by De’ath Lough et al in Science. NONE of their sites as reported by you has data beyond 2001, yet they claimed to have sampled all 69 in 2005 and that ALL had 400 years of data across the length and breadth of the GBR, when in fact only one did, and both it and the other 11 with data beyond 1990 are in a narrow band of the southerly GBR. AIMS Townsville and its associates like the Guldbergs have a long history of hockey stick data mining; I am astonished you defend them.

    – snip. I ask people to be polite here and to avoid adjectives as much as possible.

    • Steve McIntyre
      Posted Jun 4, 2009 at 7:03 AM | Permalink

      Re: Tim Curtin (#14),

      Tim, I’m not saying that the data is archived properly – I said that I hadn’t verified the email version that I received against NCDC. That climate scientists should have inadequate archives is hardly news here.

      Unfortunately, you always have to consider the possibility that they haven’t archived the “right” data before drawing too strenuous a conclusion. I haven’t parsed the NCDC data, but you may well be right about that version. That doesn’t mean that there isn’t 2005 data in another version – the one that I have.

      Yes, I agree that the “right” data should be archived. I’ve made that case about as vigorously as anyone can wish. But that doesn’t mean that there isn’t any 2005 data.

  10. RomanM
    Posted Jun 4, 2009 at 5:32 AM | Permalink

    The link for the csv file seems to be garbled in my browser.

    • Bob Koss
      Posted Jun 4, 2009 at 7:34 AM | Permalink

      Re: RomanM (#15), the .csv extension usually indicates comma delimited, but not always. The columns are tab delimited. Maybe your browser is expecting commas?

      I was able to load it right into Open Office Calc which allowed me to select tab for the column delimiter. Another way that worked for me is to save it to disk. Load it into a text editor, then copy/paste it into a spreadsheet using the tab delimiter.

      • Steve McIntyre
        Posted Jun 4, 2009 at 7:56 AM | Permalink

        Re: Bob Koss (#23),

        If I hear that CA readers are using spreadsheets instead of R, I’ll stop placing csv files online. 🙂

        The csv files are not there for CA readers from whom I expect more; they are there for visitors.

        • Bob Koss
          Posted Jun 4, 2009 at 8:15 AM | Permalink

          Re: Steve McIntyre (#25), Heh. 🙂
          Old-timers disease has been setting in for a few years now. No new tricks for this old dog. I find more frequent mistakes creeping into the old tricks these days. I only do it for some mental exercise in an effort to delay brain atrophy as much as possible.

      • RomanM
        Posted Jun 4, 2009 at 8:00 AM | Permalink

        Re: Bob Koss (#23),

        Bob, the problem was not “reading” the data, but “finding” it. The “link” in the original post does NOT show the file location at all because it seems to be mistyped. If you notice, I did manage to locate it and I put a link to the file in comment 15. PaulM had the same problem finding it in comment 20.

        Yes, the data is tab separated. My copy of Excel figured that out and separated it using “text into columns”.

        • Bob Koss
          Posted Jun 4, 2009 at 8:30 AM | Permalink

          Re: RomanM (#26), I just found the broken link to which you were referring. I misunderstood. I’ll assume Excel is some version of R, since that’s what is used around here. 🙂

        • Steve McIntyre
          Posted Jun 4, 2009 at 8:35 AM | Permalink

          Re: RomanM (#26),

          Roman, sorry about the bad link. Fixed.

  11. Steve McIntyre
    Posted Jun 4, 2009 at 5:34 AM | Permalink

    For readers interested in getting a foothold on lmer, here’s an interesting post that not many people paid attention to, in which I did a rather pretty emulation of Hansen’s “reference method” for amalgamating gridcells (his Step 3) in a couple of lines using lmer.

    In another post, I discussed the use of the related program nlme to make “conventional” tree ring chronologies and had promised to show how to do RCS chronologies using mixed effects models, but didn’t.

    There are a lot of parallels between tree ring chronologies and calcification time series. De’ath et al refer to “ontogeny” as a complicating factor – dendro detrending for age is their version of dealing with ontogeny.

    In that respect, it would be interesting to experiment with my dendro chronology programs using calcification data and see what happens.

  12. Bob D
    Posted Jun 4, 2009 at 6:17 AM | Permalink

    There does seem to be an issue with the data as published on the NOAA website – perhaps they archived the wrong data. However, it seems likely the data sent independently to Steve is the correct data as used in the paper, since the graphs clearly show 2005.

    I was having a play with Steve’s data, and I confess I would battle to draw strong conclusions about recent trends from these data sets, simply because they peter out dramatically after about 1991. Here is another graph showing this:

    The data from each site shows this as well (possibly the world’s ugliest graph):

    There is also, by the way, a slight trend by latitude, reinforcing the “higher calcification rate in warmer waters” conclusion.

    • Steve McIntyre
      Posted Jun 4, 2009 at 6:56 AM | Permalink

      Re: Bob D (#17),

      Bob, that looks a lot like a tree ring graph. I posted up graphs that looked like this when we were doing more tree ring stuff a couple of years ago.

  13. Posted Jun 4, 2009 at 7:27 AM | Permalink

    Steve, please could you fix the link to your csv file and clarify exactly where you got your data from and how your notation relates to the notation on the NOAA page? There does seem to be some confusion and it looks to me like the NOAA data only goes to 2001. Perhaps there is more data somewhere else? It seems clear that you have all the data they used, since the size of your file matches the 16472 figure given in the SI.

    Not for the first time, one wonders why, if this is such a crucial indicator of climate change, they have stopped recording the data. But like you I’m not a climate scientist and I’m sure there is an excellent reason for it that I am too dim to understand.

    • Steve McIntyre
      Posted Jun 4, 2009 at 7:49 AM | Permalink

      Re: PaulM (#20),

      The command below works fine for me.

      Data=read.csv("http://data.climateaudit.org/data/coral/GBR.csv",sep="\t",header=TRUE)

      I collated the data from 9 different csv files sent to me from an Australian scientist, who can identify himself if he wishes. I think that he got the data directly from the authors and so it may not reconcile with the NOAA version. It would be annoying if there are different versions of the data floating around, but this would not be, shall we say, “unprecedented” in climate science.

      De’ath et al say:

      The composite data set contains 16,472 annual records, with corals ranging from 10 to 436 years in age, most of which were collected in two periods covering 1983–1992 and 2002–2005.

      To check my collation: I’ve got exactly 16472 rows, so we’re apples and apples here.

      dim(Data) #16472 9

      Not everything in the above statement matches, though. I calculated the ages reached by each coral – this is the sort of thing that is ridiculously easy to do in R:

      range(tapply(Data$age,Data$id,max) ) # [1] 6 416

      So the range in the dataset that I have is from 6 to 416 years, as opposed to the 10 to 436 years reported in the article. Is this because there’s still another data version floating around, or because of careless errors in the Science article inserted to baffle the unwashed? Dunno. There are other similar small inconsistencies.

  14. Posted Jun 4, 2009 at 7:31 AM | Permalink

    For some reason, the authors have failed to address the important issue of teleconnections between their coral calcification data and NH climate. In particular, using the technical language preferred by the Community, the inverse of the 2004-2005 coral calcification spike is “remarkably similar” to the 2004-2005 spike in Emanual’s Atlantic hurricane PDI. We’ve had difficulty locating proxies for Atlantic hurricane activity and it looks like we may finally have found one. I recommend to the authors that they forthwith join forces with Mann and Rutherford for the purposes of carrying out a RegEM calculation combining Atlantic hurricanes and Great Barrier Reef coral calcification, as it is likely that this will obtain a skilful reconstruction of Atlantic hurricane PDI. (In order to encourage the authors in this long overdue study, I will waive any obligation on their part to acknowledge that they got the idea from Climate Audit.)

    Umm … Ouch!

    Not related to anything in particular, but I was wondering how much time you spend on these studies actually waiting for calculations to finish. You made a comment (almost in passing), in the previous thread I think, that you didn’t have enough computing power to run some calculations. If computing power is an issue with some of this stuff, would some sort of distributed computing setup be of some help? Something like SETI@Home? I don’t know a whole lot about the math behind what you’re doing (in spite of the BA I hold in Math and Physics from a couple of decades ago!), but maybe there’s something we can do to distribute the load on some of the more computationally intensive work being done.

    Just a thought. From what I can tell, quite a bit of time is spent simply digging through the data itself, and parsing what the claims are, so the computation time may be a distant secondary consideration.

    Keep up the good work. I’ve referred several AGW advocates to your site for both your summaries and some of the specific take-downs that you’ve penned. You have opened some eyes. Thanks for the work you do.

  15. Steve McIntyre
    Posted Jun 4, 2009 at 7:33 AM | Permalink

    I noticed that, while I’ve mentioned using lmer to make tree ring chronologies, I don’t seem to have posted up a method before.

    I can replicate “conventional” tree ring chronologies almost exactly using nlme – see http://www.climateaudit.org/?p=2214. RCS is actually an even simpler model than the “conventional” method as it can be modeled with a simple nls model. Benchmarking RCS is not easy since, to my knowledge (and this was the case about a year ago), there are no sites where a dendro has archived both his measurements and his RCS chronology.

    Anyway, here’s how you can do a tree ring chronology using lmer – a program that I happen to like. I’ve loaded Gaspe data as an example – I had this on hand for experiments because it was a site that interested me for MBH, though I suspect the differences between a mixed effects chronology and a conventional chronology are greater for this dataset than most cases.

    I’ve placed the measurement data and official chronology online as organized R objects. So start by loading the package containing lmer (lme4) and loading the data – you have to install lme4 first.

    library(lme4)
    download.file("http://data.climateaudit.org/data/tree/cana036.rwl.tab","temp.dat",mode="wb"); load("temp.dat") #loads the measurement data frame "tree"
    dim(tree) #11919 4
    download.file("http://data.climateaudit.org/data/tree/cana036.crn.tab","temp.dat",mode="wb"); load("temp.dat") #loads the official chronology "chron.crn"
    tsp(chron.crn) # 1404 1982

    In order to use lmer to make tree ring chronologies, you have to make a new column in which the year is a factor – otherwise the program treats the year as a number.

    tree$yearf <- factor(tree$year) #set up for lmer

    Here is a form of mixed effects chronology in which the aging factor is a “fixed effect”, while the average ring width between cores is a “random effect”. The latter is a way of standardizing ring widths. The lmer model is executed as follows:

    fm0 <- lmer(rw~ exp(-age)+(1|id)+(1|yearf), data=tree)

    You can pull a “chronology” from the random effects for yearf as follows:

    chron.lmer0= ts(ranef(fm0)$yearf,start=min(tree$year) )

    Now plot the two series together and you get the following comparison. The mixed effects chronology is recognizably from the same data set, but has a “low frequency” difference. In this case, I’ve done sort of a cross between RCS and a conventional chronology, by using a fixed effect for the negative exponential aging and a random effect for the levels of the individual cores. There are many perms and combinations and it would be nice to spend some time on this.
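    For the plotting step, something along these lines will do (the two-panel layout is just my preference):

    par(mfrow=c(2,1))
    ts.plot(chron.crn) #official conventional chronology (cana036.crn), top panel
    ts.plot(chron.lmer0) #lmer random-effect chronology, bottom panel
    par(mfrow=c(1,1))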

  16. David P
    Posted Jun 4, 2009 at 8:58 AM | Permalink

    snip – too obvious

  17. Steve McIntyre
    Posted Jun 4, 2009 at 11:52 AM | Permalink

    The MBH98 SI is here. Those who disbelieve that Mann used GBR corals as a proxy for NH temperature can verify for themselves. Inquiring minds want to know: if corals can teleconnect to NH temperature, why can’t they teleconnect to Atlantic hurricanes? If teleconnectors cavil at the latter, why? I’d sure like to find some teleconnection that they agree to exclude so that teleconnection standards can be applied to other proxies.

    • Gerald Machnee
      Posted Jun 4, 2009 at 10:17 PM | Permalink

      Re: Steve McIntyre (#32),

      The MBH98 SI is here. Those who disbelieve that Mann used GBR corals as a proxy for NH temperature can verify for themselves.

      That is the line where it says “Lough, personal communication”.

  18. Robnet Kerns
    Posted Jun 4, 2009 at 12:51 PM | Permalink

    I’ve been playing a little with their data in R.

    It really looks like their entire result is an artifact of going from ~25 sites for n hundred years, to about 10 for the 1990’s, to two (2). Two sites for the most important years of the study! Did they provide any rationale for this, other than cherry-picking the data? In my opinion the study is not valid past 1985 or so, and this notwithstanding the obvious disconnect between the smoothed and raw plots.

    Fortunately, it really looks like calcification has been increasing overall for the last 400 years, which means the GBR is in good shape.

    • Steve McIntyre
      Posted Jun 4, 2009 at 1:05 PM | Permalink

      Re: Robnet Kerns (#33),

      they probably didn’t collect data in 2005 from more than one site.

      It’s amazing to see how much more data was collected prior to climate change and IPCC becoming an issue. The collections in the 1980s are very extensive, while the recent collections are sparse. As always, bring the proxies up to date.

      • Geoff Sherrington
        Posted Jun 6, 2009 at 1:01 AM | Permalink

        Re: Steve McIntyre (#35),

        Steve, The Great Barrier Reef was inscribed on the World Heritage list in 1981, just 7 years after the UN Convention started. There was an intense burst of work to make the case, as there were simultaneous proposals for drilling for oil and gas in the vicinity. There have been additional conditions placed on the region over the years, but most people here consider it is locked up now for all time. That might explain the sudden cessation of results about 1980 shown on the graph of Bob D at 17. I’m surprised that coral was allowed to be taken in recent years. That’s very much a no-no.

        Consequently, we locals do not take much scientific or political interest in the area any more, so tall stories can easily fall through the cracks. Objectively, it is one of the modern wonders of the world and we respect it as such.

        However, there is an intermittent stream of scare stories, some of which have been disproven or put in the too-hard basket e.g.

        Outbreaks of crown-of-thorns starfish Acanthaster planci have been a major issue on the Great Barrier Reef and other Indo-Pacific reefs for nearly 40 years. The outbreaks have generated great concern (GHS note: this was before UNPRECEDENTED gained traction) among the community and considerable debate among scientists.

        Click to access COTS_web_Nov2003.pdf

        There has also been consternation about fertilizer run-off from the extensive sugar cane plantations along the coast.

        My more learned colleagues note that the natural rise and fall of the tides, the seasonal changes of temperatures, the abundance of old predator-prey cycles, the ability of some corals to relocate, constancy of climate etc., are factors of far greater impact than a possible tiny rise in SST. There is abundant lack of concern. The Reef is not considered endangered by many who might know. I don’t know. I’m just an onlooker who enjoys the fish. It’s a place to go to get away from alarmist Global Warming talk.

  19. Steve McIntyre
    Posted Jun 4, 2009 at 1:03 PM | Permalink

    I’ve updated the post to show two random effects plots using lmer, including a 6-line script that enables readers, unlike reviewers at science tabloids, to do their very own turnkey lmer analysis of the coral data.

  20. Ryan O
    Posted Jun 4, 2009 at 1:18 PM | Permalink

    Their confidence intervals cannot be correct. By extracting all 2005-2006 samples from only two sites – yet using the number of samples to determine their degrees of freedom – they violate the requirement that the samples be randomly selected from the population. Although I freely admit ignorance of the actual number of coral reefs, I would bet lots and lots of money that there are more than two. If the number of sites is constantly changing, a simple solution would be to define a “sample” at time t as the site average at time t and base the degrees of freedom on the number of sites. The intervals should get drastically larger as the number of sites decreases to two.

  21. Craig Loehle
    Posted Jun 4, 2009 at 2:02 PM | Permalink

    Any time series with smoothing that goes right up to the last year of the data (2005) is suspect unless they use a backward looking average of some sort, which is not the case here.

  22. Willem de Lange
    Posted Jun 4, 2009 at 3:28 PM | Permalink

    Another paper in Science also presents data for calcification rates, although it was primarily reconstructing pH trends for the GBR. The paper was
    Pelejero, C., Calvo, E., McCulloch, M.T., Marshall, J.F., et al., 2005. Preindustrial to Modern Interdecadal Variability in Coral Reef pH. Science, 309(5744): 2204-2207.

    The second figure from their paper shows that pH (and to a degree calcification) varies over time, and they linked it to the Pacific Decadal Oscillation though changes in wind. There are also changes in rainfall and storminess, and hence flood frequency (viz. Nott, J., Haig, J., Neil, H. and Gillieson, D., 2007. Greater frequency variability of landfalling tropical cyclones at centennial compared to seasonal and decadal scales. Earth and Planetary Science Letters, 255(3-4): 367-372.).

    It would seem premature to claim there was a trend with such limited data.

  23. KevinUK
    Posted Jun 4, 2009 at 3:35 PM | Permalink

    Steve,

    I think it’s time for another Starbucks challenge. Anyone up for donations to fund a trip for Steve to Queensland?

    As JEB at Numberwatch always points out, if you are a pharmaceutical company running a trial of your new super (profitable) drug, it’s always best to terminate your trial as soon as you get the result you want. As climate scientists already know, there is little point in updating the proxies because they already have the result they want. Far better, as management consultants throughout the world know, to just re-hash the same old numbers with further extrapolation and additional, more alarming spin.

    KevinUK


  25. Robnet Kerns
    Posted Jun 4, 2009 at 4:03 PM | Permalink

    It should be “Unprecedented” in the past 40 Years.

    Only two sites compose the 2005 data points, MYR (1667-2005) and WHI (1950-2005). If you plot the calcification averages of MYR and WHI separately from the rest of the data it is clear that 1) both sites have consistently lower calcification averages than the rest of the data used; and 2) the 2005 averages for these two sites are maybe the lowest since 1965 or so, but not even close to their lowest historic calcification averages.

    For example, nearly all of MYR from 1950 – 1970 had lower average calcification measurements than 2005. Clearly the inverted hockey stick is the result of using measurements from only these two sites with historically low calcification relative to the averaged measurements of 10-20 sites.

    Unprecedented, indeed.

  26. Steve McIntyre
    Posted Jun 4, 2009 at 5:04 PM | Permalink

    I’d already noted that one site constitutes nearly all the 2005 measurements. The 2004 values were also low and any sensible data analyst would cross-examine 2004. But calcification in 2000-2003 wasn’t especially low. And one year (2004) does not justify the claims of the paper.

    It’s pretty amazing even by Community standards.

  27. Louis Hissink
    Posted Jun 5, 2009 at 5:59 AM | Permalink

    Steve,

    original data – you collated it as a tab delim but saved it as a csv.

    Ahem, tab delims are not csv’s.

    :-0

    • Steve McIntyre
      Posted Jun 5, 2009 at 6:35 AM | Permalink

      Re: Louis Hissink (#45),

      I use tab separated files. R doesn’t care what you call a file. Open Office, which I occasionally use, will open tab-separated csv’s. My Excel won’t work anymore and I don’t have a reason to get it working.

      • Kenneth Fritsch
        Posted Jun 5, 2009 at 2:46 PM | Permalink

        Re: Steve McIntyre (#46),

        I use tab separated files. R doesn’t care what you call a file. Open Office, which I occasionally use, will open tab-separated csv’s. My Excel won’t work anymore and I don’t have a reason to get it working.

        Abandoned like a scorned lover or maybe you never had the feelings some of us had for Excel before our enlightenments. I was going to ask if you ever use Excel anymore and it looks like I got my answer.

        I have read online comments about there being a place for Excel to be occasionally used by those who are experts in using R. I asked a question some time ago about commands in R that can manipulate data structures as efficiently as Pivot Table does in Excel. Obviously I do not expect a GUI wizard, but just a few commands that do what Pivot Table does. Obviously the stack and unstack commands can do some of the simple stuff, but I was attempting to use melt and cast (reshape) to manipulate data structures and never got it to do what I wanted.
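        For the record, the nearest base-R analogue to a simple two-way Pivot Table is tapply with a list of index factors; a minimal sketch on the coral collation, using only its year and id columns:

        Data=read.csv("http://data.climateaudit.org/data/coral/GBR.csv",sep="\t",header=TRUE)
        #two-way "pivot": mean calcification by year (rows) and core id (columns)
        pivot= tapply(Data$calcification, list(Data$year, Data$id), mean, na.rm=TRUE)
        pivot[1:5,1:5] #corner of the year-by-core table; NA where a core has no value in a year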

  28. Mark T
    Posted Jun 5, 2009 at 12:34 PM | Permalink

    Excel doesn’t care what the file extension is either, other than as a first guess as to what type of file it may be. CSV technically means “comma separated values,” but any decent spreadsheet program can easily figure out what the true delimiter is, and if not, they typically offer the option to pick a delimiter. As long as the delimiter is consistent, MATLAB will pull them in as one long vector unless there are semi-colons, which then requires that there are the same number of values in-between each semi-colon.

    Yay, formatting fun.

    Mark

  29. a jones
    Posted Jun 5, 2009 at 1:18 PM | Permalink

    As I recall the Idsos at CO2 Science did a rather interesting critique of this paper some time ago. I assume you can still get it on their web site.

    Kindest Regards

    • Geoff
      Posted Aug 23, 2009 at 12:04 AM | Permalink

      Re: a jones (#48), You are right. Their comment on this study (which is basically just advising caution about over-extrapolation of the results, which seems astute in view of the issues that have been raised in this thread) is here. They have a documented commentary on coral growth and recovery which is here.

      The paper they cite on rapid recovery of corals in the GBR (Doom and Boom on a Resilient Reef: Climate Change, Algal Overgrowth and Coral Recovery, Diaz-Pulido et al., 2009, PLOS One, here) concludes:

      In summary, unusually rapid coral recovery in the Keppel Islands apparently stemmed from synergistic effects of factors not previously recognized as important to resilience. These factors included robust tissue regeneration, high competitive ability of the corals and a seasonal dieback in the monospecific seaweed bloom, all against a backdrop of an effective marine protected area system and moderate water quality. Understanding the variability in mechanisms underlying resilience is critical for reef management under climate change. Settlement and recruitment of new corals requires years to decades to re-establish abundant corals, whereas recovery in the Keppel Islands took less than one year. Frequent, large-scale damage may mean that reefs able to rapidly recover abundant corals may serve as key refugia, or sources of larvae for reef recovery at broader scales. Diversity in processes may well be critical to the overall resilience and persistence of coral reef ecosystems globally.

      The smoothing and endpoint issues certainly need to be resolved, and hopefully will be when the code is released, but new data seems to also indicate there are processes not yet well understood, and over-extrapolation from short time series has pitfalls.

  30. Posted Jun 5, 2009 at 2:38 PM | Permalink

    What are the light and dark gray bands supposed to represent? In the middle they look like maybe 50% and 95% CI’s. It’s natural for a cubic spline CI to “trumpet” out at the ends as the inner one does. But then why does the outer one meet or even approach it at the ends? The ratio of their widths should be constant.

    The first and last cubic segment of an unconstrained cubic spline has to be trying to head for plus or minus infinity if extended outside the data set, and often will start to do this even before the end of the data, which looks somewhat like what’s happening. I’ve seen similar effects many times in spline curvefits of the term structure of interest rates, though the standard errors are usually a clue not to get excited about the erratic behavior.

    • Kenneth Fritsch
      Posted Jun 5, 2009 at 2:51 PM | Permalink

      Re: Hu McCulloch (#49),

      I’ve seen similar effects many times in spline curvefits of the term structure of interest rates, though the standard errors are usually a clue not to get excited about the erratic behavior.

      After the current US Fed actions, those spline curvefits to infinity might not be an artifact.

  31. Louis Hissink
    Posted Jun 5, 2009 at 5:47 PM | Permalink

    #46
    Steve

    I managed to get the data in by renaming it *.txt – everything then read in ok. Excel then also worked. Mind you the data was read in initially but as a rather long string which ended up in one cell.

    Minor issue not worth further discussion.

    Much more fun watching the ramifications of Billiton and Rio joining forces in the iron ore business. The Chinese are not amused.

  32. Ed Snack
    Posted Jun 5, 2009 at 9:04 PM | Permalink

    Louis, “csv” stands for “character separated file”, not “comma separated file”, so tab delimited files are indeed “csv’s”. Note that the standard Unix delimiter is the “|” symbol, which is usually a very good choice as “|” is very rarely used otherwise. I have seen the chaos a carelessly placed comma can have in downloaded data if one is foolish enough to use commas as delimiters.

  33. Ed Snack
    Posted Jun 6, 2009 at 10:43 PM | Permalink

    Paul, mea culpa, I’m obviously wrong. I’m from an IBM Unix background where “csv” was always using the “|” character in my experience. I first met commas in DOS programs and assumed it was a DOS foible. I see I have been misled. Still seems a daft idea, commas are all too common in normal data.

  34. Posted Jun 8, 2009 at 7:59 AM | Permalink

    There are a couple of typos in your script, which I realise were probably left in deliberately to check whether any of your readers actually tried it out.

    library(lme4)
    Data=read.csv("http://data.climateaudit.org/data/coral/GBR.csv",sep="\t",header=TRUE); dim(Data)
    Data=Data[Data$year<2006,] #removes a singleton
    Data$yearf=factor(Data$year)
    (fm1 <- lmer(calcification~ lat+ exp(-age)+(1|yearf) ,data=Data))
    chron1= ts(ranef(fm1)$yearf,start=min(Data$year) )
    ts.plot(chron1)

    Here is a corrected version. Is there a prize?

  35. Peter Ridd
    Posted Jun 9, 2009 at 10:30 PM | Permalink

    I am happy to be identified as the one who sent Steve the data, which was originally sent to me by De’ath. We wrote a comment to Science which was rejected, as expected. Steve has both our comment (short and long version) and De’ath’s reply so maybe these should go on this blog.

    I think that the main problem is that the 2004/5 data has some measurement problems with the last ring of the cores.

    I am presently trying to get the original R code from De’ath but he has so far ignored my email. It is possible he is away at the moment. Fortunately there are some good people down at AIMS and I have no doubt we will eventually get the code.

    Peter Ridd
    Physics
    JCU Townsville

  36. stephen richards
    Posted Aug 24, 2009 at 4:04 AM | Permalink

    C.S.V

    Comma.separated.Values

8 Trackbacks

  1. By Dagens länkar | The Climate Scam on Jun 4, 2009 at 1:53 AM

    […] "Unprecedented" in the past 153 Years- Steve McIntyre om studien som visar klimatförändringens effekt på Stora Barriärrevet. Lite extra kul är att en av författarna heter De’ath. […]

  2. […] “Unprecedented” in the past 153 Years, by Steve McIntyre on June 3rd, 2009 http://www.climateaudit.org/?p=6189   4. The Ocean Acidification Fiction, Volume 12, Number 22: 3 June 2009 […]

  3. By "Unprecedented" in the past 153 Years on Jun 12, 2009 at 9:24 PM

    […] scary”. And so on. An occasional Australian reader brought the data set to my attention t click for more […]

  4. […] and code should eliminate some of the work we see in climate science. The coral reef paper on CA HERE is likely a good example of something which may not make the […]

  5. By A Little fun with RC « the Air Vent on Sep 3, 2009 at 9:22 AM

    […] sanctioned methods for throwing out data or manipulated filter projections to make things appear worse than we thought.  So he does have a good point.  So today I’m forced to come clean with the equations I […]

  6. […] back to today’s story. A few day’s ago, I posted on coral calcification, an issue to which I had been referred by Peter Ridd of James Cook University in Australia. […]

  7. […] Coral Calcification due to Global Warming Global warming researcher Steven McIntyre has published an article on his website ClimateAudit.org showing the results of his examination of a widely publicized study by Glenn De’ath et al which […]

  8. […] to grow their shells. A study of corals at the Great Barrier Reef shows that shell calcium growth rates today are about 25 percent higher than 300‒400 years ago when both ocean temperatures and levels […]