HadCRU3 versus GISS

I made a subset of HadCRU3 to cover the continental U.S. and compared it to the USHCN-2000 version.

Here are the cells that I included in the U.S. subset (a total of 38 gridcells).
[Figure uhcnh8.gif: map of the gridcells included in the U.S. subset]

Next, here is a comparison of the USHCN 2000 version (digital data downloaded from John Daly, but matched to the figure in a contemporary press release) to CRU. The CRU data come as monthly values; I made an annual average and then zeroed it on 1951-1980 to match the GISS 1951-1980 centering. (In passing, CRU is supposed to use 1961-1990 centering, but I couldn't replicate this at all!) The blue histogram shows the difference between HadCRU3 and GISS 2000. The maximum negative delta, in the 1930s, is -0.35 deg C and the maximum positive delta is 0.28 deg C. In HadCRU3, the record year remains 1934.
[Figure uhcnh9.gif: HadCRU3 U.S. subset and USHCN 2000, with the blue histogram showing their difference]
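
As a minimal sketch of the re-centering and difference step (the full script is in comment 1 below), assuming the CRU annual average annual (a ts starting in 1850) and the Daly USHCN 2000 series v00 (a ts starting in 1881) are already in memory:

# zero the CRU annual average on 1951-1980 to match the GISS centering
annual <- annual - mean(annual[(1951:1980)-1849])
# difference of HadCRU3 minus USHCN 2000 over the common 1881-1999 period
delta <- annual[(1881:1999)-1849] - v00
range(delta)   # roughly -0.35 to 0.28 deg C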

So as between CRU and Hansen’s crowd, it looks like Hansen’s crowd is adjusting things more. It would be nice to see what CRU is doing differently. But hey, they’re the Team and I guess we’ll never know.

46 Comments

  1. Steve McIntyre
    Posted Feb 17, 2007 at 10:47 AM | Permalink

    Here’s the code to produce these graphics. Requires prior downloading of HadCRU3 discussed elsewhere.
    ##USA MASK
    library(fields)
    #ustest: 70 gridcell centres, column 1 latitude, column 2 longitude
    ustest<-cbind( c(t( array(rep( seq(27.5,47.5,5),14) ,dim=c(5,14)) )), rep( seq(-127.5,-62.5,5),5) )

    #map each gridcell centre to the HadCRUT3 longitude index i and latitude index j
    i<- 37+(ustest[,2]-2.5)/5;j<- 19+(ustest[,1]-2.5)/5

    plot(xy.coords(ustest[,2],ustest[,1]),pch=".",type="p",ylim=c(25,52))
    world(xlim=c(-120,-40),ylim=c(30,55),add=TRUE)
    US(add=TRUE)
    temp<-rep(TRUE,nrow(ustest))

    temp[c(1:6,8:9,11:14)]<-FALSE
    temp[14+c(1:2,11:14)]<-FALSE
    temp[28+c(1,12:14)]<-FALSE
    temp[42+c(1,13:14)]<-FALSE
    temp[56+c(1,9:14)]<-FALSE
    points(xy.coords(ustest[temp,2],ustest[temp,1]),col="red",pch=19)
    sum(temp)

    ##JOHN DALY VERSION FROM 2000
    v00<-read.table("http://www.john-daly.com/usatemps.006",skip=7,fill=TRUE,nrows=1999-1880)
    v00<-ts(v00[,2],start=1881)
    mean(v00[(1951:1980)-1880]) #[1] -4.419479e-18
    #so this is zero-ed on 1951-1980; it's also in deg C

    ##LOAD DATA SET
    library(ncdf)
    url<-"d:/climate/data/jones"
    loc<-file.path(url,"HadCRUT3.nc")
    v<-open.ncdf(loc)
    instr <- get.var.ncdf( v, v$var[[1]]) # 1856 2005
    dim(instr)# [1] 72 36 1883
    #this is organized in 72 longitudes from -177.5 to 177.5 and 36 latitudes from -87.5 to 87.5

    K<-sum(temp)
    test<- array(NA,dim=c(1883,K))
    for (k in 1:K) {
    test[,k]<-instr[i[temp][k],j[temp][k],]
    }
    dim(test)# 1883 38
    us.cru<-apply(test,1,mean,na.rm=TRUE)
    us.cru<- array(c(us.cru,NA),dim=c(12,1884/12) )
    annual<-ts(apply(us.cru,2,mean,na.rm=TRUE) ,start=1850)
    mean(annual[(1951:1980)-1849]) #[1] -0.05615611
    mean(annual[(1961:1990)-1849]) #[1] -0.05989007

    annual<-annual-mean(annual[(1951:1980)-1849])
    ts.plot(annual)
    lines(1881:1999,v00,col="red")
    lines(1881:1999,annual[(1881:1999)-1849]-v00,type="h",lwd=2,col="blue")

    range(annual[(1881:1999)-1849]-v00) #[1] -0.3513508 0.2687864

  2. Barclay E MacDonald
    Posted Feb 17, 2007 at 11:08 AM | Permalink

    So I'm looking at the maximum negative and positive variations and see that they total 0.63 deg C, which is the total temperature increase alleged for the 20th century. Is it reasonable to draw any inferences or conclusions from this?

  3. John G. Bell
    Posted Feb 17, 2007 at 2:30 PM | Permalink

    It would be nice if we knew the adjustments past 2000 and how much of the last 10 years' warming was their product.

  4. TAC
    Posted Feb 17, 2007 at 5:09 PM | Permalink

    SteveM,

    Do you know if anyone has considered filing a “Request for Correction” under the Data Quality Act?

  5. John Lang
    Posted Feb 17, 2007 at 6:38 PM | Permalink

    It is interesting that Hansen’s GCM model failed to replicate the temperature trends even after he adjusted all the historical records by the most he thought he could get away with.

    Does it seem appropriate that the star who runs the models (that don’t work even though he stakes his star reputation on them) is also the guy who is adjusting the historical temperature records?

  6. Willis Eschenbach
    Posted Feb 18, 2007 at 2:57 AM | Permalink

    Well, I’m on holiday in Fiji, followed by two weeks work in Solomon Islands, so I’ll be in and out. Just wanted to note that I filed a Freedom of Information request for Jones et al.’s HadCRUT data. My request said:

    I would like to obtain a list of the meteorological stations used in the preparation of the HadCRUT3 global temperature average, and the raw data for those stations. I cannot find it anywhere on the web. The lead author for the temperature average is Dr. Phil Jones of the Climate Research Unit.

    Many thanks,

    w.

    I just received a reply saying:

    Dear Mr. Eschenbach

    FREEDOM OF INFORMATION ACT 2000 – INFORMATION REQUEST (FOI_07-04)

    Your request for information received on 28 September has now been considered and I can report that the information requested is available on non-UEA websites as detailed below.

    The Global Historical Climatology Network (GHCN-Monthly) page within US National Climate Data Centre website provides one of the two US versions of the global dataset and includes raw station data. This site is at:
    http://www.ncdc.noaa.gov/oa/climate/ghcn-monthly/index.php

    This page is where you can get one of the two US versions of the global dataset, and it appears that the raw station data can be obtained from this site.

    Datasets named ds564.0 and ds570.0 can be found at The Climate & Global Dynamics Division (CGD) page of the Earth and Sun Systems Laboratory (ESSL) at the National Center for Atmospheric Research (NCAR) site at: http://www.cgd.ucar.edu/cas/tn404/

    Between them, these two datasets have the data which the UEA Climate Research Unit (CRU) uses to derive the HadCRUT3 analysis. The latter, NCAR site holds the raw station data (including temperature, but other variables as well). The GHCN would give their set of station data (with adjustments for all the numerous problems).

    They both have a lot more data than the CRU have (in simple station number counts), but the extra are almost entirely within the USA. We have sent all our data to GHCN, so they do, in fact, possess all our data.

    In accordance with S. 17 of the Freedom of Information Act 2000 this letter acts as a Refusal Notice, and the reasons for exemption are as stated below

    Exemption Reason

    s. 21, Information accessible to applicant via other means Some information is publicly available on external websites

    If you have a complaint about the handling of your enquiry then please contact me at:
    University of East Anglia
    Norwich
    NR4 7TJ
    Telephone 0160 393 523
    E-mail foi@uea.ac.uk

    You also have a right of appeal to the Information Commissioner at: Information Commissioner’s Office
    Wycliffe House
    Water Lane
    Wilmslow
    Cheshire
    SK9 5AF
    Telephone: 01625 545 700
    http://www.ico.gov.uk

    Yours sincerely

    David Palmer
    Information Policy Officer
    University of East Anglia

    I’m going to reply, pursuing the list of sites that they actually use … The logic seems a bit tortured, they say they are using the GHCN data, then they say they have sent their data to GHCN … so which is it? Their data, or GHCN’s?

    More to follow,

    w.

  7. Posted Feb 18, 2007 at 3:18 AM | Permalink

    I'm not sure where to put this question to see if anyone can answer it. I put it in the comments at realclimate.org too. Please forgive me if it's off topic on this thread, but it is at least centered on satellite data.

    I was browsing the NASA interactive satellite temperature data for the troposphere, which stretches from 1979 to 2003. It has a global map that's colored in shades of red for heating and blue for cooling. Flipping through the time sequence it's obvious that almost all the heating anomalies are in the snow-covered far north. South of Canada down to Antarctica isn't really heating at all. Moreover, there's a graph of the average temperature anomalies of all areas (below the world map) and it shows that the net of heating and cooling is just about zero. I was wondering what could account for this pattern of heating and cooling, and it occurred to me that if the albedo of the snow cover in the far north was declining that would do it. So I looked around and dug up a study of snow albedo that appeared in the Journal of the Atmospheric Sciences, Volume 37, August 1980, which confirmed that carbon soot from manmade sources (including forest fires) migrates thousands of miles and accumulates on permanent snow cover, causing melting and temperature increases. The Antarctic is relatively free of soot buildup but the Arctic has been well contaminated.

    This explains the heating and cooling patterns quite well. How does CO2 greenhouse heating explain these patterns, and why is the global average temperature not really increasing if a growing CO2 greenhouse situation is responsible?

  8. MarkR
    Posted Feb 18, 2007 at 5:28 AM | Permalink

    #4 Wow

    Finally, “objectivity” involves both the presentation and substance of information. Id. at 8459. First, in order for information to be considered objective, it must be presented in an accurate, clear, complete, and unbiased manner. Id. The agency must present the information in the proper context and identify the source (to the extent possible consistent with confidentiality protections) along with the supporting data or models so that the public can assess for itself whether there may be some reason to question the objectivity of the sources. Id. Second, the substance of information disseminated must be accurate, reliable and unbiased. Id. Agencies must identify the sources of the disseminated information, the methods used to produce it, and provide full, accurate, and transparent documentation. 67 F.R. at 8460. Sound statistical research methods must be used to generate original and supporting data and develop analytical results. Id. at 8459. Data subjected to formal, independent, external peer review, is presumed to be of acceptable objectivity, although such a presumption is rebuttable. Id.

    Information that agencies deem to meet OMB’s definition of “influential scientific, financial, or statistical information” also must be reproducible to demonstrate its objectivity. “Influential scientific, financial or statistical information” has a clear and substantial impact on important public policies or important private sector decisions. 67 F.R. at 8460. Agencies that disseminate such information must ensure a high degree of transparency about the data and methods to facilitate its “reproducibility” by qualified third parties. Id. “Reproducibility” means that the information is capable of being substantially reproduced, subject to an acceptable degree of imprecision. Id.

    Surely the Hockey Stick, The GCM’s, the recasting of Temperature data, anything which is published by the US Government bodies concerned, and that includes Federal and State bodies (maybe State run Universities etc), and NASA etc, is subject to these availability of data, transparency, and calculation reproduceability, and lack of bias rule, and all the info and calculation, availability, and corrections can be enforced under this act.

  9. Steve McIntyre
    Posted Feb 18, 2007 at 7:51 AM | Permalink

    Willis:

    In a litigation, any lawyer refusing to produce documents would be eviscerated by a judge if he used the CRU-UEA argument:

    Exemption Reason: s. 21, Information accessible to applicant via other means.
    Some information is publicly available on external websites.

    He says that the lists of sites differ and the versions differ. You should definitely appeal.

    There’s a nugget of information that’s worth pursuing by an American. The CRU official says that Jones’ CRU collection was sent to NCAR, which is a U.S. federal institution. I think that it would make sense to start a parallel process to the one in England, by making a FOI request to NCAR in the U.S. for the data as supplied by CRU.

    It would also make sense while these files are open to try to get at the HadCRUT2 and even HAdCRUT1 versions, since I suspect that these will prove relevant to following their adjustments.

  10. Willis Eschenbach
    Posted Feb 18, 2007 at 8:01 AM | Permalink

    Steve, good points all. I’ll follow up.

    Thanks,

    w.

  11. Ron Cram
    Posted Feb 18, 2007 at 9:09 AM | Permalink

    re: 7

    Dave,

    Thank you for the link. This discussion should probably be on the “Unthreaded” thread. SteveM or JohnA may decide to move it over there. This is an interesting topic as I have read Syun Akasofu make the same point that all of the strong warming is in the Arctic north. However, he did not provide any links or sources in my reading of his comments. Thanks again for the link. I will be watching for further discussion.

  12. Posted Feb 18, 2007 at 10:26 AM | Permalink

    Mistake Found in IPCC Report

    Compare Hansen radiative forcing factors to IPCC 2007 radiative forcing factors (page 4, figure spm-2).

    You'll find all the forcing factors in Hansen fall within the ranges given in IPCC, with the exception of black carbon. In Hansen, black carbon (BC) is given a forcing factor of 0.8 (a major factor, second only to CO2) but in the IPCC report it's given a factor of 0.1, which makes it trivial.

    It appears that someone made a mistake in transcribing data from Hansen to IPCC.

    —Please move this wherever appropriate and email me or something so I can return to it. I thought this was a pretty good find. A hundred collaborators on the IPCC report and they couldn't get the Hansen data correctly transcribed into their report. That much difference in forcing factors makes a huge difference in all the conclusions that follow from it!

  13. TJ Overton
    Posted Feb 18, 2007 at 12:09 PM | Permalink

    Dave, it looks like both the Hansen 2005 and IPCC graphs show 0.1 for snow albedo.
    What the IPCC did was leave out the biomass burning and fossil fuels bit of the BC bar. This is partially offset by their inclusion of halocarbons.
    You’ll also notice that they significantly lowered the values for aerosol effects, which is rather nice for the IPCC since those are negative feedbacks.

  14. Paul Linsay
    Posted Feb 18, 2007 at 12:15 PM | Permalink

    Warwick Hughes had this interesting observation about close agreement between HadCRUT2 and MSU data over the USA. Does it still apply to HadCRU3?

    link

  15. Posted Feb 18, 2007 at 2:03 PM | Permalink

    Willis,
    I am amazed at the obfuscating reply from David Palmer, Information Policy Officer, University of East Anglia.
    To attempt to confuse the issue by saying in effect, “go get the GHCN data”.
    To discover what Jones et al have done you have to have the Jones et al data, not GHCN or GISS or NCAR or anyone else.

    My 2001 study “USSR High Magnitude Climate Warming Anomalies 1901-1996 ” at;
    http://www.warwickhughes.com/climate/ussr1.htm
    follow links to these four regions;

    Tarko Sale, Khanty-Mansi Region of Siberia
    Far Eastern Siberia, Sea of Okhotsk
    Lake Baikal Region
    Eastern Kazakhstan – Lake Balkhash

    where many graphics demonstrate considerable differences between station data versions used by Jones, GISS, GHCN.

    It is utterly laughable and surreal to think anybody can replicate what Jones et al have done with station data by studying GHCN station data.

    The best and quickest way I know now to get some understanding of how Jones et al treated station data is to read the 1988 Fred Wood critique and the Wigley and Jones reply, I have both papers scanned at;
    http://www.warwickhughes.com/cru86/wood.htm

    You can also read my Table where I list;
    [1] the 9 points where W & J say Wood was in error,
    [2] what I was able to find that Wood actually said and then,
    [3] a column of my comments to each of the 9 W & J points.

  16. Steve McIntyre
    Posted Feb 18, 2007 at 2:07 PM | Permalink

    Warwick, have you posted up the Jones et al 1990 on UHI?

  17. bruce
    Posted Feb 18, 2007 at 2:52 PM | Permalink

    Re #8: Good find. Perhaps there is a basis for bringing charges relating to breaches of the Data Quality Act. Any lawyers out there who can offer a view? I would have thought that there is a strong prima facie case that very serious breaches have been made.

  18. Posted Feb 18, 2007 at 3:43 PM | Permalink

    Steve,
    RE 16
    I have a rebuttal here;
    http://www.warwickhughes.com/papers/90lettnat.htm
    to the “Jones et al 1990 Letter to Nature:”,
    Jones PD, Groisman PYa, Coughlan M, Plummer N, Wangl WC, Karl TR (1990) Assessment of urbanization effects in time series of surface air temperatures over land. Nature 347:169-172
    A key brick in the IPCC wall.
    I will see if I can hunt down my copy and scan it.

  19. Posted Feb 18, 2007 at 6:31 PM | Permalink

    Re 16 Steve,
    The Jones et al 1990 Letter to Nature is now scanned (only 4 pages) at;
    http://www.warwickhughes.com/papers/90lettnat.htm

  20. Posted Feb 18, 2007 at 6:37 PM | Permalink

    Overton

    Dave, it looks like both the Hansen 2005 and IPCC graphs show 0.1 for snow albedo.
    What the IPCC did was leave out the biomass burning and fossil fuels bit of the BC bar. This is partially offset by their inclusion of halocarbons.

    That's not the case. It's the cause. If you read the research behind the Hansen BC data you'll find that biomass and fossil fuel burning are the producers of black carbon. Moreover, the latest research by NASA concludes that up to or over 25% of all global warming is due to black carbon changing planetary albedo. This is consistent with a forcing of 0.8, not 0.1.

    It's not partially made up for by halocarbons. Halocarbons are labeled "CFC" in Hansen and labeled "Halocarbon" in IPCC. The forcing level for CFC in Hansen is exactly the same as for halocarbons in IPCC. CFCs ARE halocarbons, which you can verify by looking up the terms.

    Either someone screwed up or the BC data is purposely shown without reflecting the best data available on its forcing. Either way, the IPCC report has a very serious flaw that requires explaining.

  21. george h.
    Posted Feb 18, 2007 at 9:58 PM | Permalink

    re #19

    Warwick,

    In one of your papers cited by the Fraser Institute– http://www.fraserinstitute.ca/admin/books/files/ScienceIsntSettled.pdf –you compared the data from some 27 rural Australian stations 1880-1890 with data from a number of Australian cities over the same period. The results were amazing. Are you, or is anyone else, aware of any similar comparisons done with US temperature data?

  22. Posted Feb 19, 2007 at 3:49 AM | Permalink

    Re 21 george h,
    Those graphics in the Fraser Institute 2004 paper are from averaged BoM station data, compiled in 1991 after I read the Jones et al 1986 documentation.
    Jones PD , Raper SCB, Cherry BSG, Goodess CM, Wigley TML, (1986c) TR027 A Grid Point Surface Air Temperature Data Set for the Southern Hemisphere. Office of Energy Research , Carbon Dioxide Research Division, US Department of Energy. Under Contract No. DE-ACO2-79EV10098

    To put it mildly, I was surprised at the extent to which they used urban data and excluded many rural stations.
    I am not aware of any similar USA study but there were several late 1980's US papers that examined "rural-urban pairs", such as; G. Kukla, J. Gavin, and T.R. Karl (1986) Urban Warming, J. App. Met., Vol 25, pp. 1265-1270
    and;
    Karl TR, Diaz HF, Kukla G, (1988) Urbanization: its detection and effect in the United States climate record J. Climate 1:1099-1123
    Both available online at BAMS I think.
    http://ams.allenpress.com/perlserv/?request=get-archive

    Wood 1988 goes into these issues;
    http://www.warwickhughes.com/cru86/wood.htm
    and points out how Jones et al 1986 used ~20 urban stations from the station pairs of Kukla, Gavin, and Karl (1986).

  23. beng
    Posted Feb 19, 2007 at 9:48 AM | Permalink

    RE 7: DaveScot says:

    Flipping through the time sequence it’s obvious that almost all the heating anomalies are in the snow covered far north.

    This makes sense if the dominant forcing in the local areas is ice/albedo effects & not GHGs. IMHO, ice/albedo forcing dominates GHGs (which are comprised mostly of water vapor effects & supplemented only to a limited extent by CO2). Snow/albedo effects are dominating the radiative changes right now in my front lawn (low daytime highs compared to snow-free conditions).

  24. Steve McIntyre
    Posted Feb 19, 2007 at 2:40 PM | Permalink

    #19. Warwick, you gave the link to your rebuttal a second time. What’s the link to the scanned Jones 1990 paper?

  25. Bob K
    Posted Feb 19, 2007 at 2:55 PM | Permalink

    Steve,

    There are 4 pages linked at the top of Warwick’s rebuttal page that go to the scanned pages.

  26. Adrian James
    Posted Feb 20, 2007 at 7:07 AM | Permalink

    re 21

    I once came across this in a similar vein, although I am not sure of its provenance.

    Adrian

  27. Ron Cram
    Posted Feb 20, 2007 at 12:26 PM | Permalink

    re: 20
    Dave,

    I agree with you to a point. The way I understand it, the Hansen image shows "Black Carbon" with both direct (atmospheric) and indirect (snow albedo) effects. The direct effects are labelled "Fossil fuels" and "Biomass burning." It seems to me Hansen is saying that soot/particulate matter in the atmosphere has a warming effect. Whether this is true or untrue, I do not know. The question then is "Is the IPCC saying Hansen was wrong?" If so, what was Hansen's error?

    The other question is why did the IPCC reduce so dramatically Hansen's view of the cooling from aerosols? What research has come out to prove Hansen was wrong on this score? Just for fun – how would the IPCC image look if the cooling had been retained? Giving it an eyeball appraisal, it looks to me like the low end of the error bar on Total Net Anthropogenic might reach zero. What an embarrassment to the IPCC that would be!

  28. Ross McKitrick
    Posted Feb 20, 2007 at 1:35 PM | Permalink

    26: Jim Goodridge was the origin of the graph. However he later expressed the caveat that some of the effect may be due to the fact that the larger cities are nearer the coast and can be affected by coastal upwelling (I think that was the story). So while it’s visually suggestive, there would need to be some other confounding effects controlled for.

  29. Dave Dardinger
    Posted Feb 20, 2007 at 2:17 PM | Permalink

    re: #28 Ross,

    But cities on the coast would also be subject to sea breezes which should be cooler than typical land breezes. In any case, why would this change over time, regardless of whether a city grew in size or not?

  30. jae
    Posted Feb 20, 2007 at 4:51 PM | Permalink

    29: More SUVs. Seriously, population in those cities has increased, and people use energy ever more intensively. Even the people themselves give off heat at about 100 Watts each, if I recall correctly. So a million people = 100 megawatts! Each car produces 25,000 Watts while cruising, and there are a hell of a lot of cars in CA.

  31. jae
    Posted Feb 20, 2007 at 5:23 PM | Permalink

    30 continued: Assuming LA has about 4 MM people, that's 4 megawatts of people heat. Area of LA = 1,291 km^2. So the people forcing factor (PPF) is 4 x 10^8 Watts / 1.29 x 10^9 m^2 = 0.3 Watts/m^2. Using IPCC's estimate of temperature rise for a 4 Watt/m^2 increase in heat (1.5 to 4.5 deg C), the PPF translates into a temperature rise of 0.11 to 0.34 deg C. If you add the cars, energy for buildings, etc., you get a very substantial increase. I could easily see a 1 degree gradual increase over the past 70 years.
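
    As a quick back-of-the-envelope check of these figures (a sketch only, using the inputs stated in this comment; see also the correction in #32 below):

    # inputs as stated in #30/#31 (assumptions, not measurements)
    people <- 4e6                        # LA population
    w_per_person <- 100                  # metabolic heat per person, Watts
    area_m2 <- 1291e6                    # 1,291 km^2 expressed in m^2
    people*w_per_person                  # 4e8 W in total, i.e. 400 megawatts
    ppf <- people*w_per_person/area_m2
    ppf                                  # ~0.31 W/m^2 "people forcing factor"
    ppf/4*c(1.5,4.5)                     # IPCC 1.5-4.5 deg C per ~4 W/m^2: ~0.12 to 0.35 deg C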

  32. Dave Dardinger
    Posted Feb 20, 2007 at 7:06 PM | Permalink

    re: #31 Jae,

    In your first line that should be 400 megawatts of people heat (though you do have the math worked properly, so it's strictly a typo).

  33. Ron Cram
    Posted Feb 21, 2007 at 7:24 AM | Permalink

    re: 30

    Jae,
    Do you have a citation (hopefully a link) for people giving off 100 watts? That is certainly an interesting number.

  34. Dave Dardinger
    Posted Feb 21, 2007 at 8:33 AM | Permalink

    re: #33 Ron,

    Actually you shouldn’t need a citation. You should be able to calculate it quite easily. Here’s a hint. 1 horsepower = .74 kw. Looking farther in my CRC tables I see that a kcal is 1.16 watt hour. So 100 kcal = 116 watt hours. So 100*100*24/116 = 2068 kcal per day. This appears to be a bit larger than what a typical person might consume a day, so I expect it’s closer to 75 watts than 100. But at least it’s in the right order of magnitude. I suppose there’s also the matter of what amount of calories are excreted rather than burned, but again the figure isn’t far off.
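
    A short R check of this conversion (a sketch; 1 kcal taken as 1.162 watt-hours):

    watts <- 100          # assumed continuous metabolic output
    watts*24/1.162        # daily watt-hours divided by Wh per kcal: ~2065 kcal/day
    2000*1.162/24         # working back from a ~2000 kcal/day diet: ~97 W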

  35. Ron Cram
    Posted Feb 21, 2007 at 10:22 AM | Permalink

    re: 34

    Dave,
    Thank you for the explanation. The number does seem reasonable now. However, for my purposes, I still need a citation. I’m here mainly because I hope to find information I can use to explain the science to Wikipedia readers. Wikipedia requires information be verifiable. So, if anyone has a citation that will take everything into account, including excretions, I would appreciate it.

  36. Steve Sadlov
    Posted Feb 21, 2007 at 10:28 AM | Permalink

    RE: #14 – The continuous conurbation of Southern Ventura County-LA-Orange County-Inland Empire has on the order of 16M people.

  37. jae
    Posted Feb 21, 2007 at 10:30 AM | Permalink

    Ron: here’s one reference. Just google human energy.

  38. Ron Cram
    Posted Feb 21, 2007 at 1:42 PM | Permalink

    re: 37

    Jae,
    I found this indicating the average seated relaxed person would give off 104 Watts of heat. This is in line with your post. However, what I would really like to find is a peer-reviewed article that applies this to climate as you have. Another Wikipedia policy is “No Original Research.” I cannot reproduce the calculations and expect other editors to accept it. I will keep looking.

  39. Ron Cram
    Posted Feb 21, 2007 at 4:01 PM | Permalink

    re:37

    Jae,

    I found this exchange on Roger Pielke Sr’s site:

    An amateur's question: Most of what I hear is related to radiation or insulative effects. The 20th century saw a massive increase in fuel usage from industry and cars. Is human heat generation taken into account in the models? Is it significant relative to the overall heat budget?

    Comment by Barry Wise - October 21, 2005 @ 6:01 am

    Barry-this is a really good question. On the local scale (such as in a heavily urbanized area), it can be a significant component on the surface energy budget. Text on pages 417-418 and Table 11-6 in my book Pielke, R.A., Sr., 2002: Mesoscale meteorological modeling. 2nd Edition, Academic Press, San Diego, CA, 676 pp. illustrates the magnitude of this heating.

    Surface heat input averaged over 10km by 10km grids of nearly 30 Watts per meter squared has been reported (by Harrison and McGoldrick, 1981), for example, for London and Birmingham in the United Kingdom, with over 600 Watts per meter squared over 1 kilometer squared areas in Teeside.

    Averaged on larger scales, however, the contribution of this surface heating is very small. Of more importance to the surface heating on these regional and larger scales is the effect of landscape conversion from the natural state, which alters albedo, the partitioning of absorbed solar energy into its sensible and latent turbulent heat fluxes, and the surface emissivity of long-wave radiation.

    Comment by Roger Pielke Sr. - October 21, 2005 @ 7:54 am

    I thought you might find it interesting. Hopefully I can find the text book and quote it for Wikipedia.

  40. jae
    Posted Feb 21, 2007 at 4:20 PM | Permalink

    Interestingly, each person in the USA also releases about another 10,600 Watts through energy use.

    Per capita energy use in the USA is 7,795 kg oil equivalent per year. One kg oil = 43 x 10^6 joules. So per capita energy use = 7,795 * 43 x 10^6 = 335 x 10^9 joules/yr. If one assumes this energy use is spread out evenly throughout the year, it is equivalent to 335 x 10^9 / 3.15 x 10^7 sec/yr = 10,600 Watts.

    So energy use produces 106 times more heat than human metabolism. Multiplying my factors in #31 by 107 gives 32 Watts/m^2 and a hypothetical temperature increase of 11.8 to 36.4 deg C. Of course, this doesn't make sense, since much of the heat is dissipated outside the city, and since IPCC's sensitivity figures are bogus. However, if one uses the better sensitivity estimate of 0.1 deg/(W/m^2), one obtains a 3.2 degree rise, which is entirely possible, since EPA says cities can be more than 10 deg F hotter than surrounding areas.
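
    The same figures can be checked quickly (a sketch using the numbers as stated above and in #31):

    kg_oil <- 7795             # per capita energy use, kg oil equivalent per year
    j_per_kg <- 43e6           # joules per kg oil equivalent
    kg_oil*j_per_kg            # ~3.35e11 J per person per year
    kg_oil*j_per_kg/3.15e7     # divided by seconds per year: ~10,600 W per person
    0.3*107                    # scaling the #31 figure by metabolism plus energy use: ~32 W/m^2
    32*0.1                     # with a 0.1 deg C per W/m^2 sensitivity: ~3.2 deg C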

    Then, you add the unknown amount of heat increase caused by converting greenish fields to concrete and asphalt. Just try to convince me that there’s no UHI effect!

  41. Jim Johnson
    Posted Feb 21, 2007 at 5:25 PM | Permalink

    #34

    “2068 kcal per day. This appears to be a bit larger than what a typical person might consume a day,”

    You’re kidding, right? Supersize that BK Stacker Quad and 2000 kcal is lunch.

    Brings up another point – even if population is constant, the heat dissipation from all of those fattening asses could probably cover the spread predicted by 'global warming'. Not to mention the unmodeled CH4 …

    🙂

  42. Steve Sadlov
    Posted Feb 21, 2007 at 6:03 PM | Permalink

    RE: #41 – I saw that garlic extract was being tried to combat bovine CH4; however, when I consume garlic itself, we're talking CH4 city …. think "Blazing Saddles" … on that "note" (pun intended) … 🙂

  43. Willis Eschenbach
    Posted Mar 8, 2007 at 5:14 PM | Permalink

    I have followed up with Mr. Palmer regarding the FOI request for Jones’ data as follows:

    Dear Mr. Palmer:

    Thank you for your reply (attached below). However, I fear that it is totally unresponsive. I had asked for a list of the sites actually used. While it may (or may not) be true that “it appears that the raw station data can be obtained from [GHCN]”, this is meaningless without an actual list of the sites that Dr. Jones and his team used.

    The debate about changes in the climate is quite important. Dr. Jones' work is one of the most frequently cited statistics in the field. Dr. Jones has refused to provide a list of the sites used for his work, and as such, it cannot be replicated. Replication is central to science. I find Dr. Jones' attitude quite difficult to understand, and I find your refusal to provide the data requested quite baffling.

    You are making the rather curious claim that because the data “appears” to be out on the web somewhere, there is no need for Dr. Jones to reveal which stations were actually used. The claim is even more baffling since you say that the original data used by CRU is available at the GHCN web site, and then follow that with the statement that some of the GHCN data originally came from CRU. Which is the case? Did CRU get the data from GHCN, or did GHCN get the data from CRU?

    Rather than immediately appealing this ruling (with the consequent negative publicity that would inevitably accrue to CRU from such an action), I am again requesting that you provide:

    1) A list of the actual sites used by Dr. Jones in the preparation of the HadCRUT3 dataset, and

    2) A clear indication of where the data for each site is available. This is quite important, as there are significant differences between the versions of each site’s data at e.g. GHCN and NCAR.

    I find it somewhat disquieting that an FOI request is necessary to force a scientist to reveal the data used in his publicly funded research … is this truly the standard that the CRU is promulgating?

    Thank you for your cooperation in this matter.

    Willis Eschenbach

    The beat goes on … sorry for my absence from the discussion, I’ve been on a consulting job in the Solomon Islands. Got to fly over the ITCZ a couple of times and look at the Hadley Cells from the air, always awesome.

    w.

  44. Willis Eschenbach
    Posted Mar 8, 2007 at 8:12 PM | Permalink

    HadCRUT2v is online here

    w.

  45. Jean S
    Posted Mar 9, 2007 at 2:11 AM | Permalink

    Both CRU and GISS temperature versions from 1992 are available from here. I wish someone had some time to take a look at those.

  46. Wolfgang Flamme
    Posted Mar 18, 2007 at 9:19 PM | Permalink

    Took monthly SST and ST (250 km smoothing) from here:
    http://data.giss.nasa.gov/gistemp/time_series.html
    Then calculated the weighted mean by ocean/land area: (3*SST+1*ST)/4.

    Compared this to the Jan2007-warmest-month-stuff:
    http://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt

    Plotted the difference.
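
    A minimal sketch of that weighting, assuming monthly anomaly vectors sst, st and giss of equal length have already been read in from the GISS pages above:

    blended <- (3*sst + 1*st)/4    # weight ocean 3:1 over land by area
    delta <- blended - giss        # difference from the GLB.Ts+dSST series
    plot(delta, type="l")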

    Results available here:
    GISS global mean comparison
    ???

One Trackback

  1. […] original FOI request here asked for: a list of the meteorological stations used in the preparation of the HadCRUT3 global […]