
GISTEMP: There's a light burning in the fireplace

2010 May 19

Introduction

Herein lie some sample runs with the new GISTEMP code: first comparing and contrasting my results with the web-published ones, then comparing and contrasting the public GISS v2.inv metadata file with my self-generated v2.giss.inv metadata file.

Downloads
Downloaded the GISTEMP source
http://data.giss.nasa.gov/gistemp/sources/GISTEMP_sources.tar.gz

Unzipped the file.
Changed directory to GISTEMP_sources/STEP0/input_files

Downloaded GHCN v2.mean
ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2/v2.mean.Z

Downloaded 9641C_201005_F52
ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/9641C_201005_F52.avg.gz
Copied 9641C_201005_F52.avg to USHCNv2.avg

Downloaded new Antarctic data files
(use this handy-dandy script)
http://rhinohide.org/rhinohide.cx/co2/gistemp/201005/get_SCAR.pl.txt
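The download steps above can be sketched as a small shell script. URLs and filenames are taken verbatim from this post and may have moved since; wget and gunzip are assumed to be available, and the SCAR data is left to the get_SCAR.pl script linked above. The fetch is wrapped in a function so you can review it before running.

```shell
#!/bin/sh
set -e

# Fetch and unpack the GISTEMP inputs described above.
fetch_inputs() {
    # GISTEMP source tarball
    wget http://data.giss.nasa.gov/gistemp/sources/GISTEMP_sources.tar.gz
    tar xzf GISTEMP_sources.tar.gz
    cd GISTEMP_sources/STEP0/input_files

    # GHCN v2.mean
    wget ftp://ftp.ncdc.noaa.gov/pub/data/ghcn/v2/v2.mean.Z
    gunzip v2.mean.Z

    # USHCN v2 monthly (F52 adjusted), renamed to the name STEP0 expects
    wget ftp://ftp.ncdc.noaa.gov/pub/data/ushcn/v2/monthly/9641C_201005_F52.avg.gz
    gunzip 9641C_201005_F52.avg.gz
    cp 9641C_201005_F52.avg USHCNv2.avg

    # Antarctic (SCAR) data: fetch separately with the get_SCAR.pl
    # script linked above.
}

# Uncomment when you are online and ready:
# fetch_inputs
```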

Code and Script

The scripts seem cleaner than the last time I looked at them. However, there are two changes that still need to be made for most Linux boxen. First, change the “tail +100” to “tail -n +100” in the do_comb_step0.sh script (the old “+100” form is no longer accepted by GNU tail, and plain “tail -100” would print the last 100 lines instead of starting at line 100). Second, search all the scripts for calls to executables and other scripts without an explicit directory path. In other words, change calls like “zonav.exe” to “./zonav.exe”.
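Both fixes are one-line sed substitutions. The expressions below are demonstrated on made-up sample lines rather than the real scripts; the same expressions can be applied in place to the GISTEMP scripts with sed -i.

```shell
# Fix 1: translate the obsolete "tail +N" form to "tail -n +N",
# which keeps the "start at line N" meaning on modern GNU coreutils.
echo 'tail +100 v2.mean' | sed 's/tail +\([0-9][0-9]*\)/tail -n +\1/'
# → tail -n +100 v2.mean

# Fix 2: prefix bare executable names with ./ so the scripts run
# without "." on PATH.
echo 'zonav.exe < zonav.inp' | sed 's/^\([A-Za-z_][A-Za-z_]*\.exe\)/.\/\1/'
# → ./zonav.exe < zonav.inp
```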

The code is cleaner too. No need to monkey with PApars.f to remove the infinite loop for those of us with lower precision tools. The logic to handle the new ‘night light flags’ or retain the old behavior (US = nightlights, ROW = Rural/Urban) is in the text_to_binary.f code.

Don’t forget to compile the two Python modules. I see that the older unused modules were removed. And speaking of compiling, don’t forget to set and export a variable FC to define your Fortran compiler for the ‘jit’ compiling that occurs in the scripts. In my case: export FC=f95

You will have to create the “STEPx/temp_files” directories. In STEP2, you will have to create the “work_files” directory.
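Putting the setup together, something like the following, run from the top of the unpacked tree. The STEP directory names follow my unpacked tarball (STEP4_5 being the combined final step), and gfortran stands in for whichever Fortran compiler you have.

```shell
#!/bin/sh
set -e

# Compiler used by the scripts' on-the-fly Fortran compiles
export FC=gfortran    # or f95, etc.

# Create the work directories the scripts expect but do not create
for step in STEP0 STEP1 STEP2 STEP3 STEP4_5; do
    mkdir -p "$step/temp_files"
done
mkdir -p STEP2/work_files
```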

GISS WEB -v- GISS LINUX

Comparing the GISS web release of GLB.Ts.txt with the GLB.Ts.GHCN.CL.txt generated on my gfortran Ubuntu 9.10 Sun VirtualBox using the public source code.

In all of these comparisons, red is the color of the first listed data source. Sorry about the odd scale on the ‘diff’ charts; the perils of software reuse (these are some older Perl scripts I really should rewrite in R).
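For the curious, the ‘diff’ curves amount to joining two anomaly series on year and subtracting. A minimal sketch with made-up two-column data (the real GLB.Ts files carry monthly and seasonal columns and need reducing to year/anomaly pairs first):

```shell
# Two hypothetical year/anomaly series
cat > series_a.txt <<'EOF'
1980 0.18
1981 0.26
EOF
cat > series_b.txt <<'EOF'
1980 0.18
1981 0.24
EOF

# Join on year, print the difference
join series_a.txt series_b.txt | awk '{ printf "%s %.2f\n", $1, $2 - $3 }'
# → 1980 0.00
#   1981 0.02
```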

Web -v- Linux
diff Web -v- Linux

GISS Old Rural/Urban -v- New Nightlights

Here we compare the Linux-compiled GISS code using the traditional setting (US = nightlights, ROW = rural/urban) with the same code using the new nightlights setting.

Rural -v- Night
diff Rural -v- Night

GISS Rural -v- Whiteboard Rural

Here we use the older rural/urban method for PApars and compare the GISS v2.inv values as delivered in the source code with the newly generated Whiteboard v2.giss.inv values.

GISS -v- Whiteboard Rural
diff GISS -v- Whiteboard Rural

GISS Nightlights -v- Whiteboard Nightlights

Here we use the newer nightlights method for PApars and compare the GISS v2.inv values with the newly generated Whiteboard v2.giss.inv values.

GISS -v- Whiteboard NightLights
diff GISS -v- Whiteboard NightLights

Discussion

There is very good matching between the ‘web’ version of the GISTEMP land record and the one that I generated – until we get past 1980. I suspect that this difference arises from different versions of the USHCN file being used.

There is relatively strong divergence between the GISS v2.inv files and my self-generated v2.giss.inv files in the early years of the record. I suspect that this is due to sensitivity to the definition of rural/urban or dark/bright when relatively few stations are available.

Overall, I am satisfied that my self-generated rural/urban and dark/bright fields are doing the same job as the GISS/GHCN generated values.

Comments
  1. 2010 May 19 at 11:18 pm

    As a side note, I may be shopping for a new laptop soon. Anyone have favorites when it comes to number-crunching?

  2. Steven Mosher
    2010 May 20 at 2:54 pm

    Arrg. I’m more interested in the changes in Antarctica. What’s going on there? I just spent a few miserable days reading and parsing those pages in R (hehe, and found a bug in the regexpr of R)

  3. 2010 May 20 at 4:45 pm

    Hmmm? I haven’t looked specifically at Antarctica. Wassup?

  4. Steven Mosher
    2010 May 20 at 10:02 pm

    I was talking about your script to get the scar stuff

  5. 2010 May 21 at 5:32 am

    Not my script.
    Date stamp on mine comes from Dec 2009,
    so it probably came with the GISTEMP tarball I downloaded in Nov.
    Doesn’t seem to be in the latest and greatest tarball.

    The ‘give-away’ that this isn’t mine is the comments in the beginning
    # remove Gough, Marion, Mario-Zucchelli
    # add Terra_Nova_Bay

    I’d have no reason to do that.
    It’s a good follow-up question: why remove those, add that?

  6. 2010 May 21 at 8:33 am

    Nice work, it’s good to see how your independently constructed metadata compares to the original metadata in practice.

    As far as good number-crunching laptops go, I’d suggest the new MacBook Pros, but they are a tad on the pricey side.

  7. 2010 May 21 at 9:55 am

    My wall-wart is churning away at an estimated 23-hour data build begun last night.
    Thought of a more efficient algorithm this a.m., so set my laptop on that build.
    We’ll see if either is done when I get home tonight. 🙂

  8. Steven Mosher
    2010 May 21 at 12:50 pm

    I’ll look at them. I just started at the base page, did a scrape to get the next page, then a scrape to get the data from antarctic 1, 2 and 3. Then found the duplicates (Gough was one), then just built a file with all the antarctic metadata and data including the dups. Not sure what they are doing.. undocumented hand fixes.. no doubt for good reason.. but what?

  9. Steven Mosher
    2010 May 21 at 12:54 pm

    Mac laptop here. Pretty painful to work on. Also, might take a look at the geonames API in R.
    You have to throttle your requests, but you can suck down some much-needed info, like proper country codes and properly spelt place names, and feature data. Or you can just do your own requests outside of R.

    geonames.org
