Archive
Natural Variability 1 – One World, Two Cuts
This is a sparse post. It's just a test spin on my way to looking for regional trends. Lots of tables, little code.
Did Global Cooling Stop in 1970?
The 1940 trends in the previous post showed a rising trend that broke into a 'no trend' and then a 'cooling' trend for another 30 years. So I thought it would be interesting to look back from 1970. The 30-year trend ending in 1970 actually shows a slight cooling … and reveals a bug in my code. The 'no trend' triangle should now be the triangular area above the trend line when the trend is falling. (And I guess the chart title is off as well!)
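A minimal sketch of the kind of trailing-trend computation discussed above, assuming annual anomalies in a NumPy array; the data and function name here are hypothetical illustrations, not the post's actual code:

```python
import numpy as np

def trailing_trend(years, temps, end_year, window=30):
    """Least-squares slope over the `window` years ending at end_year,
    returned in degrees per decade."""
    mask = (years > end_year - window) & (years <= end_year)
    slope = np.polyfit(years[mask], temps[mask], 1)[0]  # degrees per year
    return slope * 10.0                                 # degrees per decade

# Hypothetical series: warming to 1940, then a slight cooling to 1980
years = np.arange(1900, 1981)
temps = np.where(years <= 1940,
                 0.01 * (years - 1900),
                 0.4 - 0.005 * (years - 1940))
print(trailing_trend(years, temps, 1970))  # negative: slight cooling
```

A falling slope over the 1941–1970 window is what the 'slight cooling' above refers to.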
GHCN Station History: A Pretty Chart II
Introduction
I presented a similar chart for GSOD data a few weeks back. Looks like it is time to present another for GHCN. These are the stations in v2.mean for the dates indicated.
Static images for every 10 years below the fold.
trb-0.10: GSOD debut
Introduction
Fairly minor changes, but I check out the GSOD data set and add some more raster plotting pieces.
trb-0.09: Planet Ocean
Introduction
Source: http://www.livingoceanproductions.com/Living_Ocean_Productions/Ocean_Conservation.html
How inappropriate to call this planet Earth when it is quite clearly Ocean.
-Arthur C. Clarke
trb-0.08: Pretty Pictures
Introduction
There’s a green one and a pink one
And a blue one and a yellow one,
And they’re all made out of ticky tacky
And they all look just the same.
GHCN: A Preview of Version 2
Now that GHCN v3 is almost upon us … a look back to a time when GHCN v2 was almost upon us …
(and, yes, the section on Quality Control really does have two sections 3.2 :lol:)
The Global Historical Climatology Network: A Preview of Version 2
3. QUALITY CONTROL
GHCN version 2.0 will be primarily constructed from what Guttman (1991) calls “secondary data sources … that have been compiled, adjusted, or summarized by anyone other than the researcher using the data.” When using such data, it is imperative that data “validity” and “accuracy” be assessed. “Validity” refers to whether the data fit the particular application; “accuracy” refers to the reliability of the data. To address both concerns, an extensive quality control procedure was developed. In theory, a quality control procedure is intended to ensure that data meet certain standards of excellence. In practice, quality control implies looking for gross data errors. These “errors” take many forms, ranging from inappropriate data to magnetic media problems to formatting errors to data gaps to outliers to time series inhomogeneities. The procedure employed in GHCN version 2.0 is designed to catch the most pervasive and/or egregious problems that were discussed at the International Workshop on Quality Control of Monthly Climatic Data (Peterson, 1994). The procedure consists of four parts:
1. selection of data sets,
2. preprocessing of data files,
3. duplicate station elimination, and
4. outlier checks.
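As an illustration of the last step, here is a minimal sketch of one common style of outlier check — flagging monthly values far from that month's climatology. This is an assumed generic approach for illustration, not the actual GHCN procedure:

```python
import numpy as np

def flag_outliers(monthly, n_sigma=4.0):
    """monthly: array of shape (n_years, 12) of monthly values.
    Returns a boolean mask marking values more than n_sigma
    standard deviations from that month's mean."""
    clim = np.nanmean(monthly, axis=0)   # per-month climatology
    sd = np.nanstd(monthly, axis=0)      # per-month standard deviation
    z = (monthly - clim) / sd
    return np.abs(z) > n_sigma

# Hypothetical example: 30 years of data with one corrupted January value
rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=1.0, size=(30, 12))
data[5, 0] = 40.0                # gross error
mask = flag_outliers(data)
print(np.argwhere(mask))         # the corrupted value is flagged
```

Real QC procedures are considerably more elaborate (and handle duplicates and inhomogeneities separately, as the list above shows), but the z-score check conveys the flavor of an outlier test.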
Kuska and Serahs
Introduction
RomanM has a recent post up looking at “duplicate” station records in GHCN. I decided to follow up on one of those that he had selected: Kuska.
trb-0.07: Performance Refactoring
Introduction
I was hamstrung by slow performance. Too many 'for loops', not enough array processing. It wasn't a problem until I tried a 36×72 grid with 2592 grid cells. Version 0.05 took 25 hours. I improved that yesterday in v0.06: only 8 hours. But that still stank. This morning I finally found the right tool.
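The kind of speedup described — replacing explicit loops with array operations — can be sketched like this (NumPy used for illustration; the names and the gridding task are hypothetical stand-ins for the toolkit's actual code):

```python
import numpy as np

# Hypothetical setting: bin station observations into a 36x72 grid.
rng = np.random.default_rng(1)
n_obs = 200_000
rows = rng.integers(0, 36, n_obs)   # grid row of each observation
cols = rng.integers(0, 72, n_obs)   # grid column of each observation
vals = rng.normal(size=n_obs)       # observed anomalies

def grid_loop():
    """Slow version: one Python-level iteration per observation."""
    total = np.zeros((36, 72))
    count = np.zeros((36, 72))
    for r, c, v in zip(rows, cols, vals):
        total[r, c] += v
        count[r, c] += 1
    return total / count

def grid_array():
    """Array version: np.add.at accumulates all observations at once."""
    total = np.zeros((36, 72))
    count = np.zeros((36, 72))
    np.add.at(total, (rows, cols), vals)
    np.add.at(count, (rows, cols), 1)
    return total / count

assert np.allclose(grid_loop(), grid_array())
```

The two functions produce the same grid of cell means; the array version just moves the per-observation work out of the Python interpreter, which is where order-of-magnitude savings like 25 hours down to 8 typically come from.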