CRU Source Code Decoded: Marc Sheppard Finds the Fudge Factor

Marc Sheppard, writing for American Thinker, freezes in time the CRU warming source code. Here’s a pull quote that you and I can understand:

But here’s what’s undeniable: If a divergence exists between measured temperatures and those derived from dendrochronological data after (circa) 1960, then discarding only the post-1960 figures is disingenuous, to say the least. The very existence of a divergence betrays a potentially serious flaw in the process by which temperatures are reconstructed from tree-ring density. If it’s bogus beyond a set threshold, then any honest man of science would instinctively question its integrity prior to that boundary. And only the lowliest would apply a hack in order to produce a desired result.

And to do so without declaring as such in a footnote on every chart in every report in every study in every book in every classroom on every website that such a corrupt process is relied upon is not just a crime against science, it’s a crime against mankind.

Bottom line: CRU’s evidence is now irrevocably tainted. As such, all assumptions based on that evidence must now be reevaluated and readjudicated. And all policy based on those counterfeit assumptions must also be reexamined.

Al Gore Fudge Factor

The following is part of Sheppard’s walkthrough of the CRU source code. Go to American Thinker to read his discussion before and after this excerpt:

One can only imagine the angst suffered daily by the co-conspirators, who knew full well that the “Documents” sub-folder of the CRU FOI2009 file contained more than enough probative program source code to unmask CRU’s phantom methodology. In fact, there are hundreds of IDL and FORTRAN source files buried in dozens of subordinate sub-folders. And many do properly analyze and chart maximum latewood density (MXD), the growth parameter commonly utilized by CRU scientists as a temperature proxy, from raw or legitimately normalized data. Ah, but many do so much more.

Skimming through the often spaghetti-like code, the number of programs which subject the data to a mixed bag of transformative and filtering routines is simply staggering. Granted, many of these “alterations” run from benign smoothing algorithms (e.g., omitting rogue outliers) to moderate infilling mechanisms (e.g., estimating missing station data from that of closely surrounding stations). But many others fall into the precarious range between the highly questionable (removing MXD data which demonstrate poor correlations with local temperature) and the downright fraudulent (replacing MXD data entirely with measured data to reverse a disorderly downward trend).
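To make that spectrum concrete, here is a rough sketch of what the benign and moderate ends might look like in IDL. This is my own illustration with hypothetical variable names (mxd for the density series, nearby for neighboring stations), not code from the CRU archive:

; Illustration only -- hypothetical names, not CRU code.
; "Benign" smoothing: pull rogue outliers back toward a 5-year boxcar mean.
boxcar = smooth(mxd, 5, /edge_truncate)
bad = where(abs(mxd - boxcar) gt 3.0*stddev(mxd), nbad)
if nbad gt 0 then mxd[bad] = boxcar[bad]
; "Moderate" infilling: estimate missing years from nearby stations.
miss = where(finite(mxd) eq 0, nmiss)
for i = 0, nmiss - 1 do mxd[miss[i]] = mean(nearby[*, miss[i]], /nan)

Neither step rewrites the underlying signal; that is what separates them from the routines Sheppard flags next.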

In fact, workarounds for the post-1960 “divergence problem,” as described by both RealClimate and Climate Audit, can be found throughout the source code. So much so that perhaps the most ubiquitous programmer’s comment (REM) I ran across warns that the particular module “Uses ‘corrected’ MXD – but shouldn’t usually plot past 1960 because these will be artificially adjusted to look closer to the real temperatures.”
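In practice, that warning amounts to masking everything after the cutoff before the series is drawn. A minimal sketch, assuming yearly arrays named years and mxd_corrected (my names, not the module’s):

; Heed the comment's own advice: don't plot the "corrected" data past 1960.
ok = where(years le 1960)
plot, years[ok], mxd_corrected[ok]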

What exactly is meant by “corrected” MXD, you ask? Outstanding question, and the answer appears amorphous from program to program. Indeed, while some employ one or two of the aforementioned “corrections,” others throw everything but the kitchen sink at the raw data prior to output.

For instance, in the subfolder “osborn-tree6\mann\oldprog,” there’s a program that calibrates the MXD data against available local instrumental summer (growing season) temperatures between 1911 and 1990, then merges that data into a new file. That file is then digested and further modified by another program, which creates calibration statistics for the MXD against the stored temperature and “estimates” (infills) figures where such temperature readings were not available. The file created by that program is modified once again by a third program, which corrects it, as described by the author, by identifying and artificially removing the decline.

But oddly enough, the series doesn’t begin its decline adjustment in 1960, the supposed year of the enigmatic divergence. In fact, all data between 1930 and 1994 are subject to “correction.”
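The chain Sheppard describes has a recognizable shape: regress density against instrumental temperature over a fit window, compute calibration statistics, infill the gaps, then “correct” the disagreeable years. Here is a minimal sketch of that pipeline in IDL, using hypothetical names (mxd, temp, years) rather than the actual CRU routines:

; Illustration only -- the general shape of a calibrate-then-"correct" chain.
cal = where(years ge 1911 and years le 1990)   ; calibration window
fit = linfit(mxd[cal], temp[cal])              ; regress temperature on density
recon = fit[0] + fit[1]*mxd                    ; reconstructed temperatures
print, 'calibration r = ', correlate(mxd[cal], temp[cal])
; The step in question: overwrite the 1930-1994 reconstruction with
; observations, so the decline is removed rather than explained.
adj = where(years ge 1930 and years le 1994)
recon[adj] = temp[adj]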

And such games are by no means unique to the folder attributed to Michael Mann.

A Clear and Present Rearranger

In two other programs, the correction is bolder by far. The programmer (Keith Briffa?) titled the adjustment routine “Apply a VERY ARTIFICAL correction for decline!!” (sic), and he or she wasn’t kidding. Now, IDL is not a native language of mine, but its syntax is similar enough to others I’m familiar with, so please bear with me while I get a tad techie on you.

Here’s the fudge factor (notice the brash SOB actually called it that in his REM statement):

yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,2.6,2.6,2.6]*0.75         ; fudge factor
These two lines of code establish a twenty-element array (yrloc) comprising the year 1400 (the base year, though I’m not sure why it’s needed here) and nineteen years between 1904 and 1994 in half-decade increments. Then the corresponding fudge factor (from the valadj array) is applied to each interval. As you can see, not only are temperatures biased to the upside later in the century (though certainly prior to 1960), but a few mid-century intervals are biased slightly lower. That, coupled with the post-1930 restatement we encountered earlier, would imply that in addition to an embarrassing false decline experienced with their MXD after 1960 (or earlier), CRU’s divergence problem also includes a minor false incline after 1930.
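For those curious about the mechanics, arrays like these are typically applied by interpolating the five-yearly fudge values onto the yearly time axis and adding the result straight to the series. A minimal sketch of that step, my own reconstruction rather than a verbatim excerpt (timey and densall are assumed names for the yearly axis and the density series):

; Sketch only (assumed names, not a verbatim excerpt from the archive).
timey = findgen(595) + 1400.                ; hypothetical yearly axis, 1400-1994
yearlyadj = interpol(valadj, yrloc, timey)  ; spread the 5-yearly knots across every year
densadj = densall + yearlyadj               ; the "correction" is simple addition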

And the former apparently wasn’t a particularly well-guarded secret, although the actual adjustment period remained buried beneath the surface. Some plotting programs print this reminder to the user prior to rendering the chart:

Data after 1960 should not be used. The tree-ring density records tend to show a decline after 1960 relative to the summer temperature in many high-latitude locations. In this data set this “decline” has been artificially removed in an ad-hoc way, and this means that data after 1960 no longer represent tree-ring density variations, but have been modified to look more like the observed temperatures.

Others issue this warning:

Recent decline in tree-ring density has been ARTIFICIALLY REMOVED to facilitate calibration. THEREFORE, post-1960 values will be much closer to observed temperatures then (sic) they should be, which will incorrectly imply the reconstruction is more skilful than it actually is. See Osborn et al. (2004).
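That warning is straightforward to express in code: once post-1960 proxy values have been nudged toward observations, any skill statistic computed over that period is inflated by construction. A toy check, with hypothetical names (recon_raw, recon_corrected, temp, years):

; Toy check (illustration only): "corrected" data fakes post-1960 skill.
late = where(years gt 1960)
print, 'raw post-1960 r:       ', correlate(recon_raw[late], temp[late])
print, 'corrected post-1960 r: ', correlate(recon_corrected[late], temp[late])
; The second correlation approaches 1.0 by construction, not by skill.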

Care to offer another explanation, Dr. Jones?…

Marc Sheppard is a technology consultant, software engineer, writer, and political and systems analyst. He is a regular contributor to American Thinker, The New Media Journal, Opinion Editorials and Men’s News Daily.