Question:
Source code not available?
Jeff M
2012-03-14 02:07:31 UTC
There have been numerous times here when opponents of global warming have stated that the source code for various paleoclimate reconstructions is nowhere to be found. I found a page that contains source code and original data, but I am unsure which code and data are present, as various reconstructions are included.

http://www.ncdc.noaa.gov/paleo/pubs/mann2008/mann2008.html

Do you think we can put the 'no code' statement to rest? And if you have a problem with the code after looking through it, could you explain why and where?
Nine answers:
Joe Joyce
2012-03-14 06:05:15 UTC
Hi, Jeff. The data source page at RealClimate has links to almost everything. Here's the part for code; each line there is a link to the specific item. This stuff has been available for a while, but that will not stop deniers from lying about it.

Model codes (GCMs)

Downloadable codes for some of the GCMs:

GISS ModelE (AR4 version, current snapshot)
NCAR CCSM (Version 3.0, CCM3 (older vintage))
EdGCM (Windows-based version of an older GISS model)
Uni. Hamburg (SAM, PUMA and PLASIM)
NEMO Ocean Model
GFDL Models
MIT GCM

Model codes (other)

This category includes links to analysis tools, simpler models, or models focused on more specific issues:

Radiative Transfer models (AER RRTM)
Rahmstorf (2007) Sea Level Rise Code
Vermeer & Rahmstorf (2009) Sea Level Rise Code and Data
ModTran (atmospheric radiation calculations and visualisations)
Various climate-related online models (David Archer)
Integrated Assessment Models (IAMs) (FUND, FAIR, DICE, RICE)
CliMT (a Python-based software component toolkit)
Pyclimate (Python tools for climate analysis)
CDAT (tools for analysing climate data in netCDF format, PCMDI)
RegEM (Tapio Schneider)
Time series analysis (MTM-SVD, SSA-MTM toolkit, Mann and Lees (1996))
MAGICC
John Rockford
2012-03-14 02:35:21 UTC
Code was revealed after Climategate, and there are some interesting parts. Here is an example.

The programmer has written helpful notes that we non-programmers can understand, like this one: “Apply a very artificial correction for decline”. You get the feeling this climate programmer didn’t like pushing the data around so blatantly. Note the technical comment: “fudge factor”.



; Apply a VERY ARTIFICAL correction for decline!!
;
yrloc=[1400,findgen(19)*5.+1904]
valadj=[0.,0.,0.,0.,0.,-0.1,-0.25,-0.3,0.,-0.1,0.3,0.8,1.2,1.7,2.5,2.6,2.6,$
2.6,2.6,2.6]*0.75 ; fudge factor
if n_elements(yrloc) ne n_elements(valadj) then message,'Oooops!'



The numbers inside the [ ] brackets are the amounts by which the data are to be altered. If there were no adjustments, they'd all be zero. It's obvious there is no attempt to treat all the data equally, or to use a rigorous method for the adjustments. What could their reasons be?
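For illustration, here is a rough Python translation of what those IDL lines set up. The final interpolate-and-add step is my assumption about how such an adjustment array would be applied to a yearly series; the surrounding original program is not shown here.

# Rough Python translation of the IDL snippet above (illustration only;
# variable names and the interpolation step are assumptions, not the
# original program).
import numpy as np

# yrloc=[1400,findgen(19)*5.+1904] -> 1400, then 1904, 1909, ..., 1994
yrloc = np.concatenate(([1400.0], np.arange(19) * 5.0 + 1904))

# The adjustment anchor values, scaled by the 0.75 "fudge factor"
valadj = np.array([0., 0., 0., 0., 0., -0.1, -0.25, -0.3, 0., -0.1,
                   0.3, 0.8, 1.2, 1.7, 2.5, 2.6, 2.6, 2.6, 2.6, 2.6]) * 0.75

# The "Oooops!" check: both arrays must be the same length
assert len(yrloc) == len(valadj)

# Interpolate the anchor points onto a yearly axis; adding the result to a
# raw series would apply the adjustment
years = np.arange(1400, 1995)
yearly_adj = np.interp(years, yrloc, valadj)
# adjusted_series = raw_series + yearly_adj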



East Anglia Data Adjustments

In 1900-1920: “All thermometers working accurately”.

In 1930: “Stock market crash and global depression causes artificial inflation in temperatures. Corrected, using inverted Dow Jones index until 1940”.

1940: “Due to WWII, briefly, thermometers work again”.

1945: “Artificial rise due to Nagasaki/Hiroshima effect. Compensated.”

1950-2000: “Quality control at thermometer factories must be going to pieces. Thermometers are just reading too low, and it kept getting worse until 1970. Instead of demanding the factories get it right, simply adjust the data. Still not enough. Quality control puts air-conditioning exhaust vents close to thermometers in the field, to further counteract apparent factory problem.”



The reason they didn't want to release the code is that it exposed the fraudulent nature of the programming used to show warming.
Gary F
2012-03-14 11:10:07 UTC
The *.m files are MATLAB scripts.



It has always been unclear to me what Deniers meant by "code" - probably because 99% of them are just repeating what they have heard on Rush and FOX News and have no clue what it means.



I think a lot of the original b*tching had to do with processing climate station data to resolve homogeneity and stationarity issues, even though those methods are fairly standardized, routine, and readily available from any number of sources; a sketch of the basic idea follows below. But, again, that suggests the complaints were nothing but the uninformed parroting of lies by nitwits.
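As a minimal sketch of the sort of standardized check I mean (illustrative only; operational methods such as SNHT or pairwise homogenization are more elaborate): difference a candidate station against a reference neighbour and scan for a step change in the mean.

# Minimal homogeneity-check sketch: find the largest mean shift in the
# candidate-minus-reference difference series.
import numpy as np

def find_step_change(candidate, reference):
    """Return (index, shift) of the split that maximizes the jump in mean."""
    diff = np.asarray(candidate) - np.asarray(reference)
    best_idx, best_shift = None, 0.0
    for i in range(3, len(diff) - 3):   # require a few points on each side
        shift = abs(diff[:i].mean() - diff[i:].mean())
        if shift > best_shift:
            best_idx, best_shift = i, shift
    return best_idx, best_shift

# Synthetic example: a 0.5-degree jump introduced at index 30 (a "station move")
rng = np.random.default_rng(42)
reference = rng.normal(10.0, 0.3, size=60)
candidate = reference + rng.normal(0.0, 0.1, size=60)
candidate[30:] += 0.5
print(find_step_change(candidate, reference))   # roughly (30, 0.5)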
?
2012-03-14 06:36:21 UTC
It really does not matter whether it was available already or not, as evidenced by John Rockford's answer. He admits he is a non-programmer, yet despite that admitted lack of skill he fails to be skeptical of his own non-expert interpretation of the source code. Wow!



In Denial Fantasy-land everything is possible: an English Lord becomes a 'climate expert', weathermen change the laws of physics, Republican Congressmen attribute Global Warming to God's Will, and non-programmers can suddenly read and comprehend complex climate-science source code.
Ian
2012-03-14 09:14:54 UTC
@ Ottawa Mike..."Arguments for providing code are full transparency of the analysis and that discrepancies and errors may be easier to identify."



Yep, that is a pretty good reason for providing it. I think what Jeff M and other alarmists mean by "putting it to bed" is that skeptics should never question adjustments again and just accept whatever climatologists come up with. They can't stand the fact that Steve McIntyre found an error in their code and they had to make corrections.



Just closing their eyes and believing that climatologists are infallible and unbiased makes it true for them.
Sagebrush
2012-03-14 03:24:45 UTC
If you are a programmer and your boss tells you to do something, you either do it or you get fired. Which would you do? You would do as your boss says, but you would 'comment' it, so that you or your associates could readily identify that area and later alter it or cancel it out.



Some of this is honest, since a lot of programmers do a lot of 'what ifs'. The really good programmers usually delight in this, as they are an inquisitive bunch.



But it shows how easily these numbers can be fudged in order to reach a goal. The disclaimer also reveals a weakness in the data-collection system: "Though a great deal of quality control is performed, some errors will remain in the data, often due to problems in data transmission by each station location." That just isn't so. There are ways to verify and certify data that would eliminate this problem; they do it in banking all the time (see the sketch below). There is no credible reason for this unless they are dealing with shoddy equipment. What are we paying all these billions of dollars for?
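To make the banking comparison concrete, here is one way end-to-end verification could work: publish a checksum alongside each station file and reject anything that does not match on arrival. The filename and workflow are hypothetical, purely for illustration.

# Sketch of checksum-based data verification (hypothetical filenames).
import hashlib

def verify_station_file(path, expected_sha256):
    """Return True if the file's SHA-256 digest matches the published one."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

# Usage (hypothetical file and published digest):
# if not verify_station_file("station_403760.dat", published_digest):
#     request a retransmission instead of silently accepting the data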



Anyway, thanks for the source Jeff M. It is a starting point.



Jeff M: I thought that was what I was saying; I was agreeing with you (for once). I appreciate your coming up with that site. I have been looking for it, but as of yet I haven't analyzed it, so I don't have an opinion one way or the other. I was skeptical about their releasing the code and the data. If anything, I was pointing out the caveat in their saying there may be different data arrays out there. This should not be so with the technology that we have.
2012-03-14 13:39:39 UTC
Unfortunately, denialists are not interested in actually investigating what they see on denialist blogs and Faux News. We can expect Phil Jones' comments on the statistical significance of the warming trend from 1995-2009 to be twisted into "There has been no warming for fifteen years" for the next 50 years.
2016-05-17 13:42:32 UTC
I prefer SMF Forums myself. Open Source, PHP, huge base of people who make add ons for it. Secure, powerful.
Ottawa Mike
2012-03-14 07:47:42 UTC
"Do you think we can put the 'no code' statement to rest?"



Why do you continue to insist on "putting things to rest"? This issue is far from being put to rest.



"At present, debate rages on the need to release computer programs associated with scientific experiments(2, 3, 4), with policies still ranging from mandatory total release to the release only of natural language descriptions, that is, written descriptions of computer program algorithms." Nature Volume: 482, Pages: 485–488 Date published: (23 February 2012)

http://www.nature.com/nature/journal/v482/n7386/full/nature10836.html



From the IPCC: "Providing code is encouraged, but there was no consensus among all participants about whether to recommend providing all code to a public repository. Arguments for providing code are full transparency of the analysis and that discrepancies and errors may be easier to identify. Arguments against making it mandatory to provide code are the fact that an independent verification of a method should redo the full analysis in order to avoid propagation of errors, and the lack of resources and infrastructure required to support such central repositories." http://www.ipcc.ch/pdf/supporting-material/IPCC_EM_MME_GoodPracticeGuidancePaper.pdf



I know that you'd love to just put all this debate to bed and get on with saving the world but that isn't going to happen no matter how many times you click your heels together.



Edit: Jeff M: "I see no 'skeptic' wants to actually answer my question nor look at the data provided.."



I answered your question exactly. Do you think we can put the 'no code' statement to rest? Answer: NO. For the analysis, see above. I'm not qualified to analyze the data at your link to judge whether it allows enough transparency for an attempt at replication, but I do know that giving one example is not enough to put the much more general statement 'to rest'.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.