Reposting a comment by HarleyDavidson, a commenter at J.R. Dunn's post, "Global Warming Fraud and the Future of Science," at American Thinker:
What Climategate really means: So easy even a CAVEMAN can understand it.

How do they manage it? Read about the secret life of climate researchers in Iowahawk Geographic.
ClimateGate exposed the cabal of 20–30 scientists (not just at CRU) who peer-reviewed each other's papers, strong-armed scientific journals into printing only their views, and then sat on the IPCC panels as authors judging which published studies went into the IPCC final reports. This is why they keep shouting “peer-reviewed studies, peer-reviewed studies, peer-reviewed studies”: they owned the peer-review process.
ClimateGate exposed that this small group has been adding positive corrections to the raw global temperature data, inflating the published temperature rise of the last 50 years. Both CRU in the UK and NASA-GISS in the US apply these adjustments. At CRU, the programmers did not even know what some of the monthly corrections were or why they were added. Only since satellite monitoring became available for comparison has the amount of biasing leveled off.
ClimateGate exposed the leaders of this cabal instructing each other to delete emails, data files, and data-analysis programs ahead of already-filed Freedom of Information Act requests for raw data and computer code, clearly a crime.
ClimateGate exposed the “trick” behind the hockey-stick figure and other studies that performed proxy reconstruction of past temperatures. After all, reconstructing the last 1,000 years of climate is the first step in predicting the future with supercomputer programs, as explained below:
All 21 supercomputer programs used by the IPCC to project future global warming rely on past sensitivities to solar and volcanic effects (climate forcings) as best determined from the proxy temperature record.
1. The elimination of the Medieval Warm Period and the Little Ice Age (the handle of the hockey stick) was necessary so that past solar effects could be minimized, allowing almost all of the warming of the last 75 years to be blamed on greenhouse gases. Raw data (tree-ring widths, radioisotopes in mud layers from lake bottoms, ice-core analyses, etc.) are used as proxies to reconstruct the temperature record from 1000 AD to 1960 AD. To ensure the desired result, statistical manipulation of the raw data and selection of only supporting data, i.e. cherry-picking, were suspected and later proven. (The first sketch after this list shows where cherry-picking enters a proxy reconstruction.)
2. The slope of the long-term 10-year running average of global temperature from thermometers, 1900 to the present (the blade of the hockey stick), was maximized through sloppy gridding code, Urban Heat Island effects, hiding the declines, and even fabricated data (documented in the leaked source-code comments revealed with ClimateGate). This ensured that the greenhouse-gas coefficient in all 21 supercomputer models was maximized, which in turn maximizes the temperature projected for the year 2100 from rising greenhouse gases. The thermometer data were also spliced in to replace the diverging tree-ring record after 1960 and plotted over the climate-history reconstruction of (1) above, giving the false impression that the 1000 AD to 1960 AD results are more accurate than they are. (The second sketch below shows the running average and the splice.)
3. Because tuning of the supercomputer programs uses backcasting, the computer outputs could always replicate the 20th century (by design); it was therefore assumed that the models had almost everything in them. Because of (1) and (2) above, nearly all of the climate change predicted by the models was attributed to CO2 and positive feedbacks, and hardly any of it to other causes such as solar effects, understood or not. (The third sketch below shows why a good backcast proves so little.)
4. Over the years, as better numbers for volcanic effects, black carbon, aerosols, land use, ocean and atmospheric multi-decadal cycles, etc. became available, it appears that CRU made revisions to refit the backcast, but could hardly tell what the code was doing because of the earlier fudging of correction factors and outright fabrication, as documented in the code released as part of ClimateGate.
5. After the IPCC averages the 21 supercomputer outputs of projected future warming (anywhere from 2 to 7 degrees, not very precise), that single number is used to predict all manner of catastrophes. (Fires, floods, droughts, blizzards, hurricanes, tornadoes, earthquakes, insects, extinctions, diseases, civil wars, cats & dogs sleeping together, etc.) (The final sketch below shows how much spread hides inside that average.)
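Since the comment walks through real mechanics, a few toy sketches may help. First, point (1): a minimal Python sketch of a proxy calibration, with every number invented and every variable name mine rather than CRU's. The mechanics, calibrating a proxy against the thermometer overlap and extrapolating backward, are standard; the fight is over which proxy series get included.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration period: years with both thermometer
# readings (deg C) and tree-ring widths (mm). All values invented.
temps_obs = 14.0 + 0.01 * np.arange(100) + rng.normal(0, 0.1, 100)
ring_widths = 1.5 + 0.8 * (temps_obs - 14.0) + rng.normal(0, 0.05, 100)

# Calibration: fit a linear model, temperature ~ ring width.
slope, intercept = np.polyfit(ring_widths, temps_obs, 1)

# Reconstruction: apply the fitted model to older rings that have no
# overlapping thermometer record. Which cores get included is exactly
# where cherry-picking enters: dropping cores that disagree with the
# expected shape changes the reconstructed curve.
old_ring_widths = rng.normal(1.5, 0.2, 50)   # made-up medieval cores
reconstructed = slope * old_ring_widths + intercept
print(reconstructed.round(2))
```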
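Second, point (2): a sketch, again on entirely synthetic data, of the two operations described there, a 10-year running average and the post-1960 splice of thermometer data over a diverging proxy.

```python
import numpy as np

years = np.arange(1900, 2001)
rng = np.random.default_rng(1)

# Made-up proxy series that diverges downward after 1960, and a
# made-up instrumental (thermometer) series that keeps rising.
proxy = 14.0 + rng.normal(0, 0.15, years.size)
proxy[years > 1960] -= 0.01 * (years[years > 1960] - 1960)
thermo = 14.0 + 0.012 * (years - 1900) + rng.normal(0, 0.1, years.size)

def running_mean(x, n=10):
    """Trailing n-year running average."""
    return np.convolve(x, np.ones(n) / n, mode="valid")

# The splice: keep the proxy through 1960, thermometers afterwards,
# so the post-1960 divergence never appears in the plotted curve.
spliced = np.where(years <= 1960, proxy, thermo)
smooth = running_mean(spliced)
print(smooth[-5:].round(3))   # the upward "blade" of the curve
```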
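Third, point (3): a deliberately crude stand-in for a climate model, nothing like a real GCM, fitted by least squares to an invented 20th-century record. Flatten the solar history, as point (1) alleges the reconstructions did, and the fit simply loads the solar trend onto the CO2 coefficient while still matching the past.

```python
import numpy as np

years = np.arange(1900, 2001)
rng = np.random.default_rng(2)

# Invented forcings: a rising CO2 trajectory (ppm) and a solar series
# with both an 11-year cycle and a slow upward drift.
co2 = 295.0 + 0.8 * (years - 1900)
solar = 0.002 * (years - 1900) + 0.1 * np.sin((years - 1900) / 11.0)

# Invented "observed" temperatures built from BOTH forcings.
observed = (14.0 + 0.6 * np.log(co2 / co2[0]) + 0.5 * solar
            + rng.normal(0, 0.05, years.size))

def fit(forcings):
    """Least-squares fit of observed temps to the given forcing columns."""
    A = np.column_stack(forcings + [np.ones(years.size)])
    coef, *_ = np.linalg.lstsq(A, observed, rcond=None)
    return coef

both = fit([np.log(co2 / co2[0]), solar])
co2_only = fit([np.log(co2 / co2[0])])      # solar flattened away

# Both versions backcast the 20th century tolerably well, but the
# second one loads the solar trend onto the CO2 coefficient.
print("CO2 coefficient, solar included:", both[0].round(3))
print("CO2 coefficient, solar removed: ", co2_only[0].round(3))
```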
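Finally, point (5): the ensemble-averaging step is plain arithmetic, but it is worth seeing how much spread vanishes into a single headline number. The 21 values below are random stand-ins, not real model outputs.

```python
import numpy as np

rng = np.random.default_rng(3)

# 21 hypothetical model projections of warming by 2100, in deg C,
# drawn uniformly from the 2-to-7-degree range quoted above.
projections = rng.uniform(2.0, 7.0, 21)

print("ensemble mean:  ", projections.mean().round(2))
print("ensemble spread:", projections.min().round(2),
      "to", projections.max().round(2))
```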