Let us suppose I create a computer program that predicts the price of gold for the next century. It takes into account every historical trend, every relationship between the price of gold and every other traded commodity over the last century, and anything else that might make the model more accurate. I run the model back in time and it is spot-on. It should be, because it was built from that historical data. I run it forward and it tells me the price of gold in 2047 will be 1.267 trillion American dollars per ounce. Should I trust that result?

Computer models take a current situation, do some computation, and tell you what comes next. If my gold-price program can't predict the price of gold week by week, what use is it? If it could, I'd be a trillionaire.
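To make that concrete, here is a minimal sketch in Python of what "spot-on back in time" actually buys you. The data are synthetic (a trend plus noise standing in for a century of gold prices) and the model is an ordinary high-degree polynomial fit, so every number here is made up for illustration; the pattern is the point.

```python
# Fit a flexible model to a fake century of "gold prices", then
# extrapolate. The in-sample fit looks excellent; the 2047 forecast
# does not. All data here are synthetic.
import numpy as np
from numpy.polynomial import Polynomial

rng = np.random.default_rng(0)
years = np.arange(1925, 2025)                      # a century of history
prices = 35.0 + 0.5 * (years - 1925) + rng.normal(0, 5, years.size)

# Twenty-one free parameters let the curve hug the historical record.
model = Polynomial.fit(years, prices, deg=20)

print("worst in-sample miss:", np.abs(model(years) - prices).max())
print("forecast for 2047:   ", model(2047))
```

Hindsight is cheap: with enough free parameters a model will match any history you feed it. The extrapolated 2047 value, by contrast, will typically be wildly off-scale, which is exactly the trillion-dollars-an-ounce problem.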

My point is that climate models would have to be accurate on every time-scale to be taken seriously. They are not, so they are worthless.

Update:

I should add that computer models iterate through time. Given the conditions at time t1, they calculate the conditions at time t2. They then take the calculated conditions at t2 and calculate the conditions at t3. Pretty obviously, if the model is not quite perfect and the predicted conditions at t2 do not match reality, then the computation based on t2 will not match reality either. In fact, the mismatch between model and reality grows the further down the timescale you go.
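Here is that compounding in miniature. The logistic map below is a stand-in for any iterated model, not a climate model; the "real" system and the "model" of it differ by one part in ten thousand in a single parameter, and both start from the same state.

```python
# Iterate a "real" system and a nearly identical "model" of it side by
# side. The per-step rule is the logistic map x -> r*x*(1-x); the only
# model error is a 0.0001 difference in r.
def step(x, r):
    return r * x * (1.0 - x)

reality = model = 0.5                  # identical starting conditions
R_TRUE, R_MODEL = 3.9, 3.9001          # tiny, unavoidable model error

for t in range(1, 51):
    reality = step(reality, R_TRUE)
    model = step(model, R_MODEL)
    if t % 10 == 0:
        print(f"t={t:2d}  reality={reality:.4f}  "
              f"model={model:.4f}  error={abs(reality - model):.4f}")
```

Within a few dozen steps the two trajectories have nothing to do with each other: the error at t2 feeds the calculation at t3, and so on down the timescale.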

Update 2:

If the AGW climate model were correct, i.e. that human activity outweighs all other climate influences and is having a potentially catastrophic effect on climate, then the temperature record going back in time should look like a hockey stick. In reality it looks nothing like a hockey stick. Mann and company used every trick in the book to make past temperature variations go away. I have one question for Mann: when will the next ice age start?
