Thursday, November 13, 2008

Our Sun is the Elephant in the Room

When humans get sick, they generally run a fever. From this well-known reality, we, as patients under any kind of medical care, will always have our temperature taken. We know that a fever is a clear symptom of illness. No one, scientifically, would ever believe that a fever is the actual cause of someone being sick.

What if high amounts of carbon dioxide are a lot like that fever? Scientists drilling core samples all over the world are finding high levels of carbon dioxide at the times when the earth was warm. At the same time, they have found the lowest levels of carbon dioxide at times when the earth was cooler, especially during the many ice ages that have gripped this earth. So, the simplistic conclusion has always been that carbon dioxide causes the earth's warming and that it is responsible for our Global Warming trend of today. This has been hypothesized through the concept of the "Greenhouse Effect" of high levels of carbon dioxide in the earth's atmosphere.

But, think about this. The earth emerges out of every Ice Age when carbon dioxide levels are extremely low. Similarly, the earth begins to cool at times when carbon dioxide levels are often very high. Historically, all of this cooling and heating activity occurred ages before man ever started polluting the earth. So this raises the question: Is carbon dioxide a cause or a measurable symptom? Is it like a fever?

Many dissenters against the theory that Global Warming is man-made believe that our sun is the elephant in the room that almost every scientist seems to ignore. That's because they are incapable of plugging the sun's behavior into any Global Warming computer models. We just don't have any correlated historical data suggesting what a lack of solar activity will do over time. However, we do know that a lack of solar activity coincided with a sudden and relatively recent ice age called the Little Ice Age. That mini ice age gripped Europe and North America from roughly 1400 to 1800 A.D. During that time, the sun entered what is called a "Maunder Minimum," which is a period when there is little or no sunspot activity. In fact, we have had a similar lack of solar activity over the last two years (See Full Story). This fact, alone, has led some to predict that we are actually entering a period of earth cooling and not a period of global warming.

There are just too many contradictions to the concept of man-made global warming. Take, for example, this recent story about the decline of 90-degree-plus days in Chicago (See Full Story). If there were truly a trend towards global warming, Chicago should be seeing higher highs in temperature. But it isn't. Also, other planets, not inhabited by man, are currently experiencing their own global warming (See Full Story). Since hurricanes Katrina and Rita, the hurricane seasons have been relatively calm, in direct contradiction to the hypothesized predictions of some global warmists. I have covered this contradictory evidence of Global Warming in a prior blog entry titled "The Cooling Realities of Global Warming."

As I have said before, I don't deny that we "have been" in a period of global warming. The issue is whether or not we, as mere humans, are (or even could be) the most important factor in that process. All of outer space is a damned cold place. If it weren't for our sun and our relative proximity to it, this earth would be one big ice-ball. Just look at the effects of the earth's tilt and its orbit around the sun. From Winter to Summer, we can see changes in our temperature as wide as 120 degrees in a year's time. So, to me, it would be illogical to assume that a less active sun won't have some major effect on this blue planet we call home. Just my opinion.
