Part of the reluctance of many people, myself included, to jump on the anthropogenic (man-caused) global warming bandwagon is that it's not at all clear that the science on this matter is as settled as we've been told it is. A recent article in Science Daily underscores the problem:
Planet Earth has warmed much less than expected during the industrial era based on current best estimates of Earth's "climate sensitivity" -- the amount of global temperature increase expected in response to a given rise in atmospheric concentrations of carbon dioxide (CO2). In a study to be published in the Journal of Climate, a publication of the American Meteorological Society, Stephen Schwartz, of Brookhaven National Laboratory, and colleagues examine the reasons for this discrepancy.
According to current best estimates of climate sensitivity, the amount of CO2 and other heat-trapping gases added to Earth's atmosphere since humanity began burning fossil fuels on a significant scale during the industrial period would be expected to result in a mean global temperature rise of 3.8°F -- well more than the 1.4°F increase that has been observed for this time span.
In other words, even though we're pumping CO2 into the atmosphere at historically high levels, the earth's temperature doesn't seem to be increasing nearly as much as current models predict it should. Schwartz's analysis attributes this discrepancy to a possible mix of two major factors:
1) Earth's climate may be less sensitive to rising greenhouse gases than currently assumed and/or 2) reflection of sunlight by haze particles in the atmosphere may be offsetting some of the expected warming.
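To make the arithmetic concrete, here is a minimal back-of-envelope sketch of how an "expected warming" figure like the 3.8°F above is derived, and how either of the two factors cited would shrink it. The concentration, sensitivity, and haze-offset values are illustrative assumptions on my part (common textbook approximations), not numbers taken from Schwartz's paper, and the calculation considers CO2 only:

```python
import math

# Back-of-envelope sketch of the "climate sensitivity" arithmetic described in
# the article. All numbers below are illustrative assumptions, not figures
# from Schwartz's paper.

F_2XCO2 = 3.7  # approximate radiative forcing of a CO2 doubling, W/m^2

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Forcing from a CO2 rise above a pre-industrial baseline, using the
    common logarithmic approximation dF = 5.35 * ln(C/C0) W/m^2.
    (CO2 only; other heat-trapping gases would add further forcing.)"""
    return 5.35 * math.log(c_ppm / c0_ppm)

def expected_warming_F(net_forcing, sensitivity_per_doubling_C):
    """Warming in deg F implied by a net forcing (W/m^2) and an assumed
    climate sensitivity (deg C of warming per CO2 doubling)."""
    warming_C = sensitivity_per_doubling_C * net_forcing / F_2XCO2
    return warming_C * 9.0 / 5.0

ghg = co2_forcing(385.0)   # ~385 ppm, roughly the concentration at the time of the study
haze = -1.0                # hypothetical aerosol ("haze") reflection, W/m^2 (negative = cooling)

# Higher assumed sensitivity, no haze offset: a large expected warming.
print(round(expected_warming_F(ghg, 3.0), 1))
# Lower assumed sensitivity plus a haze offset: an expectation much closer to
# the modest warming actually observed.
print(round(expected_warming_F(ghg + haze, 2.0), 1))
```

The point of the sketch is simply that the "expected" number is not a direct measurement; it depends on an assumed sensitivity and on how much offsetting reflection you credit to haze, which is why adjusting either assumption changes the answer so much.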
There could be other factors at play as well. The point is that there still seem to be significant uncertainties in our understanding of how the earth's climate responds to added CO2. We should thus be very cautious about imposing huge costs on industries and energy consumers to disincentivize their use of energy until we have a more thorough grasp of those dynamics.
RLC