I might add to this article that the world, the Democrats, and the left-leaning news media seem to feel that global warming, climate change, or whatever they wish to call it is more important than terrorism. This became obvious after we withdrew from the Paris accord.
When it
comes to the global warming debate, both alarmists and critics agree on one
thing: The earth has warmed by roughly 0.8 degrees Celsius over the past 150
years. It’s the cause of this warming, however, that remains in dispute. And
while the public is constantly bombarded with messages about the evils of
carbon dioxide emissions, there are actually compelling reasons to believe that
contemporary global warming has been driven by rising solar output, not carbon
dioxide.
To start with the basics: when the sun's radiation (chiefly visible and ultraviolet light) strikes the earth's surface, it is absorbed and re-emitted as heat in the form of infrared radiation (IR). Carbon dioxide is one of the atmospheric gases that absorb and re-radiate this heat.
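As a quick aside on why the surface re-emits in the infrared: Wien's displacement law puts a blackbody's peak emission wavelength at lambda_max = b / T, with b of roughly 2898 micrometer-kelvins. Here is a minimal sketch in Python, assuming an illustrative global-mean surface temperature of about 288 K (a figure not taken from this article):

```python
# Wien's displacement law: a blackbody's emission peaks at
# lambda_max = b / T, where b ~ 2898 micrometer-kelvins.
WIEN_B_UM_K = 2898.0

def peak_wavelength_um(temp_k: float) -> float:
    """Peak emission wavelength in micrometers for a blackbody at temp_k."""
    return WIEN_B_UM_K / temp_k

# The sun (~5800 K) peaks in visible light; the earth's surface
# (~288 K, an assumed global-mean figure) peaks deep in the infrared.
print(peak_wavelength_um(5800.0))  # ~0.5 um: visible sunlight
print(peak_wavelength_um(288.0))   # ~10 um: infrared
```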
Initially, CO2 is particularly effective at absorbing IR. In fact, at concentrations as small as 0.002 percent of the atmosphere (20 parts per million), CO2 already intercepts nearly all of the IR available in its absorption bands.
But CO2
also has a pretty surprising limitation. It very quickly starts to lose the
ability to absorb more heat. CO2's heating function declines logarithmically,
which means that it actually takes ever-doubling amounts of CO2 to keep soaking
up the same amount of heat.
Essentially,
CO2 quickly renders the atmosphere opaque to a certain spectrum band of IR. And
once that task is finished, it takes gargantuan amounts of additional CO2 to
gain even the slightest bit of additional IR absorption.
At the current concentration of 0.04 percent (400 parts per million), CO2 is essentially "saturated" throughout the atmosphere, so adding more of it cannot meaningfully contribute to further heat absorption.
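The logarithmic decline described above is commonly summarized in the climate literature by the simplified fit delta-F = 5.35 ln(C/C0) watts per square meter (Myhre et al., 1998). A minimal sketch of that published approximation shows equal additions of CO2 buying ever-smaller increments of forcing, while only full doublings buy equal ones:

```python
import math

def co2_forcing_wm2(c_ppm: float, c0_ppm: float) -> float:
    """Simplified radiative-forcing fit: dF = 5.35 * ln(C / C0), in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# A fixed 100 ppm addition yields less and less forcing each time...
for c in (100, 200, 300, 400):
    print(f"{c} -> {c + 100} ppm: {co2_forcing_wm2(c + 100, c):.2f} W/m^2")

# ...while each full doubling yields the same ~3.7 W/m^2.
for c in (100, 200, 400):
    print(f"{c} -> {2 * c} ppm: {co2_forcing_wm2(2 * c, c):.2f} W/m^2")
```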
It's a
rather stunning fact that CO2's heating function declines so rapidly. But the
general public likely has no awareness of this handicap, and climate activists
are loath to discuss it. If pressed, they'll grudgingly acknowledge CO2's
logarithmic limitations. They then gloss over the matter by proposing that
water vapor will serve as the "positive feedback" needed to greatly
magnify any future warming from CO2.
Significantly,
water vapor is the primary "greenhouse" gas of the atmosphere — and
is responsible for the overwhelming majority of heat absorption necessary to
sustain life on the planet. Proponents of Anthropogenic (man-made) Global Warming theory (AGW) hypothesize that, even though CO2 may be optically saturated at
the current concentration of 0.04 percent, any increase in CO2 will still offer
some minor increase in atmospheric heat content. They believe that this small
quantity of additional heat will lead to a corresponding increase in
atmospheric water vapor. And since water vapor is the main heat-absorbing gas
of the atmosphere, that increased water content will absorb more heat, which
will raise temperatures further, which will yield more water vapor, etc.,
creating a positive-feedback loop for additional warming.
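The feedback logic attributed to AGW proponents here amounts to a geometric series: an initial warming dT0 is amplified by a feedback factor f (the fraction of each warming increment returned as further warming via added water vapor), converging toward dT0 / (1 - f) whenever f < 1. A toy sketch, with dT0 and f chosen purely for illustration rather than taken from any model:

```python
def amplified_warming(delta_t0: float, f: float, rounds: int = 60) -> float:
    """Sum the feedback series dT0 * (1 + f + f^2 + ...) by iteration."""
    total, increment = 0.0, delta_t0
    for _ in range(rounds):
        total += increment
        increment *= f  # each round of added water vapor returns a fraction f
    return total

# Hypothetical numbers: 0.3 C of direct CO2 warming with a feedback
# factor of 0.5 converges toward 0.3 / (1 - 0.5) = 0.6 C.
print(round(amplified_warming(0.3, 0.5), 3))  # ~0.6
```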
This feedback assumption is built into every computer model that now predicts future warming. But it contains a major flaw, one that its
proponents have never actually managed to solve. Water vapor added to the
atmosphere inevitably transitions to clouds. And cumulus clouds not only
reflect solar radiation back into space but also produce rain. And rainfall not
only lowers surface temperatures but also scrubs CO2 from the atmosphere.
This cloud problem strikes at the very heart of AGW theory, and it means
that, in order to accept the proposition of man-made warming, one has to
believe that the net function of cloud formation is to warm the earth.
Climate
advocates essentially ignore this cloud problem, however. Or they hypothesize
that, as long as relative humidity remains constant, CO2 can force a positive
water vapor feedback for future warming.
Overall,
though, the fact that CO2 is now optically saturated in the atmosphere — and
that cloud formation contradicts water vapor feedback — means that AGW theory
rests on some shaky underpinnings. But if the earth has warmed by a net 0.8
degrees Celsius over the past 150 years, undoubtedly something has driven up
global temperatures. If not CO2, then what?
Solar
activity. Most people would be surprised to learn that solar output during the
20th Century increased to the highest levels in at least 2,000 years—and
possibly 4,000 years. And not only did solar activity go into overdrive during
the 20th Century, but such heightened solar output also corresponds remarkably
well with similar warm periods observed over the past few thousand years.
Researchers
can reconstruct both historical temperature and solar trends, thanks to geologic
proxies of oxygen, carbon, and beryllium isotopes. And through such studies, we
know, for example, that from 250 B.C. to 450 A.D., the earth experienced a
"Roman Warm Period" that coincided with strong solar activity. And
then, solar output declined during the "Dark Ages," leading to
several hundred years of a colder climate. In fact, at one particularly brutal
point in 829 A.D., the Nile River actually froze. However, solar activity
subsequently started to ramp up again a century later, leading to 300 years of
a globally temperate climate known as the "Medieval Warm Period,"
950-1250 A.D.
Temperatures
dropped off precipitously, though, from roughly 1300 to 1850 A.D. During
this colder era — nicknamed "The Little Ice Age" — solar activity
plummeted three times to deep minima (the Sporer, Maunder, and Dalton minima). Such lengthy periods of diminished solar output led to several episodes of a notably colder climate, with the Thames River regularly freezing over and millions of people dying of famine. Thankfully, solar activity started to climb back up in
the latter part of the 1800s, leading to the "Modern Warm Period"
that still persists today.
Climate
alarmists are quick to dismiss solar variability as the driver of contemporary
warming, though, and they note that changes in solar irradiance
("brightness") may vary by as little as a few tenths of a percent.
But this overlooks the far more significant, associated impacts of solar
activity. Not only do changes in the sun's output affect the solar wind and
solar magnetic field, but they also have significant consequences for
ionization in the troposphere and cloud formation. Dr. Roy Spencer at the University of Alabama in Huntsville has suggested that a mere 1 to 2 percent change in annual cloud cover could account for 20th-Century warming.
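For scale, both effects can be put in the same units: averaged over the globe, a fractional change x in total solar irradiance (TSI) produces a forcing of roughly TSI * x * (1 - albedo) / 4, and a relative change in cloud cover can be bounded by scaling the net cloud radiative effect. A back-of-the-envelope sketch; the TSI, albedo, and cloud-effect figures below are standard approximate values, not numbers from this article:

```python
TSI_WM2 = 1361.0             # approximate mean total solar irradiance
ALBEDO = 0.3                 # approximate planetary albedo
NET_CLOUD_EFFECT_WM2 = 20.0  # approximate magnitude of net cloud cooling

def tsi_forcing_wm2(fractional_change: float) -> float:
    """Global-mean forcing from a fractional change in solar brightness."""
    return TSI_WM2 * fractional_change * (1.0 - ALBEDO) / 4.0

# "A few tenths of a percent" of brightness, taking 0.1% as an example:
print(f"{tsi_forcing_wm2(0.001):.2f} W/m^2")          # ~0.24 W/m^2

# Spencer's 1 to 2 percent change in cloud cover, scaling the cloud effect:
for pct in (0.01, 0.02):
    print(f"{NET_CLOUD_EFFECT_WM2 * pct:.2f} W/m^2")  # ~0.2-0.4 W/m^2
```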
Ironically,
there is one way that man might have contributed to global warming, and
that's in the area of stratospheric ozone depletion. From the late 1950s
through the mid-1990s, global production and use of chlorofluorocarbons (CFCs)
led to a steady diminishment of ozone content in the stratosphere. The net
effect was to allow ever-increasing levels of UV penetration, exacerbating the
uptick in global temperatures observed in the latter part of the 20th Century.
It's noteworthy, however, that once CFC production in developed countries was phased out in 1996 (following the full implementation of the Montreal Protocol), stratospheric ozone levels began to stabilize and global temperatures subsequently leveled
out. In fact, the ensuing "pause" in global warming — where temperatures
have demonstrated a net flatlining over the past 15-20 years — offers strong
evidence that CO2 is not the prime driver of climate change, and that
variations in solar activity bear far more directly on global temperatures.
Significantly, though, stratospheric ozone concentrations still remain
diminished and are only gradually returning to more robust levels.
At
present, what's troubling is that solar activity is now beginning to drop off
dramatically. Two teams of Russian researchers believe that, if solar output
continues to decline, the global climate could once again shift toward a colder
era. They argue that a coming solar minimum could mean a 30-year cold spell
starting in the next decade. If they're correct, the world would be woefully
unprepared for the reduction in agricultural output that might ensue.
With
such humanitarian concerns on the line, and noting the plausibility of a
solar-driven climate connection, it's frustrating and disheartening to see AGW
theory credited as the sole explanation for global warming. It seems logical
that the climate community should give credence to the views of solar
advocates, since both sides are grappling with similar questions. And so,
rather than deride solar variability proponents as "deniers," climate
activists should start to examine all legitimate scientific theories — and give
proper consideration to each.
Terry M. Jarrett is an energy attorney and consultant who has served on
both the National Association of Regulatory Utility Commissioners and the Missouri
Public Service Commission.