Predictably, right after news broke that Hurricane Joaquin had the potential to be another Sandy, liberals were quick to blame it on manmade global warming, or “climate change.” Daily Beast contributor Michael Shank, PhD, blamed Joaquin on us and our out-of-control carbon emissions:
Hurricanes love warm water and the sea surface temperatures in Joaquin’s path are the warmest ever on record.
Sea surface temperatures are merely reflecting this trend, expanding the five oceans as they absorb more heat from the atmosphere. To date, sea levels have risen roughly four feet over the past several centuries, which makes for even more ferocious storm surges when they land on shore. (And the seas aren’t stopping; they keep rising.)
But warmer air, of which we now have plenty, also loves water, holding more and more moisture as we heat up the planet. Couple this trend with melting ice sheets and glaciers and you see that hurricanes, such as Joaquin, have no shortage of precipitation.
By now, in the United States, this trend shouldn’t surprise anyone. Hurricanes Katrina and Sandy should’ve sufficiently educated everyone on extreme weather trends. What people may not have realized, however, is that the “extreme” will be getting more “normal”—that is, more frequent—and that is a terrifying prospect.
A study published this week by the Proceedings of the National Academy of Sciences says that we’ll see superstorms (on a par with Superstorm Sandy) now every 25 years instead of every 500 years.
Don’t let Joaquin do what Katrina and Sandy did and die off, years later, in the minds of many Americans. Make it motivate a nationwide emergency declaration that climate change is happening here and now. Precipitate that.
At the same time, the U.S. hasn’t been struck by a major hurricane in ten years. We’ve had some close calls, but no category 3 or stronger hurricane has made landfall since Wilma in 2005. For that reason, the past decade is referred to as a hurricane drought. The past several years have also seen record-low hurricane activity.
And no, last year was not the “hottest year on record.” Of course, it all depends on which dataset you use. And which dataset you use is determined by whether or not you already believe in catastrophic manmade global warming. If you do believe in it, then naturally you’re going to use the surface temperature stations, most of which are positioned on asphalt parking lots or in close proximity to an artificial heat-generating source like an air conditioning unit, a city building, or a transformer, in violation of the National Weather Service’s own rules on surface station siting.
If you don’t subscribe to the global warming alarmist mentality, then you’ll probably view the surface temperature data as biased, considering how most of the stations are positioned in artificial “hot zones,” rendering the data worthless. You’ll probably refer instead to satellite data, which offers a far more accurate, comprehensive, and truly global picture of the Earth’s average temperature. Despite being the most up-to-date means of temperature data collection, satellites are not the alarmists’ preferred source. Well, unless they’re using them to show the melting North Pole. Other than that, they have no use for them. Even NASA eventually admitted that it was only “38% sure” that 2014 was the hottest year on record.