Readers can actually estimate the impact of global warming on hurricanes for themselves. Here we use kitchen measure rather than metric.

In his New York Times column of Sept. 12, “Irma, and the Rise of Extreme Rain,” David Leonhardt published a graph of global yearly average surface temperature from 1905 to today. These averages were taken from actual measurements around the world, collected by scientists who often spent lifetimes on the work (Dr. Charles Keeling, for example, measured atmospheric carbon dioxide and related data at the Mauna Loa Observatory for decades).

Converting Leonhardt’s graph data to Fahrenheit shows that the surface temperature has increased just a little over 2-degF in the hundred years between 1917 and 2017. The graph is bumpy, but none of the bumps or dips strays far from a line moving gradually upward.
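Since the Times graph is plotted in Celsius, the conversion is worth spelling out: a temperature *difference* converts to Fahrenheit by the factor 9/5 alone, because the 32-degree offset cancels. A quick sketch in Python, assuming a reading of about 1.2-degC of warming from the graph (that reading is our assumption, not a figure from the column):

```python
# Converting a temperature CHANGE from Celsius to Fahrenheit:
# the 32-degree offset cancels, leaving only the 9/5 scale factor.
def delta_c_to_degf(delta_c):
    return delta_c * 9.0 / 5.0

# Assumed reading from the graph: about 1.2-degC of warming since 1917.
warming_degf = delta_c_to_degf(1.2)
print(round(warming_degf, 2))  # 2.16 -- "just a little over 2-degF"
```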

Now, as all of us who watched MSNBC or public television learned during this rather impressive 2017 hurricane season, a hurricane gains its energy from heat transferred out of the water over which it travels. This is a dynamic and complex phenomenon, but the weather folks, using supercomputers, can model the process well enough to predict the path, the strength and the behavior of a hurricane with reasonable accuracy.

Yes, slight corrections had to be made as the various hurricanes proceeded this August and September across the Gulf and Atlantic to landfall, but the predictions were startlingly accurate. Here we will not attempt anything that complicated. We will simply ask one question: for a 150-mile-diameter hurricane, how much difference does an extra 2-degF of water temperature make in its energy?

A water heater measures heat in British Thermal Units (BTU). One BTU is the energy required to raise 1-pound of water by 1-degF. If you have a gas water heater, your bill comes in dollars per therm, a therm being the amount of gas needed to produce 100,000-BTU. If you have an electric water heater, your bill is the cost of the kilowatt-hours needed to do the same work. Here we consider only the extra energy that 2-degF warmer water adds to the hurricane, which comes to 2-BTU for every pound of water at the surface of the ocean.
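For readers who want to relate the two bills, a small sketch of the unit bookkeeping (the constants are standard: a therm is 100,000-BTU and a kilowatt-hour is about 3,412-BTU):

```python
BTU_PER_THERM = 100_000   # the gas-billing unit: one therm is 100,000 BTU
BTU_PER_KWH = 3412.14     # one kilowatt-hour expressed in BTU

# The same water-heating work, priced two ways: one therm of gas
# delivers about as much heat as 29.3 kilowatt-hours of electricity.
kwh_per_therm = BTU_PER_THERM / BTU_PER_KWH
print(round(kwh_per_therm, 1))  # 29.3
```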

When scientists make simplifying assumptions, they simplify in the direction that works against the result they want to show. We know that the ocean also adds heat from below the surface layer, but that requires complicated heat-transfer equations and a powerful computer, so we will consider only the surface transfer of heat. Now a pound of water is, the world around, a pint of water, and the volume of a pint is 0.0167-cubic feet. So if we consider the top 0.0167-ft (about a fifth of an inch) of ocean water, 1-square-foot in area, as the source of heat energy for the hurricane, we can roughly estimate the extra energy gained from every square foot of today’s 2-degF warmer water as 2-BTU per square foot of hurricane. For a small 150-mile-diameter hurricane, that comes to around 1,000,000,000,000-BTU, a trillion BTU.
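The per-square-foot arithmetic can be checked directly. A rough sketch, assuming a perfectly circular storm of 150-mile diameter and 2-BTU of extra heat per square foot:

```python
import math

FT_PER_MILE = 5280.0
BTU_PER_SQFT = 2.0               # one pint (one pound) per square foot, warmed 2-degF

radius_ft = 75.0 * FT_PER_MILE   # a 150-mile diameter means a 75-mile radius
area_sqft = math.pi * radius_ft ** 2

extra_btu = BTU_PER_SQFT * area_sqft
print(f"{extra_btu:.2e}")        # 9.85e+11 -- roughly a trillion BTU
```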

That is a big number, but energy can be expressed in all sorts of ways. If you look around on the Internet, you can find a conversion from energy in BTU to energy in kilotons of TNT explosion. I ran the conversion and came up with about 250-kilotons. The Hiroshima bomb, called “Little Boy,” was estimated at 15-kilotons of TNT. So the extra energy gained from 2-degF warmer water by a 150-mile hurricane is roughly the equivalent of sixteen atom bombs.
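The conversion rests on two standard constants: one BTU is about 1,055 joules, and one kiloton of TNT is defined as 4.184 trillion joules. A short sketch (the helper function is ours, for illustration):

```python
JOULES_PER_BTU = 1055.06        # one BTU expressed in joules
JOULES_PER_KILOTON = 4.184e12   # one kiloton of TNT, by definition

# One kiloton of TNT expressed in BTU
btu_per_kiloton = JOULES_PER_KILOTON / JOULES_PER_BTU
print(f"{btu_per_kiloton:.2e}")  # 3.97e+09

def btu_to_kilotons(btu):
    """Convert heat energy in BTU to its TNT equivalent in kilotons."""
    return btu / btu_per_kiloton
```

Dividing the storm’s extra BTU by this figure, and then by Little Boy’s estimated 15-kilotons, gives the bomb-equivalent count.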

Of course, hurricanes do not stay in one place. They move slowly across the water. So if we stay with our simple model, every 150 miles a 150-mile-diameter hurricane moves adds roughly another sixteen atom bombs of destructive power.

*This article first appeared in the October 11, 2017 issue of the Rossmoor News. Author Wayne Lanier can be emailed at wlanier@pac-bell.net.*