Global Warming?

If one were really interested in a warming/cooling, then one would measure the enthalpy (heat content) of the whole “climate system”.

As it stands, the surface temperature analysis is based on the “average” temperature of each day, computed as the temperature half-way between the maximum and the minimum. That is valid only as arithmetic. The average heat content at that location for the day is not necessarily anywhere near it: a site may be at its maximum temperature for only an hour of the day, yet near its minimum for twenty.
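
A minimal sketch of the difference, using an invented 24-hour profile (the numbers are purely illustrative):

```python
# Compare the conventional (Tmax + Tmin) / 2 "daily mean" with the true
# time-weighted mean of an hourly record. The hypothetical profile below
# has a brief afternoon peak and a long, cold night.
hourly = [2, 1, 1, 0, 0, 0, 1, 2, 4, 7, 10, 13,
          15, 16, 15, 12, 9, 6, 5, 4, 3, 3, 2, 2]  # degrees C

midpoint_mean = (max(hourly) + min(hourly)) / 2   # 8.0 C
true_mean = sum(hourly) / len(hourly)             # about 5.5 C

print(f"midpoint (Tmax+Tmin)/2 : {midpoint_mean:.1f} C")
print(f"true hourly mean       : {true_mean:.1f} C")
```

The two figures differ by about 2.5 °C for this profile; which of them better represents the day’s heat content is exactly the point at issue.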

A state of the climate based on the average temperature may as well be entirely fictitious. Adding lots of them together doesn’t make it less of a fiction.

Automated weather stations record temperature and other atmospheric, insolation and soil conditions at least once every hour, sometimes once a minute. The necessary data volumes are easily managed by today’s computers, communications and storage technologies.

The heat content of the climate system isn’t just in the dry air. One has to measure moisture content and soil-/water-surface temperatures for a start. Then, for each component, calculate the enthalpy over each area from its temperature and thermal mass. That gives the “instantaneous” heat content of the measured region.
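
As a rough sketch of what that calculation might look like for the air component alone, here is the standard psychrometric approximation for the enthalpy of moist air; the station values at the end are invented, and soil and water surfaces would need their own thermal-mass terms:

```python
# Approximate specific enthalpy of moist air, in kJ per kg of dry air:
#   h ≈ c_pa * T + w * (h_fg + c_pv * T)
# with T in degrees C and w the mixing ratio (kg water vapour per kg dry air).
C_PA = 1.006    # kJ/(kg K), dry air
C_PV = 1.86     # kJ/(kg K), water vapour
H_FG = 2501.0   # kJ/kg, latent heat of vaporisation at 0 C

def moist_air_enthalpy(temp_c: float, mixing_ratio: float) -> float:
    return C_PA * temp_c + mixing_ratio * (H_FG + C_PV * temp_c)

def region_air_heat_content(temp_c: float, mixing_ratio: float,
                            dry_air_mass_kg: float) -> float:
    """Heat content (kJ) of the air over one measured region at one instant."""
    return moist_air_enthalpy(temp_c, mixing_ratio) * dry_air_mass_kg

# Two hypothetical sites at the same temperature but different humidity:
print(moist_air_enthalpy(25.0, 0.005))   # drier air:  ~37.9 kJ/kg
print(moist_air_enthalpy(25.0, 0.020))   # humid air:  ~76.1 kJ/kg
```

Two sites at the same 25 °C can carry roughly twice as much heat in one case as in the other, which is why temperature alone is a poor proxy for heat content.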

Do that for the whole globe. Then sum for the global total at that instant.
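
The aggregation step itself is then trivial; a sketch, assuming each region has already reported its own instantaneous heat content (the structure and values are invented):

```python
# Sum the instantaneous heat content of every measured region to get one
# global figure for that instant. Each entry is assumed to have been built
# up from its own components (air, soil, surface water, ...).
regions = [
    {"name": "region_A", "heat_content_kj": 4.2e15},   # hypothetical values
    {"name": "region_B", "heat_content_kj": 3.9e15},
    {"name": "region_C", "heat_content_kj": 5.1e15},
]

global_heat_content_kj = sum(r["heat_content_kj"] for r in regions)
print(f"Global heat content at this instant: {global_heat_content_kj:.3e} kJ")
```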

It’s that simple. Meticulous and rigorous, but simple.

What one then does with the enthalpy snapshots is up to the researchers … plot them against time to see how the heat content changes over long periods, for example. But condense them into buckets of arbitrary size (say, a year) and one “corrupts” the data with an implicit or explicit assumption. Such condensed results are useless for further analysis. They are only data inasmuch as they are stored numbers; their tenuous link to the real world has been broken.
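
A small illustration of why a condensed bucket cannot be taken back to the underlying record; the two records are entirely hypothetical:

```python
# Two records with identical annual means but very different physical
# behaviour; the single bucketed number cannot distinguish between them.
import statistics

steady   = [10.0] * 8760                                # constant all year
swinging = [40.0 if h % 24 == 12 else 10.0 - 30.0 / 23.0
            for h in range(8760)]                       # daily spikes, same mean

print(statistics.mean(steady))     # 10.0
print(statistics.mean(swinging))   # 10.0 (to within rounding)
```

Once the record has been collapsed to the bucket value, the information about which of the two worlds produced it is gone.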

The necessary data collection could be done if less money were spent on fantastic climate models and more on reliable measurements. One doesn’t get a statistically-valid representation of global weather if one only measures where it’s convenient. Nor if one has the habit of massaging raw data until it complies with assumptions.


3 Responses to Global Warming?

  1. Eric Meyers says:

    If only more people would consider the way so-called trends are deciphered from data, not to mention the utter lack of empirical proof that CO2 emissions are causing global warming.

  2. tchannon says:

    I agree that enthalpy is what matters, whereas there is an obsession with near-surface temperature, one which fails to take into account atmospheric thermal-mass variation, primarily to do with water content.

    My reply here is for the following reason:
    if the real diurnal or annual temperature profile is signal-processed instead of treated with the usual pocket-calculator arithmetic, there is no magically different answer. Yes, the standard method is wrong, but its main effect is to introduce artefacts, because it handles the sampled nature of a digital representation of an analogue quantity only crudely. Put differently, the amplitude is handled properly but time is not (the record should be blurred, i.e. low-pass filtered, during decimation).

    If it helps I did an experiment which gives some idea of the difference. If you click on the plot (link follows) that is what it looks like. Hourly data over a year. An accidental inclusion (didn’t realise at the time) is the late autumn temperature anomaly, more or less the coldest snap on record in the area.

    One effect is that, correctly, any point in the record is influenced by both the past and the future; this is the decimation in time at work, but it makes producing an up-to-date record very difficult. Also notice the much lower noise: at face value it could be assumed to be caused by the filtering, but it is subtler than that; it is as much about the result not containing fictitious artefacts.

    Post is far too old for comments.

    http://tallbloke.wordpress.com/2011/10/22/signal-processing-and-normal-meteorological-maths/
    (to save puzzlement: I’m a co-moderator at the Talkshop)

    • I had to drag your response away from the gnashing teeth of the anti-spam wolves. 🙂

      The errors in recording weather are of biblical proportions. Much of Australia’s “high quality” weather record is in fact low-quality. Temperatures that were supposed to be accurate to within 0.1°C are suspiciously clustered at 0.0, 0.1, 0.5 and 0.9 degrees from the whole-number values. “Decimation” is insufficient to describe the magnitude of the error.

      Whereas the clustering at 0.0 and 0.5 can be explained by imprecise readings, the 0.1 and 0.9 result from unit conversions between temperature scales.
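
      A minimal sketch of the sort of check that reveals this: tally how often each tenth of a degree appears in a record (the sample readings below are invented):

      ```python
      # Tally the first decimal digit of each reading. A genuine 0.1-degree
      # record should use all ten digits roughly equally; heavy clustering at
      # a few digits points to rounding or unit-conversion artefacts.
      from collections import Counter

      readings = [21.0, 23.5, 19.1, 22.9, 25.0, 18.5, 20.1, 24.9, 21.0, 17.5]

      tenths = Counter(round(abs(t) * 10) % 10 for t in readings)
      for digit in range(10):
          print(f".{digit}: {tenths.get(digit, 0)}")
      ```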

      See “Near Enough For a Sheep Station”
