The world is a pretty complicated place. Stuff happens and it’s the job of scientists to try to figure out “what the fuck just happened?”
That storm was either the result of complex physical interactions in the atmosphere or, according to indigenous ways of knowing, because the Wind God had lentils for supper. Not sure which of those makes the most sense.
It’s always amazing to me that the astonishing complexity and variety of natural phenomena can be captured by a few “simple” laws of physics. OK, they require some mathematical machinery, but fundamentally it’s kind of a miracle that we can write down a few squiggles and explain so much.
Or if you are an epidemiologist or a climate scientist you can put those squiggles into a computer and explain so little.
Let’s not be too harsh on epidemiologists and climate alarmists - they’re adopting a sensible methodology - they’re just not doing a very good job. In principle, we should be able to model things like infectious disease outbreaks and the weather - but the complexity of what’s going on makes that modelling job very difficult indeed. Which is why the epidemiological and climate models tend to be a bit shit.
The process, the idea, of modelling is not wrong, but it’s possible (and usual, in fact) to create a model that is “wrong”. The more complicated a phenomenon you’re investigating, the easier it is to create a model that doesn’t get it quite right.
Crudely speaking, the more interacting variables you throw into the mix, the harder it’s going to be to create a model that matches observation - and the harder it’s going to be to be able to ‘solve’ the model.
If the systems you’re trying to analyse have some feedback and nonlinearity there’s also the possibility that you’re going to have a fair degree of sensitivity to initial conditions - which makes any detailed “prediction” of your model a bit like tossing a coin.
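That sensitivity is easy to see even in a baby toy system. Here’s a quick Python sketch - nothing to do with weather, just the textbook logistic map in its chaotic regime - showing two runs that start a hair’s breadth apart and end up nowhere near each other:

```python
# Two trajectories of the logistic map x -> r*x*(1-x) in the chaotic
# regime (r = 4), started a hair's breadth apart.
r = 4.0
x, y = 0.2, 0.2 + 1e-10  # initial conditions differ in the 10th decimal place

for step in range(1, 51):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |difference| = {abs(x - y):.3e}")
```

The gap between the two runs grows roughly exponentially until it’s as big as the values themselves - which is why “detailed prediction” of a chaotic system is a mug’s game.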
The thing is, you don’t have to go very far to get your scientific knickers in a twist when it comes to more complicated systems. Earth orbiting the sun? No problem - here’s Newton’s Law of Gravitation and Bob’s your uncle. Add in the moon? Ah - now you have what’s known as the Three Body Problem and you’re a bit screwed.
Just going from 2 gravitationally interacting bodies to 3 propels the problem into a whole new class of difficulty. In fact, for most initial conditions the resulting three-body system is chaotic.
We can write down the equations that (we think) principally govern the dynamics - we just can’t properly solve the bloody things for anything more than 2 things interacting. In order to make much progress at all you have to solve these things computationally. Which, of course, brings a different set of (technical) problems to do with grid sizes and truncation errors and other lovely stuff like that.
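To give a flavour of what I mean by truncation errors: here’s a toy sketch in Python, with a simple harmonic oscillator standing in for anything grander. The crude forward-Euler scheme misses the exact answer by an amount that shrinks as you shrink the step size - and every real computational model has to make this kind of trade-off:

```python
import math

def euler_oscillator(dt, t_end=10.0):
    """Forward-Euler integration of x'' = -x, with x(0)=1, v(0)=0.
    The exact solution is x(t) = cos(t)."""
    x, v = 1.0, 0.0
    for _ in range(int(round(t_end / dt))):
        x, v = x + dt * v, v - dt * x
    return x

exact = math.cos(10.0)
for dt in (0.1, 0.01, 0.001):
    print(f"dt={dt}: error = {abs(euler_oscillator(dt) - exact):.4f}")
```

Smaller steps mean smaller errors but vastly more computation - now scale that dilemma up to a grid covering an entire planet.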
Despite these issues we still managed to get a flag on the moon - or into a Hollywood studio if you’re of a more conspiratorial persuasion.
We humans are, by and large, not very good with complexity. We like simple answers to everything. Despite ‘the economy’ being a hugely complicated thing with lots of variables and interactions, we like to pretend that tweaking interest rates up or down a bit is the answer to, well, pretty much everything.
The complexity, the struggle to understand what’s going on, is what makes science such fun - for me at any rate. I’ve been fortunate to have been able to avoid doing any computational work - although career-wise it might have been a more prudent thing. I’m basically crap at anything but squiggles - and I’m not too hot with those, either.
So, how do we proceed to “get a handle” on complicated stuff?
I’ve been talking about stuff that has complex interactions but even systems which have more (hypothetically) straightforward interactions can be challenging. If you imagine a container of gas then there’s going to be an awful lot of gas molecules in it for any reasonably-sized container.
In a cubic metre of air there are around 10,000,000,000,000,000,000,000,000 molecules. That’s more than the estimated number of stars in the observable universe.
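If you want to check that figure for yourself, the ideal gas law does it in a few lines - a sketch, assuming roughly sea-level pressure and room temperature:

```python
# Rough count of molecules in a cubic metre of air via the ideal gas law,
# N = P*V / (k_B * T). Sea-level-ish values assumed.
P = 101_325         # pressure, Pa
V = 1.0             # volume, m^3
T = 293.0           # temperature, K (about 20 degrees C)
k_B = 1.380649e-23  # Boltzmann constant, J/K

N = P * V / (k_B * T)
print(f"{N:.2e} molecules")  # order 10^25
```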
If you think of these molecules as little golf balls whizzing around and bouncing off one another then, in principle, you could write down the dynamical equations (for each one) that would allow you to work out where each one was going to be in the next second, provided you knew where each one was to start with (and where it was going and how fast).
That’s a lot of paper. Not to mention rather a lot of equations we need to solve. The only realistic way to try to figure out the properties is to use a statistical approach. We use things like temperature and pressure, which are essentially statistical aggregates, in order to get some handle on things.
Temperature is basically an average of the kinetic energies of the individual molecules; some are going to be faster and some slower.
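Here’s a little sketch of that idea, assuming an ideal gas of nitrogen-ish molecules at 300 K. Each velocity component gets drawn from a normal distribution with variance k·T/m, and the average kinetic energy duly lands near (3/2)·k·T:

```python
import math
import random

# Sample molecular velocities and check that the average kinetic energy
# comes out near (3/2)*k_B*T, as the kinetic theory of gases says.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature, K
m = 4.65e-26         # mass of (roughly) one N2 molecule, kg

random.seed(1)
sigma = math.sqrt(k_B * T / m)  # std dev of each velocity component
n = 100_000
mean_ke = sum(
    0.5 * m * (random.gauss(0, sigma) ** 2
               + random.gauss(0, sigma) ** 2
               + random.gauss(0, sigma) ** 2)
    for _ in range(n)
) / n

print(f"sampled mean KE: {mean_ke:.3e} J")
print(f"(3/2) k_B T    : {1.5 * k_B * T:.3e} J")
```

Some molecules are much faster, some much slower - the “temperature” is just a summary of the whole spread.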
This kind of approach works well for the properties of (ideal) gases in containers.
But what the hell is the “temperature” for the atmosphere of an entire planet?
Well, OK, how ‘much’ of the stuff is there?
A quick google search (because I’m too lazy to work it out for myself) yields a figure of the order of 4 billion cubic kilometres for the Earth’s atmosphere. Think of a cube 1 km on each side - and then imagine 4 billion of them.
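For the sceptical, that figure roughly checks out on the back of an envelope: Earth’s surface area times a nominal ~8 km “scale height” for the bulk of the atmosphere (my assumption, not Google’s):

```python
import math

# Back-of-envelope check on the "4 billion cubic kilometres" figure:
# sphere surface area times a rough 8 km scale height.
R = 6371.0  # Earth's mean radius, km
H = 8.0     # nominal scale height, km

volume = 4 * math.pi * R**2 * H
print(f"~{volume:.1e} km^3")  # comes out around 4e9
```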
We could try to assign a ‘temperature’ to each of these cubes in our ‘model’ - and we’d still have to deal with 4 billion of them. Have you ever tried to input 4 billion data points into a computer model? You haven’t? Me neither.
Then we have time. The temperature is going to change with time - so even if we could accurately assign a single temperature to each of these cubic kilometres they’d change and interact with one another and mix as time progressed and we’d end up with a right old mess in short order.
This is before we even start to add in the various things like cloud cover, water vapour, radiative transfer and absorption, surface temperatures, ocean currents, sun cycles etc. You can see the scale of the problem.
But the climate alarmists will give you a single number for the entire planet - the global temperature anomaly. That’s one hell of an average.
The average, most often calculated as the arithmetic mean of a set of numerical quantities, is a really crude measure in a lot of cases. It tells you very little on its own. Yet it gets used an awful lot.
It’s a sensible thing to know - and it can tell you something - but it really should come with a humongous warning label: interpret with caution.
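Here’s a throwaway sketch of what I mean - three invented data sets with exactly the same mean telling three completely different stories:

```python
import statistics

# Three made-up "temperature anomaly" samples with identical arithmetic
# means but very different behaviour - the mean alone can't tell them apart.
steady    = [1.0, 1.0, 1.0, 1.0, 1.0]
swinging  = [-4.0, 6.0, -4.0, 6.0, 1.0]
one_spike = [0.0, 0.0, 0.0, 0.0, 5.0]

for name, data in [("steady", steady), ("swinging", swinging),
                   ("one_spike", one_spike)]:
    print(f"{name:9s} mean={statistics.mean(data):.1f} "
          f"stdev={statistics.stdev(data):.2f}")
```

All three report a mean of 1.0; you need at least the spread (and really the whole shape of the data) to know what’s actually going on.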
In a recent stack, John Dee pointed out something I wasn’t aware of. It sounds paradoxical at first. It’s possible to have temperatures declining with the average temperature increasing. That needs a little explanation.
Let’s suppose we have our temperature measuring device doing its thing, day in and day out. It dutifully records the temperature at the end of each hour. You’d have 24 data points every day.
What do I mean by “temperatures declining”? All of them? No. I mean the hottest temperature recorded in a day. If you measured the hottest temperature in July, say, over 50 years and found it decreased over that time, you’d want to say that things were getting a bit cooler wouldn’t you?
But here’s the thing. Suppose the minimum night time temperature increased over some time period - and it increased by more than the hottest daytime temperature decreased. What you’d find is that the average temperature (over each day) would increase!
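A toy version of the arithmetic, with numbers invented purely for illustration:

```python
# Toy illustration of the point: daytime maxima FALL by 1 degree,
# night-time minima RISE by 2 degrees, and the daily mean still goes UP.
then_max, then_min = 30.0, 10.0  # hypothetical past July day, degrees C
now_max, now_min = 29.0, 12.0    # hottest part of the day is now cooler...

then_mean = (then_max + then_min) / 2
now_mean = (now_max + now_min) / 2

print(f"max: {then_max} -> {now_max} (down)")
print(f"min: {then_min} -> {now_min} (up)")
print(f"daily mean: {then_mean} -> {now_mean} (up!)")
```

“Hotter on average” and “hotter at the hottest point of the day” are simply not the same claim.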
It appears that something like this is happening in some places - and the likely explanation is the urban heat island effect. Stuff warms up during the day and helps to raise the night-time temperatures as it releases that stored heat overnight.
So when someone tells you the “temperature” is increasing I think the first thing we need to ask is what, precisely, do they mean by temperature?
And even if we’re talking about the hottest value recorded and not just some average, what does it mean to say that in month X in 2023 the hottest temperatures were recorded since records began? Everywhere? By how much?
The really important question is whether this is surprising or not. Stuff like temperature fluctuates. What does it actually mean to say that the temperature in Little Crisping on one day in June 2023 was 0.2 degrees higher than the last record for a June day (which was probably in 1976)?
So what?
Maybe some places recorded their highest ever temperatures this year in June, but most will not have done so. It would be interesting to know what percentage of temperature measuring stations across the world recorded their “highest ever” values. That’s something that won’t get reported on because it won’t be a scary number.
This is the thing with quantities that fluctuate. They, erm, fluctuate. Sometimes they’re going to break a record - this is entirely possible and, given enough time, inevitable. What we need to know is whether that record breaking is expected in the normal run of things - and that’s a really tricky thing to pin down properly.
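There is one classic result worth knowing here. If the yearly peaks were independent draws from a fixed, unchanging distribution, the expected number of record-breaking years in n years comes out as 1 + 1/2 + 1/3 + … + 1/n, which is roughly ln(n) - records are routine even with no trend at all. A quick simulation sketch (trendless by construction) bears this out:

```python
import random

# In a trendless world, how many "record-breaking" years show up in a
# century? Theory says about 1 + 1/2 + ... + 1/100, roughly ln(100).
random.seed(42)
n_years, n_trials = 100, 2000

total_records = 0
for _ in range(n_trials):
    best, records = float("-inf"), 0
    for _ in range(n_years):
        temp = random.gauss(0, 1)  # independent draws, zero trend
        if temp > best:
            best, records = temp, records + 1
    total_records += records

simulated = total_records / n_trials
theory = sum(1 / k for k in range(1, n_years + 1))
print(f"simulated average records in {n_years} years: {simulated:.2f}")
print(f"harmonic-sum prediction:                     {theory:.2f}")
```

Around five records in a hundred years, in a world with no warming whatsoever. So a record on its own tells you very little; it’s the *rate* of record-breaking, compared against this baseline, that might tell you something.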
You’ll probably have heard terms like “statistical significance”, confidence intervals, and p-values. They, like averages, also need to come with a big flashing neon warning label: interpret with caution. These are technical devices that purport to assess whether some result is ‘real’ and not just ‘normal’, expected fluctuation. I’m not going to go into why they need to be treated with a great deal of caution - maybe I’ll write about that in the future - but be aware that, like pretty much everything in statistics, they get horribly misused and abused.
Now I happen to think that the Earth is warming a bit. The single number of the global temperature anomaly is a pretty piss-poor way of showing that, but it keeps the journos salivating.
As I’ve stated before, I’m much, much less confident that man’s activities are entirely (or even mostly) responsible for this warming. Downright sceptical would be a more accurate description, I feel.
When it comes to the proposed “solutions” to this alleged ‘crisis’ - well, suffice it to say that the words bloody bonkers don’t quite capture my feelings on the matter.
But if we’re going to be spending trillions of dollars on some godawful crappy “green” technology, I think we need to be just a trifle more certain of what we’re doing. Endlessly parroting some decidedly average average like the global temperature anomaly just doesn’t do it for me at all.
CO2 is currently 0.04 percent of the atmosphere, and human activity accounts (apparently) for 3 percent of all CO2. So 3 percent of 0.04 percent is the number. If we just vanished from the face of the earth tomorrow, in theory 97 percent of atmospheric CO2 would still be there. Another thing is that CO2 is probably not capable of retaining heat in the atmosphere. Water vapour is the main “greenhouse gas” - 95% of it, in fact. However, all of the data on so-called greenhouse gas effects is most likely spurious.

The sun’s cycles are almost entirely the cause of the major changes, with smaller trends due to things like large volcanic eruptions pushing particulates very high up and causing one- or two-year temperature drops across regions in the shadow. The Medieval Warm Period was created by a more active period of the sun, and the Little Ice Age by a less active one - neither was due to the industrial output of man, because they were pre-industrial times.

We are headed into the already-named “Eddy” minimum (a Grand Solar Minimum). To see what happens during these minimums, see the “Maunder” minimum. During these minimums empires have collapsed, including most of the Chinese dynasties.
I think, with my Pooh-like brain, that playing games involving both chance (dice) /and/ human interaction and agency is what inoculated me against the way models get used as a stand-in for simply reading the entrails of some sacrifice or other.
This may sound a little obscure to non-war/strategy gamers (before anyone feels like sneering and scoffing: wargaming is, simply put, using an easily accessible version of game theory, among other things, to have fun fighting the best kind of war - fictional ones) but bear with me.
So, you pick a scenario, conditions for loss/victory/draw, et cetera. Then you and your mate(s) pick your forces and line them up (and there’s obviously oodles of details I’m omitting - the writing equivalent of Prof. Rigger’s lesson on the dangers of averages above), and roll off for who goes first.
So far so easy, yes?
Well, turns out you've done the math in your head, yeah? So your squad of heavy machine gunners covering a mortar detail, backed up by a Churchill, is guarding a bridge and should be able to hold off your mate's force of pioneers, grenadiers and a PzIII, right?
Wrong. Because he bought an air-drop instead of the Spähwagen you expected him to take, so now you have Fallschirmjäger in your back-field and your battle plan, hinging on the average probabilities of your troops cutting his down, is out the window.
And woe betide you if you're playing a game with randomised factors like weather (many older ship-of-the-line games) or a Gamesmaster umpiring and running a third side (like the police in a gang-war game).
Imagine keeping track of at least 20 different types of units, each able to have 10 or more variables of equipment and deployment, on at least 12 different types of maps for 20+ scenarios, all based on six-sided dice and each respective unit's probabilities for killing others/surviving being attacked - that's wargaming. People who do that for a living or as a hobby for decades?
Imagine how skilled they become at immediately finding loopholes and breaking points in any system, especially since they also play chess, Go, Hnefatafl, or any other fixed game system to learn how to optimise when there's no randomising factor.