From Malthus to Boltzmann Economics


Historians credit Thomas Malthus as the first modern economist. Although a clergyman by training, Malthus combined original thinking, wide observation, and statistical reasoning to confront changes in society and nature. He observed that farmers increased output only gradually, reasoned that human population grew exponentially, and noted that industry was expanding across the landscape. Agriculture lost land and gained surplus slowly, yet had to feed many more people. Malthus predicted famine, sooner or later, as exponential population growth outpaced incremental agricultural development.

This 18th-century perspective became a touchstone for future economists. Adam Smith's logic foresaw trade as the safety valve, with different places providing superior specific outputs, whether land, goods, or services; their networked flows produced the kind of exponential surplus that might overcome Malthusian catastrophe. Marx supposed the opposite, with monopolists cornering surplus and producing Malthus's scenario with a vengeance. Later economists proposed a resolution: while shortages would spike prices and create famine, they would also generate extraordinary profits and pull investment into agriculture, producing more food through intensive or extensive change.

But the 20th century continued the Malthusian anxiety, for good reason. Populations were reaching previously unimaginable figures. Economists at the Ford and Rockefeller foundations pushed technologists to develop new crops, fertilizers, and irrigation systems. These institutions were not willing to let famine prices trigger agricultural change; they assumed famines were otherwise inevitable, and that prospect was itself reason enough to act.

Malthusian archetypes, population-driven degradation of natural resources, underlie much modern environmentalism. From Ehrlich's Population Bomb to China's one-child policy, Malthusian ideology was challenging the world by the 1970s. Although in its strict environmental character it concerned famine and biosphere degradation, economists used it to predict poverty. Money, of course, has long been economists' scarce resource.

Malthus's insights remain important, but in the current world they are often supplanted by Boltzmann's statistical theory of thermodynamics and its 20th-century derivations in complexity and information theory. The second law of thermodynamics posits that energy moves toward random disorder, which entropy measures. The universe expanded violently, creating localized order from chaos, which spilled disorder abroad. That order evolved into more refined complexity, which spewed even more disorder. The process is both micro and macro, with orderly equilibria holding up until energy inputs falter or externalized disorder collapses inward.
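
Boltzmann's own formulation of that claim is the standard statistical definition of entropy; the notation here is mine, not the essay's:

S = k · ln W

where W counts the microscopic arrangements consistent with a macroscopic state and k is Boltzmann's constant. The second law says the total S of an isolated system never decreases, which is the arrow behind order spilling its disorder abroad.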

Shannon inverted the entropy equations to explain information. Information is unusual order, because it is unexpected. Random signals have maximum entropy; the probability that signals encode specific meaning measures an increased level of non-randomness, which is information. The correlation of the unexpected with information means entropy is boring. Information equilibria remain culturally determined, subjective, and imprecise, unlike the eddies of negentropy in chemical, biological, or physical equilibrium orders.
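
Shannon's measure has the same mathematical shape as Boltzmann's; again the notation is mine, not the essay's:

H = − Σ p(x) log₂ p(x)

summed over the possible signals x. H is largest when every signal is equally likely (pure randomness) and shrinks as some signals become predictable; the information delivered by an observation is the amount by which it reduces H.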

These two ideas, entropy and information, reflect contemporary human activities, economic and environmental, at least as well as the resource-constrained Malthusian view.

Unnoticed by many, population has ceased to be the siren call of catastrophic futurists. Only 15 years ago, the mainstream pilloried the economist Julian Simon, who claimed population increase was good for economic development. Today that is an accepted truism. While environmentalists often disagree with a demand-centered model of social change, they find themselves most often battling supply-side economists. This is not merely an anachronistic theoretical debate: environmentalists stake their policies on popular democratic efforts that get stymied by resource-focused interest groups.

Whether population increase is good or bad depends on many things. But the large issues facing humanity may get better analyzed in other ways. To illustrate, the following uses sport utility vehicles (SUVs) as a symbol of entropy/information economics.

Entropy is held back with new energy inputs. Most of the entropy from energy inputs to human activities comes from friction. In a rough, Newtonian way, entropy = friction.

When a car drives on a road, energy overcomes friction. Air resistance, tires on the road, and moving engine parts all cause friction. Friction increases entropy: heat, less usable energy, CO2.

Starting a vehicle from rest takes energy to overcome tire friction; once the vehicle is moving, it takes less energy to hold a constant speed. Friction determines the energy needed to keep a steady velocity.

Friction ≈ Vehicle Weight / Applied Energy

Because friction is a molecular force determined by specific material properties that behave discontinuously as relative velocities change, the actual friction of two objects must be empirically specified.

The more friction, the more entropy created. The heavier your car, the more entropy created.
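
A rough sketch of that proportionality in Python. The rolling-resistance and drag coefficients, and the two vehicle masses, are illustrative assumptions rather than figures from the text; the point is only that the weight-dependent term scales the energy lost to heat:

```python
# Rough sketch: energy dissipated against friction at constant speed.
# All of this energy ends up as low-grade heat, i.e. entropy.
# Coefficients and masses below are illustrative assumptions.

G = 9.81          # gravity, m/s^2
C_RR = 0.012      # rolling-resistance coefficient (assumed)
RHO_AIR = 1.2     # air density, kg/m^3
CD_AREA = 0.8     # drag coefficient x frontal area, m^2 (assumed)

def friction_energy(mass_kg, speed_ms, distance_m):
    """Joules spent overcoming rolling resistance and air drag."""
    rolling_force = C_RR * mass_kg * G                  # scales with weight
    drag_force = 0.5 * RHO_AIR * CD_AREA * speed_ms**2  # scales with speed, not weight
    return (rolling_force + drag_force) * distance_m

sedan = friction_energy(1500, 27, 1000)   # ~1.5-tonne sedan, ~60 mph, 1 km
suv   = friction_energy(2500, 27, 1000)   # ~2.5-tonne SUV, same speed and distance
print(f"sedan: {sedan/1e3:.0f} kJ, SUV: {suv/1e3:.0f} kJ dissipated per km")
```

On these assumed numbers, the heavier vehicle sheds about a fifth more energy as heat over the same kilometer, which is the entropy penalty described above.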

If a car were powered by a gasoline engine without internal friction or heat loss, it might get 100 MPG. Hybrid vehicles capture braking energy and reuse it for acceleration. Entropy is avoided because that energy is stored rather than dissipated as heat in the brakes.
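
A minimal sketch of that braking arithmetic, with the 60 percent recovery figure as an assumption for illustration:

```python
# Kinetic energy shed when stopping either becomes brake heat (entropy)
# or, in a hybrid, is partly returned to the battery.
# The 60% recovery efficiency is an assumed, illustrative figure.

def braking_energy(mass_kg, speed_ms):
    """Kinetic energy (joules) the vehicle must shed to stop."""
    return 0.5 * mass_kg * speed_ms**2

def recovered(mass_kg, speed_ms, regen_efficiency=0.6):
    """Energy a hybrid might store instead of dumping as heat."""
    return regen_efficiency * braking_energy(mass_kg, speed_ms)

ke = braking_energy(1800, 14)   # ~1.8-tonne vehicle stopping from ~50 km/h
print(f"{ke/1e3:.0f} kJ to shed; a hybrid recovers ~{recovered(1800, 14)/1e3:.0f} kJ")
```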

Which brings us to the other side of entropy. If we can minimize climate entropy by driving a lighter vehicle, then why don't we? Because a lighter vehicle doesn't provide as much information to make us feel safe and secure.

The success of the minivan and SUV began with consumer demand for secure family vehicles. But consumer choice is a proactive affair. Even if rising gas prices push the market away from SUVs, buyers will look for something else that meets the same needs.

The driver's perspective rules. It is not scientific data that makes SUV drivers feel safe and secure; it is an elevated position that feels impervious. Being higher means seeing farther, a secure advantage, and sitting straighter helps keep awareness sharp.

For more than 20 years, minivans and SUVs have been the fastest-growing passenger-vehicle segment in America. That success is not a random event. The number-one reason is that these vehicles provide the experience of being safe and secure. Whether the buyer wants suburban fashion or off-road rigor, SUVs stand out (in three dimensions) as the most secure in consumer perception, regardless of safety data. Different styles may pull or haul or seat more, let the driver look farther down or farther out, or add more bling to the burbs, but these variations parse a market founded on security and safety, the antidotes to fear.

Most Americans, 55% of those surveyed in a TIME/CNN poll, believe SUVs are safer than cars because of their sheer size. GM research says that among the top reasons consumers give for picking a full-size SUV are that it seats more people and that its size makes them feel safe. An Associated Press poll in 2003 found that Americans think SUVs are safer than other vehicles for the people driving or riding in them.

The sense of safety is the result of information. The SUV reduces uncertainty because the driver can see farther, and because the people in the car perceive its bulk. That extra awareness is information, and that information produces security.

Communication theory models information as the mirror image of entropy. If entropy and information meet at an inflection point, it’s where SUVs reduce uncertainty with information, and increase entropy with friction.
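
To make "reducing uncertainty" concrete, here is a small illustrative calculation. The probabilities are invented for the example; only the Shannon formula is standard:

```python
import math

def shannon_entropy(probs):
    """Uncertainty, in bits, of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Invented example: a driver's belief about what the traffic ahead will do.
low_visibility  = [0.25, 0.25, 0.25, 0.25]  # low seat: four outcomes equally likely
high_visibility = [0.85, 0.05, 0.05, 0.05]  # elevated view: one outcome dominates

gain = shannon_entropy(low_visibility) - shannon_entropy(high_visibility)
print(f"information gained from the higher vantage point: {gain:.2f} bits")
```

On these made-up numbers the elevated seat is worth a bit over one bit of reduced uncertainty, while the taller, heavier body that delivers it pays for those bits in friction and fuel: the inflection point described above.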

Amory Lovins, the energy efficiency leader and head of the Rocky Mountain Institute, discussed the entropy/information derivative in a recent TED talk, titled “Winning the oil endgame”, part of which I quote:

“In a standard sedan, of all the fuel energy you feed into the car, seven-eighths never gets to the wheels, it’s lost first in the engine, idling at zero miles a gallon, the power train and accessories. So then of the energy that does get to the wheels, only an eighth of it, half of that, goes to heat the tires on the road, or to heat the air the car pushes aside. And only this little bit, only six percent actually ends up accelerating the car and then heating the brakes when you stop. Moreover, three-fourths of the fuel use is caused by the weight of the car. So there’s huge leverage for making the car a lot lighter.

The reason this has not been very seriously examined before is there was a common assumption in the industry that it might not be safe if you got whacked by a heavy car, and it would cost a lot more to make, because the only way we know how to make cars much lighter was to use expensive light metals like aluminum and magnesium. But these objections are now vanishing through advances in materials.

We could make cars bigger, which is protective, but make them light.”
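
The fractions in the quote work out simply: 1 − 7/8 = 1/8, or about 12.5 percent, of the fuel energy reaches the wheels; roughly half of that, about 6 percent, actually accelerates the car; and, per Lovins, about three-fourths of total fuel use tracks the car's weight, which is where the leverage lies.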

Lovins uses Boltzmann's language, discussing energy lost to heat, and touches base with Shannon when he points out that lighter cars appear dangerous. His final sentence reads like a supply/demand graph, with safety and weight as the two quantities on offer. Since safety is in the eye of the beholder, it comes from information; since weight is an issue because of gravity, it generates entropy.

Lovins doesn't use an entropy/information economic phase space, however, or at least not in this presentation. If he did, the discussion might expand from safety and vehicle size to the other factors that convincingly inform a driver that she is safe.

Carbon fiber costs around $8 a pound, steel about $1. Because the information input, the driver's basis for feeling safe, combines psychology and physics, an optimal economic solution may not be simply better materials. Drivers' sense of safety involves self-efficacy and a sense of control. They measure security, in information-theory terms, by how much they can reduce uncertainty.

As entropy requires a frame of reference for measurement, so does information. But where friction creates entropy, information reduces uncertainty. A vehicle that provides exceptional control, through automated response to its own sensors and external data, reduces driver uncertainty. That, in turn, reduces the driver's need for the new information gathered from the high, perched seat of an SUV.

In Boltzmann economics, the entropy cost curve increases with vehicle weight and size, while uncertainty decreases. Low-weight materials might make the entropy curve shallow, but they won't extend to low-cost vehicles. Improved vehicle control systems may also cost too much for ordinary cars, but these systems may emerge from many sources, whether corporate, government, university, or international R&D centers, not to mention drivers themselves, so future costs are ambiguous. Ideally, a Boltzmann economist could plot the cost of the information-based security a car provides to drivers against a measure of its entropy production.
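
A sketch of what such a plot might look like. Both curve shapes, and every number in them, are hypothetical placeholders rather than estimates from any data in this essay:

```python
# Hypothetical "Boltzmann economics" plot: entropy cost rises with vehicle
# weight while the driver's residual uncertainty falls. Both curves are
# invented stand-ins, not fitted to data.
import numpy as np
import matplotlib.pyplot as plt

weight = np.linspace(1000, 3000, 200)          # vehicle mass, kg
entropy_cost = 0.5 * (weight / 1000) ** 1.5    # assumed: grows with weight
uncertainty = 2.0 * np.exp(-weight / 1500)     # assumed: falls as size/height grow

fig, ax1 = plt.subplots()
ax1.plot(weight, entropy_cost, label="entropy cost (arbitrary units)")
ax1.set_xlabel("vehicle weight (kg)")
ax1.set_ylabel("entropy cost")
ax2 = ax1.twinx()
ax2.plot(weight, uncertainty, linestyle="--", label="residual driver uncertainty (bits)")
ax2.set_ylabel("residual uncertainty (bits)")
fig.legend(loc="upper center")
plt.show()
```

The twin-axis layout is simply one way to put an entropy measure and an information measure on the same picture; finding real values for the two curves is the open problem the paragraph above leaves standing.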