
ENTROPY

by Miles Mathis



First written March 2005

"Entropy" is one of those scientific words, along with "relativity", "infinity", and "field", that has acquired an extraordinary amount of baggage since its initial definition. So many non-scientific and pseudo-scientific concepts have been attached to it that it would be almost unrecognizable to the early pioneers of heat theory. In the beginning, entropy was just defined as a relation between heat and temperature, using the equation dS = dQ/T. That is, it was mainly a measurement of heat change. It is a fancy way of saying that heat always dissipates, or moves from a hot region to a cold region. On the molecular level, this is just to say that motion tends to dissipate—since heat is molecular motion. And this in turn is just a fancy way of saying that molecules tend to move from where they are to where they aren’t. Not a deeply philosophical issue. It is straight statistics. Molecules move into cooler areas because cooler areas have more space. Statistically, they are more likely to go there simply because there are more potential pathways.
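The statistical point can be sketched in a few lines of code. This is only a toy model, with made-up region sizes: particles hop at random between a small "hot" region and a larger "cool" one, and random motion alone spreads them out in proportion to the number of available sites.

```python
import random

random.seed(1)

# Toy model of dissipation as pure statistics: particles hop at random
# between two connected regions. The "cool" region has more sites (more
# potential pathways), so random motion alone carries particles into it.
HOT_SITES, COOL_SITES = 10, 90          # assumed region sizes
particles = [random.randrange(HOT_SITES) for _ in range(1000)]  # all start hot

for _ in range(10000):
    i = random.randrange(len(particles))
    # a randomly chosen particle jumps to a random site in the whole space
    particles[i] = random.randrange(HOT_SITES + COOL_SITES)

in_hot = sum(1 for p in particles if p < HOT_SITES)
print(f"fraction left in the hot region: {in_hot / len(particles):.2f}")
# settles near 0.10 -- the regions equalize in proportion to their size
```

No force pushes the particles out of the hot region; they end up mostly in the cool region simply because it has nine times as many places to be.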
       Think of it like this. You are parked right in front of your garage, with the garage door open. You are blindfolded. You don’t know whether the car is in "drive" or "reverse". You don’t know if the car is pointed in or out. You also don’t know if the steering wheel is straight or if it has been turned one way or the other. You are asked to calculate the odds for two events. In event one, you go into the garage. In event two, you go anywhere else. Obviously the odds of going into the garage are much less, since you must satisfy a very specific situation—that being “steering wheel exactly straight” and “drive”. Any other combination of presets will put you in event two. This could be stated as a difference in total number of potential paths, or it could be stated even more concisely: the garage is small, the rest of the world is large.
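The garage example can be counted out explicitly. The presets below are an assumption for illustration (two gear choices, two facing directions, three steering positions), but the conclusion is the same under any reasonable enumeration: very few combinations land you in the small target.

```python
from itertools import product

# Enumerate the blindfolded-driver presets (assumed set of possibilities).
gears = ["drive", "reverse"]
facing = ["toward garage", "away from garage"]
steering = ["straight", "left", "right"]

def ends_in_garage(gear, face, wheel):
    # You enter the garage only if the wheel is straight AND the car moves
    # toward the garage (drive while facing it, or reverse while facing away).
    moving_toward = (gear == "drive") == (face == "toward garage")
    return wheel == "straight" and moving_toward

outcomes = list(product(gears, facing, steering))
hits = sum(ends_in_garage(*o) for o in outcomes)
print(f"{hits} of {len(outcomes)} preset combinations end in the garage")
# -> 2 of 12: the garage is small, the rest of the world is large
```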
       This being so, it is no surprise to find out that entropy always increases. Philosophers and physicists and mathematicians alike have long treated this fact as some sort of metaphysical conundrum that required explanation, or that had huge teleological implications. It has no implications beyond the sentence “things move”, since it logically devolves into that. Things move into open spaces because open spaces are open. It is a tautology. It is definitional. Things are less likely to move into less-open spaces because in less-open spaces they are more likely to be resisted. Things don’t go as easily into partitioned spaces because the partitions resist them. They don’t go as easily into crowded spaces because the other objects resist them. Things go where they can go and don’t go where they can’t go. There is no mystery to be solved.

Every time you hear a contemporary explanation of entropy you are treated to a lecture on order and disorder. You are always given several examples, one of which is likely to be a disordered room. A tidy room has more order, you are told, because it required energy to put everything in its place. Without that ordering energy, the room tends to revert to disorder. The room therefore moves against disorder and against the natural flow of entropy only by human intervention.
       But this whole analogy is faulty from top to bottom. It is completely anthropocentric. It assumes that the universe cares whether a room is tidy or not. But the universe does not define order by any human measure of neatness. For example, in a tidy room, the books are all in the bookshelf; in an untidy one, they are on the floor and on the bed and on the dresser. Can the universe differentiate between the two situations, based on a measure of heat? Of course not. Can the universe differentiate between the two situations based on any scientific concept at all? No. Like things being put with like things is not a measure of order to anyone but humans. And even with us it is mostly arbitrary. In a tidy room, the lamps may be on opposite sides of the bed. Would we say the room is more tidy or ordered if both lamps are stored side by side? What about the argument that lamps on opposite sides of the bed are more symmetrical? Most people believe that symmetry is a type of order. This would make the books in the bookshelf asymmetrical, since there are no books on the other side of the room to balance the composition. We already have several competing definitions of order, you see, and I have just touched the surface. And none of these definitions has anything at all to do with heat or molecular motion.
       Let us take another example. Science books often mention a shaker with a level of salt and a level of pepper. This is the example of order. You then shake the shaker and the two layers mix, creating disorder. But this definition of order is only useful to someone who is in need of pure salt for his steak, or pure pepper for his stew. The universe does not find anything especially ordered about two distinct layers. As a measure of heat content, the argument is a non-starter. But even as a measure of order on the macro-scale, the argument is absurd. Some will argue that it would take more energy to separate the grains than it did to mix them, but this is once again a human argument about human tasks. Yes, it would take a lot of energy for a human to separate the grains by hand. But a machine could be devised that could separate the grains using less energy than the shaker (using only gravity, say). So the argument is not really to the point. It does not finally matter how much energy is used to complete either task, since these tasks do not define entropy.
       Let me offer a third example. Say you have ten married couples at a dinner party and one very large table. Is it more ordered to seat them as couples—that is, man-woman, man-woman, etc.? Or is it more ordered to seat all men on one side of the table and all women on the other? The second choice is like the salt/pepper shaker before shaking or the tidy room. Supposedly very ordered, since like is with like. But I think most people would say the first choice is more ordered, since it preserves the symmetry. The second choice is unbalanced. It would be especially unbalanced if you took into consideration the weight or the heat involved on each side of the table. The men are likely to weigh more, eat more, sweat more, and therefore have more molecular motion. I am obviously being droll. Entropy is not about any of these things. Neither choice has anything to do with entropy, since they are both concerned with human choices at the macrolevel. But it is clear that for every example you can give proving disorder, I can use it as an example of order.
       In fact, I believe, in all seriousness, that entropy moves all states and all situations, micro or macro, into levels of greater order, in one very important sense. We have already seen that with each of the examples above. In the room, random movement would tend to spread books across the entire available space. In the shaker, shaking tends to mix the salt and pepper randomly, breaking up asymmetrical conglomerations of salt or pepper. With the men and women, random movement would tend to mix the two groups, bringing men and women together. We might not get the couples with their correct spouses, but we would be more likely to find them seated man-woman than not. In all these examples, balance and symmetry are created. In this way, one could easily argue (and I am arguing) that randomness is the ultimate order. Randomness is not disorder, it is the most complex form of order. It is an order which takes into account all the existing entities and events, not just a chosen few of them.
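The dinner-party claim can be checked numerically. This is a sketch under an added assumption (a round table, so every guest has two neighbors): with ten men and ten women seated at random, adjacent pairs come out mixed (man-woman) slightly more often than not.

```python
import random

random.seed(2)

# Monte Carlo check: shuffle 10 men and 10 women around a round table and
# count how many of the 20 adjacent pairs are mixed (man next to woman).
guests = ["M"] * 10 + ["F"] * 10
mixed_counts = []
for _ in range(20000):
    random.shuffle(guests)
    mixed = sum(guests[i] != guests[(i + 1) % 20] for i in range(20))
    mixed_counts.append(mixed)

avg = sum(mixed_counts) / len(mixed_counts)
print(f"average mixed pairs out of 20 adjacencies: {avg:.2f}")
# the exact expectation is 20 * (2*10*10)/(20*19), about 10.53 -- just
# over half, so random seating mixes the sexes more often than not
```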
       This definition of order is 180 degrees away from the common definition, or the artistic definition. In common parlance, a Jackson Pollock is not said to be more ordered than a Rembrandt. A chaotic room is not considered to be highly symmetrical. But these definitions are anthropocentric and unscientific. Their content is determined by our human desires, not by predispositions of the universe.

By attaching the facile theory of disorder to the concept of entropy, contemporary science has given a religious aura to the term. Entropy has come to be seen as destructive, even Satanic. In a Manichean universe, entropy becomes the mysterious force of dissolution. Entropy has been mentioned in acts of Congress, there treated as a maleficent force or dissolving trend in history. But it is nothing of the sort. If anything, it is one of the fundamental forces of creation. Entropy is a measurement of motion on the molecular level, and tells us only that things move into open spaces. To the degree that entropy exceeds tautology, it defines only one thing: the tendency of molecules to disperse. If we extrapolate this to include all bodies, and their tendency to disperse given random movement, then we have a “force” that resists clumping. We have an “unclumping force”. Already, I would say this definition is too abstract, but if we let it stand, we can see how it becomes a creative force. It becomes a creative force by balancing the other great universal force of clumping—which is of course gravity. Whatever mechanism or theory you propose for gravity, it is always going to be a force (or motion) that tends to clump. On a universal and macro-level, if you have only entropy the universe expands without limit and disperses into a zero energy field. If you have only gravity, the universe becomes one huge clump, and potential energy dissolves to zero. But entropy and gravity together create a closed circuit that is potentially infinitely renewable. The tension between the two forces drives all events. In a sense, it causes all events at the macrolevel.

Evolution has also suffered from its contact with entropy. Current wisdom is that evolution is unentropic, and that its existence is explained by evolving systems excreting or discarding extra entropy into their environs. That is, lower or negative entropy regions can exist only by creating higher entropy regions around them. In this theory, evolution is a machine that reverses entropy, stealing negative entropy from its surroundings.
       It doesn’t take much effort to explode this theory, since it is easy to see that we are once again dealing with a human conception of order and not with heat content. Stars also “evolve”, supposedly coalescing from huge fields of gas. They gain form, order, and heat over long periods, which periods are unentropic. Obviously this is due mainly to gravity, but few have tried to define gravity as unentropic. The whole universe is permeated with gravitational fields, which would make the whole universe unentropic. Everything that exists, all bodies that have become bodies—whether organic or inorganic, alive or dead—have overcome dispersion, and therefore entropy.
       Human evolution, by contrast, is a theory of species, a theory that has nothing to do with heat content or molecular motion, and almost nothing to do with universal order. For instance, it is accepted by science that man evolved from apes. Are humans hotter or more ordered than apes? Than ostriches? Than sharks? Our brains are slightly more complex, but complexity is not a measure of order. It might be argued that complex things are more disordered, since they often show less symmetry. Besides, species evolution is not strictly a measure of order or complexity, even in a limited sense. Reversions happen all the time, since evolution responds to specific environments, not to a rigid hierarchy of complexity or order or intelligence. For instance, humans have a diminished sense of smell (as well as hearing, sight, etc.) compared to many other animals. Have we devolved? Have we become cooler? Have we become less ordered? No, none of that. Not all change is a direct outcome of entropy or unentropy. At levels of greater complexity, random changes happen and stick for other reasons, and for longer periods of time. In cramped or clumped quarters, events of low odds happen with greater frequency, and strange combinations may emerge. Human evolution, like star evolution, is driven at a foundational level by gravity, and could not happen in dispersed fields. Such evolution can happen only in fields of relative stasis, and these fields occur only in certain gravitational fields. Life does not feed off of unentropy; nor does it excrete higher levels of entropy.
       It might be said to feed off of gravity, which is axiomatically unentropic, but this must be stated and understood with more precision. Entropy itself is the same everywhere. It does not vary from region to region in the universe. Within a star or within an evolving creature, molecules still tend to disperse. This tendency is counteracted by gravity, as well as by other complex interactions that rely on gravity (and atomic forces). But the tendency is not absent. The tendency to disperse is a universal constant, since it is simply a function of molecular movement, and of the definition of movement itself. Minus gravity and atomic forces, entropy tends to increase everywhere and at all times. But the universe is not lacking gravity and atomic forces. Everywhere that gravity exists—which is to say everywhere—entropy will be nullified to one degree or another.
       All bodies, organic or inorganic, were created by overcoming dispersion. Everything that exists is therefore unentropic in this sense. Even a gas is unentropic, since it will be experiencing gravitational forces. Gases tend to coalesce over long periods. If they didn’t, nothing would ever form. We would not have stars and planets and galaxies and the rest. This means that the whole universe and all parts of it are both entropic and unentropic. They are entropic in that they all experience random motion at the molecular level, a motion that tends to disperse them. They also often experience random or nearly random motions at the macrolevel. But despite this they still clump, mainly due to gravity.
       As a matter of heat, all systems are entropic. As a matter of final combination, they are unentropic. Taken alone, entropy always increases. Taken as part of the set (entropy+gravity), entropy is always the slightly smaller partner. In every body that exists, gravity has overcome entropy.

This argument holds at the atomic level as well. At this level it is not gravity that has overcome entropy but electrical and nuclear forces (which are likely to be an analogue of gravity). Subatomic particles also tend to be in motion and therefore tend to disperse by the simple definitional laws of motion. That they do not actually disperse is due to the various other forces at work at this level of interaction. The mechanics of these forces are mostly unknown, but like gravity they are attractive forces. Therefore a similar circle or circuit of tendencies is created, causing the potential for stability and therefore complex combination.


If this paper was useful to you in any way, please consider donating a dollar (or more) to the SAVE THE ARTISTS FOUNDATION. This will allow me to continue writing these "unpublishable" things. Don't be confused by paying Melisa Smith--that is just one of my many noms de plume. If you are a Paypal user, there is no fee; so it might be worth your while to become one. Otherwise they will rob us 33 cents for each transaction.