Wednesday, January 02, 2008

It's the New Year and time to Re-start Science

In the book "The End of Science," chapter 8, "The End of Chaoplexity," John Horgan writes:

"Definitions of complexity typically draw upon thermodynamics, information theory and computer science and involve concepts such as entropy, randomness and information -- which themselves have proved to be notoriously slippery terms."

Note that for John Horgan -- the science writer -- and apparently all the scientists he knows, there exists a scientific problem: complexity is "notoriously slippery."

"...all definitions of complexity have drawbacks. For example, algorithmic information theory (Chaitin et al.) holds that the complexity of a system can be represented by the shortest computer program describing it. But according to this criterion, a text created by a team of typing monkeys is more complex -- because it is more random and therefore less compressible -- than Finnegans Wake."

The solution of course is very simple. The "typing monkeys" are merely an anthropomorphisation of "randomness," and "randomness" is merely a uniform probability distribution over a set of possible outcomes. That is to say, if the probability of your slot machine's jackpot configuration (say, three lemons) is greater (or less) than that of the other configurations (say, two cherries and a banana), then its internal probability distribution is non-uniform -- it is skewed/biased and subsequently non-random.
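The distinction between a uniform and a skewed distribution can be sketched in a few lines of Python (a minimal illustration; the slot-machine symbols and weights are invented for this example):

```python
import random
from collections import Counter

def spin(reel_weights, trials=100_000, seed=1):
    """Sample a slot-machine reel whose weights set each symbol's probability."""
    rng = random.Random(seed)
    symbols = list(reel_weights)
    weights = list(reel_weights.values())
    return Counter(rng.choices(symbols, weights=weights, k=trials))

# A uniform reel ("random" in the strict sense used above) vs. a skewed one.
uniform = spin({"lemon": 1, "cherry": 1, "banana": 1})
skewed = spin({"lemon": 1, "cherry": 5, "banana": 5})

# On the uniform reel every symbol lands near trials/3; on the skewed reel
# "lemon" is rare -- the internal probability distribution is biased.
print(uniform)
print(skewed)
```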

Horgan's "text" being "created by a team of typing monkeys" is merely uniformly distributed, unspecified gibberish, and the complexity here is an attribute of the complex (alphabet) system -- not an attribute of randomness! Randomness is by definition uniform and subsequently non-complex. "Finnegans Wake," on the other hand, is specified, non-random and does not conform to a uniform distribution of letters.

Even if one were to remove all the redundancy from "Finnegans Wake," reducing it to a K-complex skeleton code, it still would not be the least bit random, because any such "Finnegans Wake" is utterly specific and randomness is devoid of specifications. Any such Finnegans Wake code non-randomly specifies Finnegans Wake -- i.e., it is a specific, non-random list of the specifications/instructions needed to produce the specific Finnegans Wake text.
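The compressibility contrast at issue here can be checked directly, with a general-purpose compressor standing in (imperfectly) for Kolmogorov complexity -- a hedged sketch: `zlib` only gives an upper bound on K-complexity, and the "prose" sample is an artificially repetitive stand-in, not the real text:

```python
import random
import zlib

random.seed(0)
alphabet = "abcdefghijklmnopqrstuvwxyz "

# "Typing monkeys": 10,000 characters drawn uniformly at random.
monkey_text = "".join(random.choice(alphabet) for _ in range(10_000)).encode()

# A stand-in for structured prose: highly non-uniform and repetitive.
prose_text = ("riverrun past eve and adams from swerve of shore "
              "to bend of bay brings us by a commodius vicus ").encode() * 100

monkey_ratio = len(zlib.compress(monkey_text, 9)) / len(monkey_text)
prose_ratio = len(zlib.compress(prose_text, 9)) / len(prose_text)

# The uniform gibberish barely compresses (zlib can only exploit the small
# alphabet), while the structured text shrinks by orders of magnitude.
print(f"monkey: {monkey_ratio:.2f}  prose: {prose_ratio:.3f}")
```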

While the letter "A" is indeed specific, randomness is not. While letter and number systems are themselves complex and specified -- randomness is not.

Randomness is the absence of order, the absence of complexity, the absence of specifications and the absence of information. When expressed in a complex system, randomness appears as uniformly distributed noise or "gibberish." John Horgan bemoans the "end of science" -- not because science is ending and no more answers are possible -- but because he is talking to the wrong scientists -- scientists who worship randomness. For those who don't worship randomness, "complexity" is not the least bit "slippery."
Note added Jan 3/08. I thoroughly enjoyed reading "The End of Science." John Horgan is a very entertaining writer and I found the subjects he addressed therein to be extremely interesting. -WB


oleg said...

Hi William,

Glad to hear that you enjoyed The End of Science; Horgan is indeed a good writer. He's a bit too pessimistic about science in my view -- I think science will be fine. Particle physics may be nearing a high, impenetrable barrier, but there are other areas of physics that are thriving: astrophysics, condensed matter... And look at biology: this is where there is a lot of excitement.

I must, however, disagree with you about specified complexity. Sure, Finnegans Wake is structured and highly redundant, as any English text would be. But that alone doesn't make it meaningful and profound, and I think that's what you mean. You can teach a computer to generate acceptable prose, which you probably wouldn't call complex.

Besides, you can compress an English text into a zip file, effectively removing redundancy. The resulting bits will look random to an unsuspecting observer, just like the output of a monkey's typewriter.

So yeah, I am afraid complexity is hard to quantify.

William Brookfield said...

Hi Oleg,

Good to hear from you. Congratulations on getting your recent paper published in Nature.

John Horgan -- 1996 “all definitions of complexity have drawbacks”

Oleg – 2008 “I am afraid complexity is (still) hard to quantify.”

Twelve years later and orthodox science is still struggling with complexity (the opposite of simplicity). Maybe John is right and all significant scientific progress has stopped!

Stephen Hawking -- 1988: "The second law of thermodynamics has a rather different status than that of other laws of science, such as Newton's law of gravity, for example, because it does not hold always, just in the vast majority of cases."

Any progress on finding a real second law that does indeed hold always? Not from orthodox science! Twenty years later and orthodox science is still limping along with a faulty second law ("that does not hold always") and no GSL (generalized second law). Maybe John Horgan is right and scientific progress has indeed ended.

Horgan -- “entropy, randomness and information -- which themselves have proved to be notoriously slippery terms."

Science of course can hardly function on the basis of such dreaded slippery (unscientific) terms. Luckily I have identified the source of all this slipperiness, and science can once again progress :-). Orthodox science, it seems, has been limping along with an order-adulterated definition of "randomness."

Let's see what happens when "randomness" is strictly defined and the slipperiness is no more...

When "randomness" (a uniform probability distribution) is forced to appear in a digitized/quantized (and subsequently non-uniform) environment, there is a collision between the uniformity and the non-uniformity, resulting in a K-complex digital train-wreck. Luckily no passengers (no information bits) are killed here, because a uniform distribution (being uniform) contains no information. Thus, while this maximally twisted train-wreck is maximally Kolmogorov-complex, there is no information (dead or alive) to be found here, folks (so it is best to just move along :-).

On the other hand, Finnegans Wake (or a compressed K-complex string that codes for it) is not a train-wreck. The Finnegans Wake text conveys information. As specified information it is fragile, and as such it requires error-checking against "the enemy of information" -- randomness -- which always threatens to randomize/scramble/damage information and turn it into a train-wreck.

Randomness is non-specific and scrambling a maximally random “train wreck” to produce another digital train-wreck does not damage a train-wreck. A wreck is a wreck is a wreck. Unlike information, randomness is not fragile.

In summation, Kolmogorov has provided science with a fine (non-slippery) definition of complexity (measured in bits) that applies to both non-informational {m}k-complexity (digital train-wreck) and informational {i}k-complexity (digital information). As far as I can tell, "complexity" has long since been scientifically quantified. Kolmogorov's definition is also fully consistent with orthodox dictionary definitions that have "complexity" as the opposite of "simplicity."

oleg said...


There is no problem with the 2nd law of thermodynamics. Its probabilistic nature is not a bug, it is a feature.

Thermodynamics and statistical mechanics deal with large ensembles of particles. Mathematical equations describing physical systems in thermal equilibrium become exact in the limit where the number of particles (or the system size) becomes infinite. So the mathematical formulation of the 2nd law is impeccable as far as logic is concerned. There is no need to correct it.

Of course, real physical systems are never infinite and they always contain a finite number of particles. Is a theory that only becomes exact in the thermodynamic limit (of infinite size) useful or even applicable to finite systems? The answer depends on how much error is produced when the system becomes less than infinite. It turns out that the error is typically not even small -- it's minuscule, microscopic, tiny, and in fact hardly observable in macroscopic systems. So thermodynamics, in addition to being a mathematically rigorous theory (well, from the viewpoint of a theoretical physicist), is also very precise in the practical sense.

To understand why the error is small, go to this web site and perform a numerical experiment. Put N elastic balls inside the box and let them bounce off the walls and collide with each other, conserving energy in the process (to make collisions elastic, set bounce at 1.0). Divide the box mentally into two halves of equal volume. On average, you will have N/2 balls in each half, although the number will fluctuate. If the number of balls is small (say, N=4), from time to time all of them will gather on one side of the box. As the number N grows, such exceptional events will become exceedingly rare. With N as small as 20 you'll have to wait more than a day to observe all of the balls gather on one side. With 60 balls the time scale will stretch to the age of the Universe, which for us, mortal humans, essentially means never.
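The scaling described above can be sketched numerically. The chance that all N balls sit in one chosen half at a given instant is (1/2)^N, and either half counts; the checking rate below is an assumed, illustrative parameter, not part of the simulation:

```python
# Probability that all N balls occupy one half at a random instant:
# (1/2)**N for a chosen half, times 2 because either half counts.
CHECKS_PER_SECOND = 10  # assumed observation rate, for illustration only

def expected_wait_seconds(n_balls):
    """Rough expected waiting time before observing all balls in one half."""
    p = 2 * 0.5 ** n_balls
    return 1 / (p * CHECKS_PER_SECOND)

SECONDS_PER_YEAR = 3.15e7
for n in (4, 20, 60):
    secs = expected_wait_seconds(n)
    print(f"N={n:2d}: ~{secs:.3g} s (~{secs / SECONDS_PER_YEAR:.3g} years)")
```

With these assumptions the wait for N=20 is on the order of a day, and for N=60 it is on the order of a billion years, in line with the qualitative claim.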

That's why statistical mechanics is so reliable. It deals with ensembles including billions upon billions of particles, in which case the problem of recurrence (all balls on one side) never happens, even on the time scale of the Universe.

To recap, the theory of thermodynamics is not approximate, it is exact in the limit of infinite N. That means that there is no formal reason to fix it (it ain't broke). And while in the strictest sense it is valid only approximately for finite systems, the error is so small as to be immeasurable in practice, so there's no practical reason to fix it, either.

Best wishes,

William Brookfield said...

Thanks for your post, Oleg.

I wonder if you could try to post that link again. The readers might enjoy running experiments with elastic balls.

I do realize that the uncertainty is hyper-astronomically small for standard macroscopic systems.

"it is valid only approximately for finite systems"

Whether or not this (finite system) problem is of concern depends on what one is trying to do. Any individual who is serious about uniting the laws of black hole dynamics with the laws of thermodynamics is likely to leave no stone unturned. The statistical formulation of thermodynamics is incompatible with the general relativistic formulation of black hole dynamics. Nonetheless there are very good theoretical reasons for thinking that the two sets of laws are indeed united. See perhaps...

Bob Wald -- University of Chicago,

Andrew Strominger -- Harvard. "The fundamental laws of physics are both incomplete and contradictory."

"The theory of thermodynamics is not approximate, it is exact in the limit of infinite N."

Yes of course. I made this very point in my 1996 "Hawking's Error" article -- which I thought you said you had read. Remember my "absolute certainty principle" for infinite systems?

oleg said...

I don't think I can find that link anymore. Try another one at and set the temperature to 10000 to speed things up.

And I don't think we need to unite thermodynamics with general relativity. They should be consistent with one another, but they need not be united.

William Brookfield said...

John Horgan and Stephen Hawking are both making the same category error, in my opinion.

The k-complexity of the sand on a beach is very high. To use an analogy, the information required to record all the sand-grains' specific positions is very large. The random motion of waves, wind and tides -- like the random typing monkeys -- does not produce k-complexity of the sand-castle type ("Finnegans Wake" type) but only raw, go-nowhere, system-level (beach-level) k-complexity. To continue the analogy, a beach is a beach is a beach, and a wrecked sand-castle is a wreck is a wreck.

Stephen Hawking, John Horgan and apparently the entire orthodox scientific community persist in confusing this raw, system-level complexity (monkey "Shakespeare") with real Shakespeare, Joyce's Finnegans Wake, etc. -- i.e., beach-complexity with sandcastle-complexity. The endless perpetuation of this particular error is necessary to protect Darwinism, which requires a confused, order-adulterated concept of "randomness" in order to "justify" its formulation. This, of course, destroys progress in all areas of science (information theory, thermodynamics, black hole dynamics) that depend, for any further progress, upon a fully rigorous definition of "randomness." I first publicly pointed to this error in 1996 -- so don't expect it to be corrected anytime soon. Darwin and materialism must be worshiped and protected at all costs. It is the "End of Science" indeed when error must be kept and truth/reality avoided for the sake of "science."

Oleg "I don't think we need to unite thermodynamics with general relativity."

Clearly my scientific ambitions are different from yours. I believe that unification should be sought. Without any progress in science (practical and theoretical), interest and funding dwindle. The nature of my theoretical work requires unambiguous definitions. A scientific discussion of randomness cannot be had amongst those who worship randomness, just as a scientific discussion of Jesus cannot be had amongst those who worship Jesus. My work (at this time) does not require any contemplation of Jesus. It does, however, require the contemplation of "randomness" and related concepts such as "order," "information" and "entropy." Unlike orthodox scientists, ID scientists are capable of thinking rationally wrt randomness (though perhaps not wrt Jesus). In all cases I am looking for groups of non-worshiping scientists capable of thinking rationally.

dobson said...


Why is it that you think your theories have not been adopted by mainstream science?


William Brookfield said...

Hello (James) Dobson,

Welcome back and thank you for providing your picture, it is nice to see you :)

I have a number of theories, and in many cases lack of acceptance can be explained as the result of lack of publication/explanation. That said, there exists the real question as to why simple things such as the "slipperiness" of "randomness" (from this thread) are being so well maintained, and why my calls for a strict (scientific) randomness-order-information continuum scale are being so effectively ignored.

Continuum models are a part of science. "Zero kelvin" refers to the absence of thermal energy; "randomness" refers to the absence of order. "Randomness" also refers to the absence of information, because "information" is a complex form of order. As far as I can tell, the randomness-order-information continuum/scale is presently missing from science because orthodox science has lost its way (in this area) and is now being driven by an ideological (anti-creationist) agenda. The problem, it seems to me, is that within materialist ideology "randomness/chance" is presently serving as a kind of creator replacement.

Science is about exploring the world/universe through the use of the scientific method -- it is not about stamping out creationism per se. The job of the scientist is to simply follow the evidence *wherever* it leads.

dobson said...

William, allow me to be blunt or even rude - would you forgive me for saying that I have grave doubts as to the validity of any of your propositions:

There are many "lone soldiers" in science who claim to have discovered astounding truths which contradict well-tested or highly-regarded and accepted theories in science. Mostly these people are what we call "crackpots" -- their theories are no more than the uneducated ramblings of people who do not know the limits of their knowledge. Usually these people are wishful thinkers and fantasists who have become carried away with their own imagination.

Sometimes these lone-soldiers are actually geniuses whose independent research is later found to be entirely correct - their work is eventually validated and regarded as good science.

The question is, how can I tell which kind of person you are? At a conservative estimate I'd say that there are 10,000 times as many cranks as geniuses, even allowing for a somewhat fuzzy boundary between the two.

Do you think that there is some kind of method you could use that would validate your ideas?


William Brookfield said...

Hi Dobson,

The purpose of this thread is to discuss science and its present foundational definitions. Specifically, I have proposed that one of our present definitions of "complexity" (Kolmogorov complexity) is a good general definition of "complexity."

You are free to question any (and indeed all) of my proposals if you wish. However, I am looking for specific counter-arguments to my proposals.

The question is, how can I tell which kind of person you are? Do you think that there is some kind of method you could use that would validate your ideas?

This is the internet, and of course there are "cranks" to beware of. As a basic, I recommend the study of logic and/or "critical thinking." This will help one to catch any flaws/fallacies in the logic that people might be using. A general study of the history and philosophy of science would be helpful too. Of course there is empirical testing, and the knowledge that the scientific community has built up over many years of testing. My effort is to fit existing data into new models/theories that are logically sound and logically possible, while still being consistent with existing scientific data. I am attempting to solve various puzzles regarding the true nature of reality. I am doing this to a large extent out of my own interest/curiosity, and I am not too concerned whether people believe me or not. I am not a preacher, and no one is required to believe what I propose. As a human being I can be mistaken.

You are not required to believe that Kolmogorov's definition is a good general definition of "complexity" but if you can explain why it is not, I would be interested to hear your argument.

oleg said...

What data are you talking about, William?

William Brookfield said...

Hi Oleg,

For twenty years Hulse and Taylor gathered *data* regarding a binary pulsar system (PSR 1913+16). They also had a theory (thanks to Einstein) that allowed them to make sense of this *data.* The data received was spectacularly consistent with the predictions of General Relativity regarding such a system. As a result, any theorist (crank?) attempting to replace GR with a new theory must explain why GR worked so well in its domain -- just as the new GR explained why Newtonian mechanics worked so well in its domain. The data I refer to is the data gathered by countless scientists worldwide. Due respect must be given to this data and to this theoretical work.

oleg said...

How is their data fitting your model? And what is your model to begin with?

William Brookfield said...

Hi Oleg,

Einstein proposed that gravity is space-time curvature. I am proposing that gravity might be a highly consistent space-time erosion ("devolution"). Both of these models are consistent with the predictions of G-Relativity and Hulse and Taylor's data. "Erosion" however, suggests different solutions to Quantum Gravity and GSL unification than Einsteinian "curvature."

oleg said...


I have a vague recollection that Einstein's general relativity theory had some equations in it. The data of Hulse and Taylor have been checked against solutions to those equations. Quantitative agreement was found.

What are the equations of your theory?

William Brookfield said...

As a physicist and physics teacher you had better have more than just a vague recollection of Einstein's equations! My conclusion is that you are being sarcastic.

It is possible to offer different interpretations of QM without changing Schrödinger's equation. As I have said before, I am working on an infodynamic theory of the cosmos -- i.e., a theory that the cosmos is, at root, an information structure and that all material forms are subject to information loss ("devolution"). The first job in such an endeavour is to examine *information* theory (Shannon) and information theories in general (such as Dembski's CSI) to ensure that they are robust and maximally free of confusion. In these terms, "information" is proportional to the negative log of the probability of any specified event in a given phase space. The second job is to produce or find an informational (information loss/devolution) interpretation of the second law of thermodynamics. Apparently some "cranks"(?) are already working on this (HT DLH at Uncommon Descent). The third job is to develop or find an information-loss theory of relativity. Given the famous debate about information loss in black holes (Hawking vs Preskill), GR may already be a theory of information loss (devolution) in disguise. The equations of GR automatically produced the Second Law of BH Dynamics -- which some consider to be related to the Second Law of Thermodynamics -- which some of us in turn consider to be related to Information Dynamics (ID).
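The "negative log of the probability" measure mentioned above is Shannon's self-information (surprisal); a minimal sketch:

```python
import math

def surprisal_bits(p):
    """Shannon self-information of an event with probability p, in bits."""
    return -math.log2(p)

# A fair coin flip carries exactly one bit of information;
# a one-in-a-million event carries roughly twenty bits.
print(surprisal_bits(0.5))
print(surprisal_bits(1e-6))
```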

William Brookfield said...

Just a few clarifications regarding my use of terms. The term "evolution" (from a one-celled organism to humans or elephants, etc.) refers to a gradual increase in information over time. "Devolution," on the other hand, refers to a gradual loss of information.

"ID" can refer to both "Info-Dynamics" and "Intelligent Design." "Intelligent design" is the dominant theory of the "origin" or production of information in the larger field of "Infodynamics." "Intelligent design" is therefore a subset of Info-Dynamics. "id" is a subset of "ID."

Wrt relativity there are two main problems as I see it.

#1. Thermodynamic & BH Dynamic unification, which I think requires a new interpretation but no fundamental change to GR (think of the many interpretations of QM). And #2. A quantum theory of gravity, which I believe requires a more fundamental change to relativity (a new mathematical approach to "curvature"). I have not yet mathematically solved the quantum gravity problem.

I have some ideas as to a solution to QG (Problem #2) based upon what I presently see as the solution to problem #1.