Discussion about this post

Wyrd Smythe:

I wondered why my ears were burning when I woke today. :D

As you point out so well, complexity is difficult to pin down. "I know it when I see it" seems as good an approach as any!

FWIW, I like the notion of Kolmogorov complexity (though as you point out, it doesn't always apply). The digits of pi appear (and essentially are) random (although they do have meaning as digits of pi). Yet programs to churn out as many digits of pi as desired are considerably shorter than that (infinite) string of digits. Perhaps because pi itself is a simple concept. A program to spit out random digits is small, but a program that spits out a *specific* string of random digits (say some specific transcendental number) probably ends up being at least as long as those digits.
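To make that concrete, here is a minimal Python sketch using Gibbons' unbounded spigot algorithm, one standard way to stream pi's digits one at a time. The point is the contrast in lengths: the program is a handful of lines, yet it can emit as many digits as you like.

```python
def pi_digits():
    """Stream the decimal digits of pi (Gibbons' unbounded spigot algorithm)."""
    q, r, t, j = 1, 180, 60, 2
    while True:
        u = 3 * (3 * j + 1) * (3 * j + 2)
        y = (q * (27 * j - 12) + 5 * r) // (5 * t)  # next digit
        yield y
        q, r, t, j = (10 * q * j * (2 * j - 1),
                      10 * u * (q * (5 * j - 2) + r - y * t),
                      t * u,
                      j + 1)

gen = pi_digits()
print("".join(str(next(gen)) for _ in range(15)))  # 314159265358979
```

A program that must reproduce a *specific* random digit string has no such shortcut: its shortest description is essentially the string itself.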

I think the notion of file compression is very useful, too, and I see it as another example of Kolmogorov complexity. Compressed files (or encrypted ones) have a lot of apparent entropy, but the decompression (or decryption) algorithm unlocks that apparent randomness and produces meaning.
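You can see that "apparent entropy" directly with a few lines of Python. This sketch measures the Shannon entropy (bits per byte) of some plain English-like text and of its zlib-compressed form: the compressed bytes look far more random, yet decompression recovers the original exactly. (The snippet of text is just an illustrative stand-in.)

```python
import math
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 would be indistinguishable from random)."""
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

text = ("To be, or not to be, that is the question: "
        "Whether 'tis nobler in the mind to suffer ") * 200
data = text.encode()
packed = zlib.compress(data, level=9)

print(f"plain text: {byte_entropy(data):.2f} bits/byte")
print(f"compressed: {byte_entropy(packed):.2f} bits/byte")  # noticeably higher
assert zlib.decompress(packed) == data  # the "randomness" is fully recoverable
```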

Just for fun, I compressed a text file I have of Hamlet. The text file is 188,622 bytes; it compresses to 70,310 bytes (37.28% of the original size). Then I whipped up some code to generate a file of random characters with roughly the same profile (word and line lengths) as the Hamlet file. That file compressed only to 61.90% of its original size. I assume this is because English is so structured (lower entropy), whereas strings of random characters are not.
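A rough version of that experiment, runnable in a few lines. The "structured" text here is just a small vocabulary repeated at random (a crude stand-in for English prose, not the actual Hamlet file), and the noise file is uniform random characters of the same length. zlib shrinks the structured text to a much smaller fraction of its original size.

```python
import random
import string
import zlib

random.seed(42)

# English-like structured text: a small vocabulary repeated in sequence.
words = ["the", "king", "speaks", "of", "night", "and", "ghosts", "to", "come"]
structured = " ".join(random.choice(words) for _ in range(20000)).encode()

# Random characters, roughly the same length.
alphabet = string.ascii_letters + " "
noise = "".join(random.choice(alphabet) for _ in range(len(structured))).encode()

for name, data in [("structured", structured), ("random", noise)]:
    packed = zlib.compress(data, level=9)
    print(f"{name:10s}: {len(packed) / len(data):6.1%} of original size")
```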

As another example, I have a 1600×900-pixel image file in which every pixel is random. (I made it to illustrate how much, or rather how little, data a 5¼" floppy held.) My compression app (7-zip) can't compress it and just stores it. Compression = 0%! The metadata makes the zip file *bigger* than the image file!
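The same effect shows up with zlib on a buffer of uniformly random bytes (standing in for the random-pixel image): the compressor falls back to storing the data, and the container overhead makes the output slightly *larger* than the input.

```python
import random
import zlib

random.seed(0)
# 200 KB of uniformly random bytes: effectively incompressible.
data = bytes(random.getrandbits(8) for _ in range(200_000))

packed = zlib.compress(data, level=9)
print(len(data), len(packed))
# No compression gain, just header/checksum/block overhead on top.
assert len(packed) > len(data)
```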

As an aside, your post reminded me again that I've been meaning to implement that 1D cellular automata algorithm so I can see for myself how those rules play out. And maybe make some animated videos. Thanks for the post and the reminder!
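For anyone with the same itch, here is a minimal sketch of an elementary (1D, two-state) cellular automaton in Python, with periodic boundaries. The Wolfram rule number encodes the output bit for each of the eight possible three-cell neighborhoods; Rule 30 produces the classic chaotic-looking triangle from a single live cell.

```python
def step(cells, rule):
    """Advance one generation of an elementary cellular automaton."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        neighborhood = (left << 2) | (center << 1) | right  # 0..7
        out.append((rule >> neighborhood) & 1)              # look up rule bit
    return out

def run(rule, width=31, steps=15):
    cells = [0] * width
    cells[width // 2] = 1  # single live cell in the middle
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

run(30)  # Rule 30
```

Swapping in `run(110)` gives Rule 110, the famous Turing-complete one; dumping each generation as a row of pixels instead of text would get you the animated videos.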

Mike Smith:

A very interesting discussion. I especially like the contrast between complexity and entropy.

Although I'm not sure there isn't a lot of complexity in a completely mixed cup of coffee, or in the heat death of the universe. The concept of apparent complexity, I think, gets at this. For our purposes, the completely mixed cup isn't complex, because its relevant causal effects are going to be simple. Same for the heat death of the universe, where causality peters out. But in both cases, if for some reason we wanted a thorough description of their state, it would require an enormous amount of information (an impossible amount in the case of heat death, at least within this universe).

Langton's "edge of chaos" gets at something I've toyed with before. A name I've played around with for complex functional systems (biological, computational, etc.) is "entropy transformers": systems that find a way to exist amid a high degree of entropy, but are able to take in energy and transform it anyway.

Excellent post, as always Suzi!

