
Saturday, November 12, 2011

One of those AHA moments (hope this doesn't turn out to be too embarrassing) 

Entropy, information and evolution
Entropy and information
Entropy and information both attempt to provide a quantitative, mathematical measure of the qualitative concepts of variety and constraint. Thermodynamic entropy has the dimension of energy divided by temperature, with units of joules per kelvin (J/K). In information theory, entropy and information are measured in bits, where the number of bits is just the number of yes/no questions, each answer usually encoded as a zero or one, needed to determine the content of a message.
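The yes/no-questions idea can be computed directly. Here is a small sketch (my own helper, not from any standard reference) of Shannon entropy in bits per symbol:

```python
from collections import Counter
from math import log2

def entropy_bits(message: str) -> float:
    """Shannon entropy in bits per symbol: the average number of
    yes/no questions needed to pin down one symbol of the message."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Four equally likely symbols need two yes/no questions each.
print(entropy_bits("ABCDABCDABCDABCD"))  # 2.0
```

A message drawn from a single repeated symbol has entropy zero: no questions are needed to determine its content.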

The translation between thermodynamic entropy and information entropy was developed in a series of papers by Edwin Jaynes beginning in 1957. The arrow of time is defined in terms of an increase in entropy. When thermodynamic entropy is translated into bits, the arrow of time can be defined as a loss of information. However, this implies a digital universe.

Example – a digital photographic image
To show how all of this works, I will use a digitally encoded image as a kind of state vector. An uncompressed digital photograph has a lot of redundant structure that inflates the file size needed to store the information that reproduces the image. A fully compressed image has no structure in its bits and is the shortest string of bits that can reproduce the image. The size of the compressed file is the number of bits of “syntactic information”: the smallest description that can restore the image in all its detail, given a decompression algorithm.
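The compressed-size-as-syntactic-information idea can be sketched with a general-purpose compressor standing in for an ideal one; `zlib` only gives an upper bound on the true minimum description, and the names and test data here are my own:

```python
import random
import zlib

def syntactic_bits(data: bytes) -> int:
    """Approximate syntactic information: the size in bits of the
    losslessly compressed data (an upper bound on the true minimum)."""
    return 8 * len(zlib.compress(data, 9))

rng = random.Random(0)
redundant = b"abab" * 1000                              # lots of structure
noisy = bytes(rng.randrange(256) for _ in range(4000))  # little structure

# The structured "image" needs far fewer bits to restore exactly.
print(syntactic_bits(redundant) < syntactic_bits(noisy))  # True
```

Both inputs are 4000 bytes uncompressed; only their internal structure differs, and the compressor makes that difference measurable.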

As you lose bits of information from the compressed image, the photograph degrades, providing an arrow of time. The number of bits of “syntactic information” provides an absolute clock for a closed system.

Example - an image of an alphabetic character and meaning
If you can accept lossy compression, you can compress the image further. An image of an alphabetic character or a letter of the DNA code can be compressed further, leaving only what is needed to recognize the character. Information about the pattern of the background of the character is lost during compression and decompression. This “semantic” or meaningful information is linked to a purpose (recognizing a character) and is therefore linked to an observer/consciousness, or at least to the interpreter of a control system. This meaningful information is sometimes referred to as distinctions that make a difference.
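A toy sketch of that lossy step, with entirely made-up pixel values: threshold a noisy grayscale “character scan” to black and white, discarding the background texture while keeping the stroke/background distinction:

```python
import random
import zlib

def bits(data: bytes) -> int:
    """Compressed size in bits, as a proxy for information content."""
    return 8 * len(zlib.compress(data, 9))

rng = random.Random(1)
# Toy "scan": dark stroke pixels on a light, noisily textured background.
stroke = [10] * 400
background = [200 + rng.randrange(-30, 30) for _ in range(3600)]
image = bytes(stroke + background)

# Lossy step: keep only dark-vs-light. The background texture (pure
# syntactic information) is gone; what is needed to recognize the
# character (the "semantic" part) survives.
binarized = bytes(0 if px < 128 else 255 for px in image)

print(bits(binarized) < bits(image))  # True
```

The binarized version cannot restore the original scan, but it still answers the question the observer cares about: which pixels belong to the character.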

The universe as a decompressing image
At its beginning, the universe had the low entropy and uniform structure typical of a compressed image. If time is defined as loss of information, where does the structure of the universe come from? Structure comes from gradually decompressing the “image” of the universe using a decompression algorithm (the laws of nature). As the universe decompresses, its semantic information or meaning/purpose is revealed as unfolding or emergent structure.
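A minimal stand-in for “decompressing an initial state with a fixed algorithm”: a one-dimensional cellular automaton (rule 90, my choice of example) unfolds a rich pattern from a one-bit seed under a fixed local rule:

```python
def rule90_rows(width: int, steps: int) -> list[list[int]]:
    """Unfold structure from a single-cell seed with a fixed local
    rule: each cell becomes the XOR of its two neighbors. A toy
    analogue of a decompression algorithm at work."""
    row = [0] * width
    row[width // 2] = 1
    rows = [row]
    for _ in range(steps):
        row = [row[(i - 1) % width] ^ row[(i + 1) % width]
               for i in range(width)]
        rows.append(row)
    return rows

# Print the emerging Sierpinski-like triangle.
for r in rule90_rows(31, 8):
    print("".join("#" if cell else "." for cell in r))
```

The seed plus the rule is a tiny description, yet the unfolded rows show ever more visible structure, which is the sense of “revealed by decompression” used above.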

The dynamics of an evolving system
In a realistic dynamic system, states would not need to be explicitly ordered by a time variable; the information in a state vector would naturally provide its order (time) within a collection of states. It would have a built-in clock. If you had a photograph that had been copied over and over, with each copy having a little less information, you could reconstruct the sequence of copies by looking at how washed out each image looked.
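The built-in-clock idea can be sketched by modeling each copy generation as washing out some pixels and using compressed size as the clock; all names and numbers here are mine, not from the post:

```python
import random
import zlib

def info_clock(data: bytes) -> int:
    """Compressed size in bits: the state's built-in clock."""
    return 8 * len(zlib.compress(data, 9))

def recopy(data: bytes, n_erased: int, rng: random.Random) -> bytes:
    """One generation of copying: n_erased randomly chosen bytes are
    washed out to a blank value, losing information."""
    out = bytearray(data)
    for _ in range(n_erased):
        out[rng.randrange(len(out))] = 0
    return bytes(out)

rng = random.Random(42)
# Start from an information-rich (nearly incompressible) "photograph".
copies = [bytes(rng.randrange(1, 256) for _ in range(4000))]
for _ in range(5):
    copies.append(recopy(copies[-1], 800, rng))

# Later generations carry less information, so the clock runs down,
# and the generation order can be read off the states themselves.
clocks = [info_clock(c) for c in copies]
print(clocks == sorted(clocks, reverse=True))
```

No timestamp is stored anywhere; sorting the shuffled copies by `info_clock` recovers their order, which is the sense in which the state carries its own time.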

The dynamics of an evolutionary system would then involve mapping a state vector into another state vector of increasing size. If you translated the state vectors into binary vectors, this would mean mapping a state to a future state of ever-increasing dimensionality. But while the state vector is increasing in dimensionality, its compressed description, the information in the state vector, is decreasing in dimensionality. As long as the lost syntactic information didn’t cut into the semantic information of the system, this would result in the spontaneous emergence of meaningful structure within the system. This loss of non-semantic information could be thought of in terms of evolutionary selection. However, the source of that information would not be random variation, but an information-rich initial state linked to a consciousness or controller outside the boundaries of the system (i.e. space/time).

Topics: evolution | information | entropy

