Wednesday, August 31, 2005

The meaning of Entropy 

Jean-Bernard Brissaud
Clarifying the meaning of entropy led us to distinguish two points of view: the external one, that of an observer of the studied system, and the internal one, that of the system itself.

The external point of view leads to widely accepted associations: entropy as a lack of information, or as indeterminacy about the microscopic state of the studied system.

The internal point of view, the one we would have if we were the studied system, leads to interpretations that are more rarely seen, yet useful. Entropy is then seen as a measure of information, or of freedom of choice.

These two interpretations fit well together and are tied by the duality of their common unit, the bit: a bit of information represents one possibility out of two, while a bit of freedom represents one choice out of two.
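The bit as a common unit can be made concrete with Shannon's entropy formula, H = -Σ p log₂(p). The following sketch (our own illustration, not code from the paper) shows that one possibility out of two corresponds to exactly one bit, whether read externally as indetermination or internally as freedom of choice:

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin: one possibility out of two.
# Externally: one bit of missing information about the outcome.
# Internally: one bit of freedom of choice.
print(entropy_bits([0.5, 0.5]))   # 1.0 bit

# Four equally likely states = two independent binary choices.
print(entropy_bits([0.25] * 4))   # 2.0 bits

# A single certain state: no indetermination, no freedom.
print(entropy_bits([1.0]))        # 0.0 bits
```

A distribution concentrated on one state gives zero bits from either point of view, which is why the two readings share the same unit.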

. . .

Entropy is often equated with disorder, and this conception seems to us inappropriate. Temperature, instead, is a good measure of disorder, since it measures molecular agitation: the part of the motion which does not contribute to a possible global motion.
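The link between temperature and molecular agitation can be stated precisely through the equipartition theorem (a standard result of kinetic theory, not derived in the post). For a monatomic ideal gas, the mean translational kinetic energy per molecule, measured in the center-of-mass frame so that any global motion is excluded, is

$$\left\langle \tfrac{1}{2} m v^2 \right\rangle = \tfrac{3}{2} k_B T,$$

where $m$ is the molecular mass, $v$ the molecular speed relative to the center of mass, and $k_B$ Boltzmann's constant. Temperature thus directly quantifies the random part of the motion, which is the sense in which it measures disorder here.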

Equating entropy with disorder also leads to another, unwise definition of order as the absence of freedom, since entropy measures freedom.

Topics: Entropy | Information | Meaning
