Entropy

There are many definitions of entropy and, although they look quite different, they all turn out to be roughly equivalent.

Entropy as a concept was developed by Clausius in the 19th century, in an attempt to understand the behavior of heat engines. Clausius defined entropy as a measure of the energy in a system that is unavailable for doing work, and postulated that it will always increase with time in an isolated system.

What this means, from a practical point of view, is that an engine will always waste some of its energy. In a car engine, for example, part of the energy from burning the petrol goes into useful work that moves the car. However, part of it inevitably goes into heating up the engine, and it is this energy that is impossible to recover as useful work.

Even though this definition is still widely used in Chemistry and Physics, there is another definition, proposed by Boltzmann also in the 19th century, which gives the concept a more intuitive twist. It is from this definition that the usual description of entropy as “disorder” comes.

Boltzmann’s definition states that entropy measures the number of microscopic states compatible with a given macroscopic state. Like Clausius’s entropy, Boltzmann’s also increases over time; in this case, however, it is easy to see why.
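In modern notation Boltzmann’s insight is usually written as S = k·ln W, where W is the number of microstates compatible with the macroscopic state and k is Boltzmann’s constant: the entropy is the logarithm of the count of microstates, so that when the number of possibilities multiplies, the entropy simply adds.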

Let’s suppose we have a gas contained in a very small volume. We are interested in how many possible configurations of the molecules are compatible with our measuring the same volume, temperature and pressure. The answer will, of course, be quite high: for example, we can take any molecule and swap it with a neighbor, and keep doing so until we run out of molecules, without changing anything we can measure.

However, if we make the volume much larger, it’s easy to see that the number of molecular configurations compatible with the macroscopic state becomes much greater. Why? Because the volume available for swapping and moving molecules around has increased substantially.

If we have a gas in a container, then, we should expect it to expand to fill the whole volume of the container. There is nothing in Newton’s laws that demands this: the molecules could just happen to be all in one corner. However, there are so many more ways for the molecules to be spread over the whole volume that, in practice, the chance of finding them all in a corner is almost nil.
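To put a rough number on it: imagine dividing the container into two halves and suppose each of N molecules is equally likely to be found in either half. The chance that all of them happen to sit in the same half at the same moment is (1/2)^N. Even for just 100 molecules that is already about one in 10^30; for a realistic gas, with something like 10^23 molecules, the chance is so small that we will simply never see it happen.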

In this view, the second law of thermodynamics can be deduced from simple statistics: all it says is that more likely things tend to happen, given enough time.

There is yet one more definition of entropy which does not seem related to physical entropy, but which can actually be shown to be. This is the so-called Shannon entropy.

Shannon entropy is a mathematical concept. It is a measure of the information in a system, and is thus measured in bits.

Let’s suppose we have a string of bits. If the string is totally random, there is no way we could know whether the next bit will be 0 or 1. Therefore, before reading the bit, we have no information and, after doing so, we will have 1 bit of information. In this case, each bit in the string is said to have an entropy of 1.

Now let’s imagine the opposite case: we have a string of bits, but it’s just a chain of 1s. In this case, even before we see the next bit, we know it will be a 1. There’s no uncertainty in the next bit, therefore it can give us no information. Its Shannon entropy is 0.
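These two cases are the extremes of Shannon’s formula. For a source that emits a 1 with probability p and a 0 with probability 1 - p, the entropy per bit is H = -p·log2(p) - (1 - p)·log2(1 - p), which equals 1 when p = 1/2 (the completely random string) and 0 when p = 1 (the string of all 1s).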

There are many cases in between. Most computer files, for example, exhibit some kind of regularity, but not enough that we can know each bit before we see it. Therefore, their Shannon entropy is higher than 0, but typically less than the file length in bits. We can see the Shannon entropy of a string as the minimum length in bits to which it may be compressed.
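As a rough illustration of the two extreme cases above, here is a minimal Python sketch that estimates the entropy per bit of a string by looking only at the frequencies of 0s and 1s (the function name and the test strings are just illustrative):

import math
import random
from collections import Counter

def entropy_per_bit(bits):
    # Estimate the Shannon entropy per symbol from the observed frequencies
    # of the characters in the string (here, '0' and '1').
    counts = Counter(bits)
    total = len(bits)
    return sum(-(n / total) * math.log2(n / total) for n in counts.values())

random_bits = ''.join(random.choice('01') for _ in range(10000))
all_ones = '1' * 10000

print(entropy_per_bit(random_bits))  # close to 1 bit per symbol
print(entropy_per_bit(all_ones))     # 0 bits per symbol: no surprise left

A real file typically lands somewhere between these two printed values. An estimate based only on symbol frequencies also misses longer-range patterns, which is why practical compressors can often do better than it suggests.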

This definition may seem detached from Physics, but the concept of physical entropy is actually very closely related to information. For example, given a gas in a big volume, the randomness, and therefore the uncertainty, in the position of each individual molecule is much greater: specifying the position of each one therefore requires much more information than it would for a smaller volume.
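To make that quantitative, at least roughly: at a fixed measurement resolution, doubling the volume available to the gas doubles the number of places each molecule could be, so pinning down each position takes about one extra bit, and describing the whole gas takes about one extra bit per molecule. The growth in the information needed mirrors the growth in the physical entropy.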

Recent developments in black hole thermodynamics have led to remarkable discoveries in theoretical physics, the pinnacle of which is the holographic principle. The link between entropy and information was a crucial step in that process.

For more information about entropy, visit myoutsourcedbrain.com.
