Critical Questions:
- What is entropy?
In order to gain a full appreciation for energy, the last idea you’ll need to understand is something called entropy. It’s kind of a tough one, but I guarantee it’ll be worth it if you persevere to the end of this chapter!
The reason I’ve saved it for last is that many people find entropy quite difficult to understand. There are two reasons for this: one is that it’s another purely abstract concept; the other is that there are about twenty different ways to define entropy, and they all seem very different from each other.
Luckily for us, however, there is one easy way to understand entropy that we can explain with a simple example.
Take a large cardboard box, a can of red paint, and a can of blue paint.[1. Note: the actual colours are not important. Let your aesthetic sense be your guide.] We’re going to paint the inside of the box in these two colours: red for the left-hand side, and blue for the right.
Once the paint has dried, grab 8 ping pong balls again[2. This is no coincidence. I will not hide the fact that I like ping pong balls.] and put them into the box.
What we’re going to do now is grab the box, close our eyes (mostly for effect), and shake the box around so that the balls bounce around inside. Then, still with our eyes closed, we’re going to put the box back down onto a flat surface.
When you look at the box again, what do you expect to see? In real life, the ping pong balls will probably have rolled over to one side, because it is difficult to keep the box completely flat while you’re shaking it around. But if the box had been kept flat, and the balls had had a chance to roll around a bit, we should probably expect to see about half of them on the red side and the rest on the blue.
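(If you'd rather let a computer do the shaking, here's a quick Python sketch of the experiment. It assumes each ball independently lands on either side with equal chance – which is just the "completely flat box" assumption in code form.)

```python
import random
from collections import Counter

def shake_box(num_balls=8):
    """One 'shake': each ball independently lands on red or blue."""
    return sum(random.choice("rb") == "r" for _ in range(num_balls))

# Shake the box 10,000 times and tally how many balls ended up on red.
tallies = Counter(shake_box() for _ in range(10_000))
for reds in sorted(tallies):
    print(f"{reds} on red, {8 - reds} on blue: {tallies[reds]} shakes")
```

You should see the roughly-half-and-half outcomes come up far more often than the extremes.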
Why is this? Unfortunately, we’ll need to use a bit of statistical math to answer that question.
First, let’s figure out how likely it is to find one ball on blue and seven on red (let’s call this a “7/1” state). There are 8 balls, and any of them could be the one on the blue side, so we could say that there are 8 possible configurations that result in a 7/1 state.[3. People with a knowledge of statistical mathematics will know that I’m describing these ping pong balls as distinguishable, whereas many particles can be considered indistinguishable. However, I think keeping the balls distinguishable will make the example easier to understand for people who don’t feel comfortable with the math.]
Now we can consider a 4/4 state. If we imagine the balls as being labelled ‘A’ through ‘H’, one arrangement would have A, B, C, D on red and E, F, G, H on blue. Or we could have A, B, C, E on red and D, F, G, H on blue. Or A, B, C, F on red and D, E, G, H on blue…
You may already have guessed that there are many possible combinations here. In fact, the total number of 4/4 combinations is 70 – quite a bit more than the number of 7/1 combinations.
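(If you don't feel like writing out all 70 combinations by hand, here's a little Python sketch that lists them, using the same 'A' through 'H' labels as above:)

```python
from itertools import combinations

balls = "ABCDEFGH"  # our 8 labelled ping pong balls

# A 7/1 arrangement is fixed by choosing the 1 ball that sits on blue.
print(len(list(combinations(balls, 1))))  # 8

# A 4/4 arrangement is fixed by choosing the 4 balls that sit on red.
red_sides = list(combinations(balls, 4))
print(len(red_sides))  # 70
print(red_sides[:3])   # ABCD, ABCE, ABCF (the same examples as above)
```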
I’ll do the rest of the calculations and put them in a little chart below. (One I haven’t mentioned yet, but which might be obvious, is that there is only one possible configuration for the 8/0 state – i.e., all of the balls on the red side and none on the blue. Note that I’ve only listed the states from 8/0 down to 4/4: the “mirror” states like 1/7 and 0/8 have exactly the same counts as their twins, so the blue-heavy side of the story has nothing new to tell us.)
State | Number of Configurations
---|---
8/0 | 1
7/1 | 8
6/2 | 28
5/3 | 56
4/4 | 70
Total: | 163
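(For the curious: you don't have to trust my arithmetic here. Python's `math.comb` counts exactly these "choose k balls out of 8" configurations, so a few lines will reproduce the whole chart:)

```python
import math

# Each state with b balls on blue corresponds to choosing b of the 8 balls.
for blue in range(5):
    print(f"{8 - blue}/{blue}: {math.comb(8, blue)} configurations")

print("Total:", sum(math.comb(8, blue) for blue in range(5)))  # 163
```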
If the box is indeed completely flat and the ping pong balls have no reason to be on either side of the box other than pure chance, then any one configuration is just as likely as any other. So looking at the box after shaking it is essentially the same as writing each of the configurations from the chart on its own piece of paper, putting all of the papers in a hat, and pulling one out.
What are the chances that you’ll pull out a configuration matching an 8/0 or 7/1 state? Pretty low. The hat is stuffed with papers matching more evenly distributed states. And almost half of the papers are 4/4 configurations.
This means – and this is a rather important point for both thermodynamics and quantum mechanics – that the probability of finding the box in a particular state is proportional to the number of possible configurations of that state.
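(Here's a tiny sketch of those hat-drawing odds, with each of the chart's 163 configurations written on its own paper:)

```python
import math

counts = {f"{8 - b}/{b}": math.comb(8, b) for b in range(5)}
papers = sum(counts.values())  # 163 papers in the hat

for state, count in counts.items():
    print(f"Chance of pulling a {state} paper: {count}/{papers} = {count / papers:.1%}")
```

The 4/4 state really does account for almost half of the hat, while 8/0 comes in under one percent.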
We can now define entropy: it is a measure of how likely a particular state is, and, as we’ve just seen, that likelihood is tracked by the number of possible configurations that would result in that state. The 4/4 state has a very high entropy, while the 8/0 and 7/1 states have low entropy.
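(A note for the curious: in physics this intuition is captured by Boltzmann's formula, S = k·ln(W), where W is the number of configurations of a state and k is a constant. Here's a rough sketch of that idea for our box, with the constant simply set to 1:)

```python
import math

# Entropy in the spirit of Boltzmann's S = k * ln(W), with k = 1:
# the log of the number of configurations (W) of each state.
for blue in range(5):
    W = math.comb(8, blue)
    print(f"{8 - blue}/{blue}: W = {W:2d}, entropy = {math.log(W):.2f}")
```

The 8/0 state, with its single configuration, gets an entropy of exactly zero, and 4/4 comes out on top.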
This example is a simplified way of thinking about something like the distribution of air molecules in a room. But the numbers there are very different – a regular-sized room holds something like 10²⁷ air molecules (that’s a 1 with 27 zeroes after it, which is quite a lot). So we have to increase our numbers: for example, with 10 ping pong balls, the 5/5 state has 252 configurations. For 20 balls, the 10/10 state has 184 756 configurations.
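(Again, `math.comb` will happily check these for you, and hint at how quickly the numbers grow:)

```python
import math

print(math.comb(10, 5))    # 252
print(math.comb(20, 10))   # 184756
print(math.comb(100, 50))  # already a 30-digit number
```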
As you can see, these numbers are getting quite big. In fact, I don’t have access to a computer powerful enough to calculate the number of configurations of the half/half state when 10²⁷ ping pong balls are involved. Suffice it to say that it is a ridiculously large number. Meanwhile, for any number of balls, the number of configurations with all of the particles on one side of the box is still 1. The upshot of this is that the chances of all of the air molecules in a room spontaneously squishing together on only one side of the room (or 90% of the particles, or even 80%) are mind-bogglingly small – so small that it will almost certainly never happen.
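(We can at least estimate how big that ridiculous number is. The sketch below uses the standard log-gamma trick for sizing up huge "n choose k" values without ever writing them out in full:)

```python
from math import lgamma, log

def digits_in_comb(n, k):
    """Approximate number of decimal digits in 'n choose k' via log-gamma."""
    log_comb = lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
    return log_comb / log(10)

# Digits in the half/half configuration count for a room's worth of balls:
print(f"{digits_in_comb(1e27, 5e26):.3g}")  # roughly 3e+26 digits(!)
```

So the half/half count isn't just big: merely writing it down would take about 3 × 10²⁶ digits. Against that, the lone all-on-one-side configuration doesn't stand a chance.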
Another thing this example demonstrates is that states in which things are more spread out (rather than clustered) tend to have higher probability, and thus higher entropy. This matches up perfectly with what we know of real life. If you dump a pile of bricks on the ground, you wouldn’t expect them to land stacked up in a perfectly straight brick wall: instead, they’ll spread out in a big pile. One way to explain why this happens is that there are many more configurations in which the bricks are scattered around randomly than ones in which they’re stacked neatly, so the spread-out state has more entropy, and is therefore more likely.
I hope you’re getting a glimpse into the subtle power of this idea of entropy. In fact, this kind of statistical work forms the basis for thermodynamics, which in turn forms the basis for much of quantum physics. In the next section, we’ll talk more about the implications of the high probability of high entropy, and why it’s got us stuck in a world full of inefficiency.
Big Ideas:
- Entropy is a measure of the statistical probability of a state.
Next: 5.7 – Entropy, Part 2: Efficiency
Previous: 5.5 – Heat and Temperature