Critical Questions:
- Why can’t we build a machine with 100% efficiency?
After a shamefully long delay, let’s take a look at the consequences of entropy. In the previous section, I described entropy as a measure of the statistical probability of a state.
One of the most significant results of this kind of thinking is that, because high-entropy states are more likely than low-entropy ones, the total entropy of the universe will always tend to increase over time. We have to say “tend to” here because things like all of the air particles in a room jumping to one side can, technically, happen. But because every process obeys this statistical reasoning, a spontaneous decrease in entropy is so unlikely that it is essentially impossible, and so in practice the total entropy of everything is always increasing.
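If you like numbers, here’s a quick back-of-the-envelope check of just how unlikely that air-on-one-side scenario is, using a deliberately oversimplified model where each particle is equally likely to be found in either half of the room. With N particles, the probability that they all end up on one side is (1/2)^N:

```python
# Chance that all N particles happen to sit in the same half of the room,
# treating each particle as an independent 50/50 coin flip (a big simplification).
for n in (10, 100, 1000):
    print(f"N = {n:>4}: probability = {0.5 ** n:.3e}")

# N = 10   -> about 1 in 1000: rare, but imaginable.
# N = 1000 -> about 1e-302: already "essentially impossible",
#             and a real room holds something like 10^27 particles.
```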
Here’s where we can connect things back to energy. The first connection: if you want to reduce the amount of entropy in a certain area – like arranging a pile of bricks into a wall – you have to expend some energy.
The second connection between energy and entropy involves “useful” energy. Imagine a bunch of air particles concentrated on one side of a room (a low-entropy state). They will start bumping into each other and (obeying the law of increasing entropy) spread out around the room. At the beginning (the low-entropy state), as they all move from one side to the other, they could push a fan, and that turning motion could be used to power a light bulb. But once the air is spread evenly around the room (a high-entropy state), the random motion of the particles pushes the fan blades equally in both directions, and the fan won’t turn at all.
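If you’d like to watch this happen, here’s a little toy simulation – a one-dimensional ‘room’ with invented numbers, nothing like a real gas – where particles start on the left and wander randomly. The ‘net flow’ counts how many particles cross the midpoint from left to right minus how many cross back, a rough stand-in for the one-way push that could turn our fan.

```python
import random

random.seed(1)

N_PARTICLES = 2000
STEP_SIZE = 0.05     # how far a particle wanders each step (arbitrary units)
STEPS = 1000

# Low-entropy start: every particle crammed into the left half of the room [0, 1].
positions = [random.uniform(0.0, 0.5) for _ in range(N_PARTICLES)]

def wander(positions):
    """Move each particle one random step (bouncing off the walls) and count
    the net number of left-to-right crossings of the midpoint, where the fan sits."""
    net_flow = 0
    new_positions = []
    for x in positions:
        new_x = x + random.choice([-STEP_SIZE, STEP_SIZE])
        if new_x < 0.0:        # bounce off the left wall
            new_x = -new_x
        elif new_x > 1.0:      # bounce off the right wall
            new_x = 2.0 - new_x
        if x < 0.5 <= new_x:
            net_flow += 1      # crossed left -> right (pushes the fan forward)
        elif new_x < 0.5 <= x:
            net_flow -= 1      # crossed right -> left (pushes it back)
        new_positions.append(new_x)
    return new_positions, net_flow

for step in range(1, STEPS + 1):
    positions, net_flow = wander(positions)
    if step in (1, 10, 100, 1000):
        frac_left = sum(x < 0.5 for x in positions) / N_PARTICLES
        print(f"step {step:4d}: net flow = {net_flow:+4d}, "
              f"fraction on left = {frac_left:.2f}")
```

Early on, nearly every crossing is left-to-right and there’s a clear one-way push; once the particles are spread evenly, crossings in the two directions cancel out and the net flow just jitters around zero.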
What this shows us is that low-entropy states have energy we can use, while high-entropy states don’t. In this sense, entropy can also be thought of as a measure of the amount of ‘unusable’ energy of a state.
This is why the production of heat counts as an increase in entropy. Heat is “unusable” energy because it always spontaneously flows from hot things to cold things. If you wind up a spring and leave it for a few days, you can still come back and get your elastic potential energy out of it. But if you store some energy as the vibrations of particles inside a hot coffee cup and then leave the room for a bit, it’ll be spread out uselessly around the room when you come back.
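To put some entirely made-up numbers on that coffee cup: the energy doesn’t disappear when the cup cools, but once it’s shared with the much bigger room, the temperature difference that made it useful is almost gone. Here’s a rough sketch, assuming simple invented heat capacities for the cup and the room:

```python
# Toy heat-sharing calculation with invented heat capacities (J per degree C).
cup_capacity = 500.0       # the coffee and its cup
room_capacity = 50_000.0   # the air and walls of the room
cup_temp = 80.0            # deg C
room_temp = 20.0           # deg C

# Energy is conserved, so the shared final temperature is a weighted average.
final_temp = (cup_capacity * cup_temp + room_capacity * room_temp) \
             / (cup_capacity + room_capacity)

print(f"final temperature: {final_temp:.1f} C")   # about 20.6 C
# The energy hasn't vanished -- it's just spread so thinly around the room
# that there's no temperature difference left to drive anything useful.
```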
The effects of increasing entropy are unavoidable, and one important result of this fact is that in everything we do, there will always be some energy lost to entropy. For example, when an engine is running, it will always experience some friction in its mechanisms. Friction involves collisions between particles (as I’ve already mentioned), and so the particles involved tend to heat up, much like what happened with the bucket of warm water in a cool room. So the engine is only trying to push some pistons, but along the way it also transfers some heat energy into the metal, which then spreads out into the air around the engine. In other words, it ‘loses’ some of its energy in the form of heat.
We try to reduce the amount of friction in the engine using lubricants, but we can never avoid it entirely. Again, heat production is an example of an increase in entropy, which always tends to happen. And in any system we develop which uses energy to do some kind of work, there will always be some similar kind of entropy increase. Thus, no process can ever be 100% efficient.
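That last claim is really just a statement about an energy budget: efficiency is the useful work you get out divided by the energy you put in, and the heat lost to friction guarantees the answer is below 100%. Here’s the arithmetic with invented numbers for a single engine cycle:

```python
# A made-up energy budget for one engine cycle (illustrative numbers only).
energy_in = 100.0                     # joules of chemical energy from the fuel
useful_work = 30.0                    # joules that actually push the pistons
heat_lost = energy_in - useful_work   # the rest spreads out as heat

efficiency = useful_work / energy_in
print(f"heat lost:  {heat_lost:.0f} J")      # 70 J
print(f"efficiency: {efficiency:.0%}")       # 30% -- always less than 100%
```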
I’d like to end this section with a short parable about energy use and entropy. Imagine that you live in a world where money can’t always buy you the things you need. And you want to build a platform that can lift people up a short cliff – in other words, an outdoor elevator. Your first idea is to build a simple pulley system and have your friends at the top of the cliff pull the ropes. This works fine for a few minutes, but it’s a popular cliff, and pretty soon you find that you don’t have enough food to feed your friends so that they’ll have enough energy to pull the ropes. In order to feed them, you’d have to plant fields, harvest grains, and then cook meals every day.
Luckily, you come up with a way around this problem: you tell your friends to tie big, heavy boulders to the other end of the rope. The boulders will fall, and the elevator will go up. Easy! What you’ve done is convert the gravitational potential energy of the boulders into kinetic energy as the system moves, and then that kinetic energy turns back into gravitational potential energy for the elevator and its riders. In a looser sense, the low-entropy rocks clustered at the top of the cliff provided useful energy when they were allowed to spread out along the ground at the bottom.
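Here’s the energy bookkeeping for one ride of the boulder elevator, with numbers I’ve made up for illustration. Notice that, because of friction, the boulder has to give up a bit more energy than the riders actually gain:

```python
# Rough energy budget for one elevator ride (all numbers invented).
g = 9.8                 # m/s^2
cliff_height = 10.0     # metres
rider_mass = 150.0      # kg: the platform plus a couple of passengers
friction_loss = 0.2     # assume ~20% of the energy ends up as heat in the pulley

energy_needed = rider_mass * g * cliff_height              # PE gained by the riders
energy_from_boulder = energy_needed / (1 - friction_loss)  # boulder must supply extra
boulder_mass = energy_from_boulder / (g * cliff_height)

print(f"energy gained by the riders: {energy_needed:.0f} J")
print(f"boulder mass needed:         {boulder_mass:.0f} kg per ride")
```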
Of course, your friends had to work pretty hard to get those boulders in place, and pretty soon they’ll find that there aren’t enough boulders nearby to work the elevator. So you decide to build a truck to bring boulders from far away to your cliff.[1. If you were a bit brighter, you might realize that once you had a truck, or even just a gasoline engine, you could just use that to pull the rope up. The point of the story still holds, but in the meantime I’ll ask you to smarten up a bit.] This seems like a wonderful idea until you realize that this would require mining metals from the earth, melting them into the shapes you need, harvesting rubber from rubber trees in South America for tires, getting oil out of the ground to put into your engine…
This simple analogy sounds a lot different from how our world works, of course, but the basic idea is the same. As we know from the previous sections, there is no such thing as “free” energy – even a system designed to make use of natural things like falling rocks will eventually run out of rocks. And according to our ideas about entropy, every system we develop to do some work for us will always lose some energy along the way, usually in the form of heat. Whenever we design a mechanism to ‘save’ us energy, that mechanism will itself be losing energy, which we’ll just have to make up for in other ways.
Big Ideas:
- Entropy is:
- a measure of the statistical probability of a state.
- a measure of the amount of ‘unusable’ energy of a state.
- a measure of how ‘spread out’ the energy of a state is.
- High-entropy states (which tend to involve energy being more ‘spread out’) are much more likely than low-entropy states.
Next: 6.1 – Introduction to Waves and Sound
Previous: 5.6 – Entropy, Part 1: What Is It?