Thermodynamics and Entropy

We all know that there are certain things that just can’t happen in the real world. For example, broken teacups don’t just reassemble themselves. If you saw this happen in a movie, you’d know that this impossible feat had been accomplished by running the film backwards.

In physics class, we often talk about different laws of conservation. These tell us how systems behave in the physical universe. One of the laws that we return to over and over, at both the microscopic and macroscopic levels, is conservation of energy. For mechanical systems, we talk about the total energy of a system as being composed of its kinetic and potential energies.

Today, we’re going to extend these ideas to thermodynamics, the study of processes where energy is transferred as work and as heat. We’ll talk about a system (the thing(s) in which we’re interested) and its external environment.

First Law of Thermodynamics: When energy enters or leaves a system (as heat or as work), the internal energy of the system changes according to the law of conservation of energy.

In other words, the change in internal energy ΔU of a system is equal to the heat Q added to the system minus the work W done by the system:

ΔU = Q − W.

Here we think of heat as a transfer of energy due to a difference in temperature, while work is a transfer of energy that is not due to a temperature difference. So suppose I had a system, and I added 2000 Joules of heat to it, and did 1000 Joules of work on it. Since that work is done on the system rather than by it, it counts as W = −1000 J of work done by the system, and the change in internal energy is:

ΔU = Q − W = 2000 J − (−1000 J) = 3000 J.
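
If you like to see this bookkeeping in code, here is a minimal sketch in Python (the function name first_law_delta_u is my own, made up for illustration; the sign convention is the one above, with W the work done by the system):

    # First law of thermodynamics: delta_U = Q - W,
    # where Q is the heat added TO the system (J)
    # and W is the work done BY the system (J).
    def first_law_delta_u(q_added, w_by_system):
        return q_added - w_by_system

    # 2000 J of heat added; 1000 J of work done ON the system,
    # so the work done BY the system is -1000 J.
    print(first_law_delta_u(2000.0, -1000.0))  # 3000.0

The only subtlety the code captures is the sign convention: work done on the system enters as a negative value of W.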

This example is trivial, but it gives us the flavor of the first law. A shorthand way of remembering the first law is “you can’t win”, meaning the energy of a system is conserved no matter what. Energy can be neither created nor destroyed.

Where we really want to get to today is the second law, which explains why some things happen in nature and others do not. For example, if you put a hot object in contact with a cold object, heat will always flow from the hot object to the cold object, never from the cold object to the hot object (there are some very deep ideas here about reversible and irreversible processes, which we’re glossing over for now). The second law of thermodynamics formalizes this:

Second Law of Thermodynamics (original formulation): Heat can flow spontaneously from a hot object to a cold one, but never the reverse.

Second Law of Thermodynamics (generalized): The total disorder, or entropy, of a system and its environment increases as a result of any natural process.

At this point, you’re probably wondering how we got from the first formulation of the second law to the second, so we’ll look at that more carefully now.

In the case of the hot and cold objects, we begin with an ordered system having two distinct classes of molecules: the molecules of the hot object have a higher average kinetic energy than the molecules of the cold object. When the objects are placed in contact, the hot one transfers heat to the cold one, never the reverse, until the whole system reaches a uniform temperature, a more disordered state from which no further work can be extracted. We call the measure of a system’s disorder its entropy.
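
We can put a number on this with a standard textbook calculation (an idealized sketch of my own, not something we derived above): take two identical blocks with constant heat capacity C, one hot and one cold, and let them come to equilibrium. The entropy change of a block whose temperature moves from T_i to T_f is C ln(T_f/T_i), and the total change always comes out positive:

    import math

    # Idealized model: two identical blocks with constant heat
    # capacity C (J/K) exchange heat until their temperatures match.
    C = 1000.0                     # assumed heat capacity, J/K
    t_hot, t_cold = 400.0, 200.0   # initial temperatures, K

    # Energy conservation (first law) fixes the final temperature.
    t_final = (t_hot + t_cold) / 2

    # Entropy change of a block going from T_i to T_f: C * ln(T_f / T_i).
    ds_hot = C * math.log(t_final / t_hot)    # negative: hot block loses entropy
    ds_cold = C * math.log(t_final / t_cold)  # positive: cold block gains more
    print(ds_hot + ds_cold)                   # about +117.8 J/K, always > 0

The hot block’s entropy decreases, but the cold block’s increases by more, so the total entropy of the combined system goes up, exactly as the second law demands.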

A shorthand way of thinking of the second law is “you can’t even break even”. The total entropy of a system and its environment never decreases: it can stay the same for reversible processes, but otherwise it increases.

We can conclude, then, that the term entropy carries several related ideas. It can describe the amount of a system’s energy that is unavailable to do work. It can also describe the number of arrangements the atoms of a system can take on, which leads naturally to the idea that entropy is tied to probability and randomness.
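
To see the counting picture concretely, here is a small sketch of my own (not part of the lecture proper): imagine N gas molecules in a box, and count the ways W of putting n of them in the left half, which is the binomial coefficient C(N, n). Boltzmann’s formula S = k ln W then turns that count into an entropy, and the evenly split, “most disordered” arrangement wins:

    import math

    K_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(n_total, n_left):
        # W = number of microstates with n_left molecules in the left half
        w = math.comb(n_total, n_left)
        return K_B * math.log(w)    # S = k ln W

    N = 100
    for n_left in (0, 25, 50):
        print(n_left, boltzmann_entropy(N, n_left))
    # The even split (n_left = 50) has the most arrangements,
    # and therefore the highest entropy.

This is why the uniform, mixed-up state is overwhelmingly the most probable one: it simply has vastly more arrangements than any ordered alternative.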

From everyday experience, we all know that there are irreversible processes in our world, like the breaking of a teacup. Entropy gives us a useful way of thinking about these. Physicists and engineers use entropy to make models of the universe, and to design efficient car engines and power plants. Entropy is, in fact, an essential feature in the design of modern communication systems, a topic to which we turn next.