Entropy (Thermodynamics): Definition, Formula & Examples
You are likely familiar with the idea that heat always seems to flow from hot objects to cold objects, and not the other way around. Likewise, after two things are mixed together, they will not unmix no matter how much more you stir.
A broken teacup will not spontaneously reassemble itself, and milk spilled out of the bottle will not easily be recovered. The reason behind all of these phenomena has to do with the second law of thermodynamics and a concept called entropy.
To best understand entropy, you must first know some of the fundamental concepts of statistical mechanics: microstates and macrostates.
Microstates and Macrostates
In statistical mechanics, a microstate is one possible arrangement (and thermal energy or internal energy distribution, if applicable) of the particles in a closed system that may occur with some probability.
One of the simplest examples of this is a set of two-sided coins, each of which can show either heads or tails. If there are two identical coins, there are four possible microstates of the system: coin 1 is heads and coin 2 is tails, coin 1 is tails and coin 2 is heads, both coins are heads, and both coins are tails.
If the coins are constantly being simultaneously flipped (similar to the molecules in a gas constantly moving around), each microstate can be considered a possible "snapshot" of the system at a single point in time, with each microstate having a certain probability of occurring. In this case, the probability of all four of these microstates is equal.
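To make this concrete, here is a minimal Python sketch (an illustration, not part of the original example) that enumerates the four microstates of two fair coins and their equal probabilities:

```python
from itertools import product

# Enumerate every microstate of two fair coins: each coin is H or T.
microstates = list(product(["H", "T"], repeat=2))

# With fair, independent coins, every microstate is equally likely.
for state in microstates:
    print(state, "probability =", 1 / len(microstates))  # 0.25 each
```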
As another example, imagine a brief snapshot of the molecules of gas in a balloon: their energies, their locations, their velocities, all taken at one single instant. This is a possible microstate of this particular system.
A macrostate is the set of all possible microstates of a system consistent with given values of its state variables. State variables are variables that describe the overall state of the system, regardless of how it got to that state (whether by different arrangements of molecules, or by different possible paths taken by a particle to get from an initial state to a final state).
For the balloon, possible state variables are thermodynamic quantities such as temperature, pressure and volume. A macrostate of the balloon is the set of every possible instantaneous picture of the gas molecules that could result in the same temperature, pressure and volume for the balloon.
In the case of the two coins, there are three possible macrostates: One where one coin is heads and one is tails, one where both are heads, and one where both are tails.
Notice that the first macrostate contains within it two microstates: coin 1 heads with coin 2 tails, and coin 1 tails with coin 2 heads. These microstates are essentially different possible arrangements of the same macrostate (one coin heads and one coin tails). They are different ways to get the same state variable, where the state variable is the total number of heads and the total number of tails.
The number of possible microstates in a macrostate is called that macrostate's multiplicity. For systems with millions, billions or more particles, such as the gas molecules in a balloon, the multiplicity of any given macrostate is unmanageably large.
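To get a feel for how quickly multiplicity grows, here is a minimal Python sketch using the standard library's math.comb; the coin counts are arbitrary choices for illustration:

```python
from math import comb

# Multiplicity of a "number of heads" macrostate for N fair coins:
# comb(N, heads) counts the microstates with exactly that many heads.
N = 100
for heads in (0, 25, 50):
    print(f"{heads} heads out of {N}: multiplicity = {comb(N, heads):e}")
```

Even for just 100 coins, the most probable macrostate (50 heads) already has roughly 10^29 microstates; a balloon's worth of gas molecules is unimaginably further beyond that.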
This compression of countless microstates into a few state variables is what makes macrostates useful, and it is why macrostates are generally what is worked with in a thermodynamic system. But microstates are important to understand for entropy.
Definition of Entropy
The entropy of a system is directly related to its number of possible microstates. It is defined by the formula \(S=k\ln{\Omega}\), where Ω is the number of microstates consistent with the system's macrostate (its multiplicity), k is the Boltzmann constant and ln is the natural logarithm.
This equation, as well as a great deal of the field of statistical mechanics, was created by the Austrian physicist Ludwig Boltzmann. Notably, his theories, which treated gases as statistical systems made up of a large number of atoms or molecules, came at a time when it was still controversial whether atoms even existed. The equation
\(S=k\ln{\Omega}\)
is engraved on his tombstone.
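As a hedged sketch of how the formula is applied, the snippet below computes the Boltzmann entropy of the "50 heads out of 100 coins" macrostate counted earlier; treating coins as a thermodynamic system is, of course, only a toy analogy:

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(multiplicity):
    """S = k * ln(Omega) for a macrostate with the given multiplicity."""
    return k * log(multiplicity)

# Multiplicity of the "50 heads out of 100 coins" macrostate.
omega = comb(100, 50)
print(f"S = {boltzmann_entropy(omega):.2e} J/K")  # about 9.2e-22 J/K
```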
The change in entropy of a system as it moves from one macrostate to another can be described in terms of state variables:
\(\Delta S=\int\frac{dQ}{T}\)
where T is the temperature in kelvin and dQ is the heat in joules exchanged in a reversible process as the system changes between states.
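As a worked illustration (the mass is an assumed value, and the specific heat of water is approximate and treated as constant), the entropy change for reversibly heating water follows by integrating dQ = mc dT:

```python
from math import log

# Reversibly heating water at constant pressure with constant specific
# heat: dQ = m * c * dT, so Delta S = integral(dQ / T) = m * c * ln(T2 / T1).
m = 1.0                   # mass in kg (assumed)
c = 4186.0                # specific heat of water in J/(kg*K) (approximate)
T1, T2 = 293.15, 373.15   # 20 C to 100 C, expressed in kelvin

delta_S = m * c * log(T2 / T1)
print(f"Delta S = {delta_S:.0f} J/K")  # about 1010 J/K
```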
The Second Law of Thermodynamics
Entropy can be thought of as a measure of disorder or the randomness of a system. The more possible microstates, the larger the entropy. More microstates essentially means there are more possible ways of arranging all of the molecules in the system that look pretty much equivalent on a larger scale.
Think of the example of trying to unmix something that's been mixed together. There are an absurd number of microstates in which the materials remain mixed, but only very, very few in which they are perfectly unmixed. Therefore, the probability of another stir causing everything to unmix is vanishingly small; for all practical purposes, the unmixed state is never realized.
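A rough Python sketch of just how vanishing: in a simplified model where each of N mixed particles is independently equally likely to be on either side of a container, the chance of finding them all back on their original side in any one snapshot is (1/2)^N:

```python
# Probability that all N particles spontaneously return to their
# original side of the container in any one snapshot: (1/2) ** N.
for N in (10, 100, 1000):
    print(f"N = {N:4d}: probability ~ {0.5 ** N:.2e}")
```

Real mixtures involve on the order of 10^23 particles, so the probability is effectively zero.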
One of the most important laws of thermodynamics, the second law, states that the total entropy of the universe (or of any perfectly isolated system) never decreases. That is, entropy increases or stays the same. This concept, that systems always tend toward disorder over time, is also sometimes called Time's Arrow: it only points in one direction. It is said that this law points to the eventual heat death of the universe.
Work and Heat Engines
A heat engine uses the flow of heat from hot objects to cold objects to produce useful work. An example of this is the steam locomotive: as fuel is burned, creating heat, that heat moves into water, which creates steam, which pushes pistons to create mechanical motion. Not all of the heat created by the fuel fire goes into moving the pistons; the rest goes into heating the air. Internal combustion engines are also examples of heat engines.
In any engine, as work is done, the entropy given to the environment must be at least as large as the entropy taken from it, so the total entropy never decreases.
This is known as the Clausius inequality:
\(\oint\frac{dQ}{T}\leq 0\)
The integral is taken over one complete cycle of the engine. It equals zero for a Carnot cycle, a theoretical ideal engine cycle in which the net entropy of the engine and its surroundings neither increases nor decreases. Because the total entropy does not change, the Carnot cycle is reversible; any cycle that increased the total entropy would be irreversible, since undoing it would require entropy to decrease, in violation of the second law of thermodynamics.
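A minimal sketch of this entropy bookkeeping for an ideal Carnot engine, with assumed reservoir temperatures and heat input:

```python
# Entropy bookkeeping for an ideal (Carnot) engine cycle.
T_hot, T_cold = 500.0, 300.0  # reservoir temperatures in kelvin (assumed)
Q_hot = 1000.0                # heat absorbed from the hot reservoir, in J

# In a Carnot cycle the entropy absorbed equals the entropy expelled:
# Q_hot / T_hot = Q_cold / T_cold, so the cyclic integral of dQ/T is zero.
Q_cold = Q_hot * T_cold / T_hot
cyclic_integral = Q_hot / T_hot - Q_cold / T_cold
efficiency = 1 - T_cold / T_hot  # Carnot efficiency

print(f"Q_cold = {Q_cold:.0f} J, cyclic integral of dQ/T = {cyclic_integral:.1f} J/K")
print(f"Carnot efficiency = {efficiency:.0%}")  # 40% for these temperatures
```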
Maxwell's Demon
The physicist James Clerk Maxwell created a thought experiment involving entropy that he thought would further understanding of the second law of thermodynamics. In the thought experiment, there are two containers of gas at the same temperature, separated by a wall.
A "demon" (though this was not Maxwell's word) has nearly omnipresent power: He opens a small door in the wall to let fast-moving molecules move from box 1 to box 2 but closes it for slower-moving molecules. He also does the inverse, opening a small door to allow slow-moving molecules from box 2 into box 1.
Eventually, box 2 will have more fast-moving molecules and box 1 will have more slow-moving molecules: a temperature difference has appeared where there was none, and the entropy of the system appears to have decreased, in apparent violation of the second law of thermodynamics. (The modern resolution is that the demon's own measuring and information processing generate at least as much entropy as his sorting removes, so the second law survives.)
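To make the thought experiment concrete, here is a toy Python simulation (an illustration only; the speed distribution and its parameters are assumed, not anything Maxwell specified). It takes one well-mixed gas, sorts the molecules by speed the way the demon would, and shows that a temperature difference appears:

```python
import random

random.seed(0)  # reproducible toy example

# Crude stand-in for a molecular speed distribution, in m/s (assumed).
speeds = [abs(random.gauss(500.0, 150.0)) for _ in range(10_000)]
median = sorted(speeds)[len(speeds) // 2]

# The demon's sorting: slow molecules into box 1, fast ones into box 2.
box_1 = [v for v in speeds if v < median]
box_2 = [v for v in speeds if v >= median]

def mean_square_speed(box):
    """Mean of v^2; temperature is proportional to this quantity."""
    return sum(v * v for v in box) / len(box)

print(f"box 1 <v^2> = {mean_square_speed(box_1):.0f} (cold)")
print(f"box 2 <v^2> = {mean_square_speed(box_2):.0f} (hot)")
```

Even in the toy version, the sorting only looks free because the demon's own bookkeeping is left out of the entropy accounting.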