Imagine you have a small box filled with equal numbers of black and white beads. When you first get the box, all the white beads are arranged in a layer on the bottom and all of the black beads are on the top.

As soon as you start shaking it, though, this neat, ordered state is completely broken, and they quickly become mixed. Because there are so many specific ways the beads can be arranged, it’s almost impossible that by continuing the random shaking process, you’d end up with the beads back in their original order.

The physical explanation for this comes down to the second law of thermodynamics, one of the most important laws in all of physics. To understand the details of this law, you’ll need to learn the basics of microstates and macrostates.

## What Is a Microstate?

A microstate is one specific arrangement of the positions and energies of all of the molecules in a closed system. In the bead example above, a microstate would tell you the precise positions of all of the individual black and white beads, so you know the state of the whole system *completely*, including the momentum or kinetic energy of each bead, too (if there is motion).

Even for small systems, you need quite a lot of specific information to specify the microstate. For example, for six particles with nine units of energy distributed between them, there are 26 microstates if the particles are identical (e.g., one where a single particle has all 9 units of energy, one where a particle has 8 and another has 1, and so on). If the particles are distinguishable (so it matters which specific particle has which specific energy), this number increases to 2002.
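As a quick sketch (not from the original article), both counts can be reproduced in a few lines of Python. For identical particles, each microstate is a partition of 9 into at most 6 parts; for distinguishable particles, the stars-and-bars formula counts the ways to hand out 9 units among 6 particles:

```python
from math import comb

def partitions(n, max_parts, max_value=None):
    """Count partitions of n into at most max_parts parts, each <= max_value."""
    if max_value is None:
        max_value = n
    if n == 0:
        return 1  # the empty partition
    if max_parts == 0 or max_value == 0:
        return 0
    total = 0
    # Choose the largest part first, then partition the remainder with
    # parts no bigger than it.
    for first in range(min(n, max_value), 0, -1):
        total += partitions(n - first, max_parts - 1, first)
    return total

# Identical particles: partitions of 9 into at most 6 parts.
identical = partitions(9, 6)

# Distinguishable particles: weak compositions of 9 into 6 ordered parts,
# counted by stars and bars as C(9 + 6 - 1, 6 - 1).
distinguishable = comb(9 + 6 - 1, 6 - 1)

print(identical)        # 26
print(distinguishable)  # 2002
```

The gap between 26 and 2002 for just six particles hints at how quickly the microstate count explodes for real systems.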

It’s clear, though, that this level of information about a system is hard to obtain, and this is why physicists either depend on macrostates or use approaches like statistical mechanics to describe the system without the huge information requirement. These approaches essentially “average out” the behavior of large numbers of molecules, describing the system in less precise terms, but in just as useful a way for real-world problems.

## Arranging Gas Molecules in a Container

Suppose you have a container of gas that contains *N* molecules, where *N* is probably a very large number. Just like the beads in the example from the introduction, there are a huge number of places a molecule can occupy inside the container, and the number of different energy states available to each molecule is very big too. Based on the definition of a microstate given above, it should be clear that the number of possible microstates inside the container is astronomically large.

But just how large is the number of possible microstates? For one mole of gas at a temperature of 1 to 4 kelvin, there are a massive 10^{26,000,000,000,000,000,000} possible microstates. The size of this number is really hard to overstate: By comparison, there are about 10^{80} atoms in the entire universe. For liquid water at 273 K (i.e., 0 degrees Celsius), there are 10^{1,991,000,000,000,000,000,000,000} accessible microstates – to write a number like this out, you’d need a pile of paper *light years* high.

But this isn’t the whole problem with looking at a situation in terms of the microstate or possible microstates. The system spontaneously changes from one microstate to another, at random and pretty much continuously, compounding the challenges of producing a meaningful description in these terms.

## What Is a Macrostate?

A macrostate is a description of a system in terms of its bulk, large-scale properties; it corresponds to the whole set of microstates consistent with that description. Macrostates are much easier to deal with than individual microstates because you can describe the whole system with just a few macroscopic quantities rather than having to determine the total energy and precise position of all of the constituent molecules.

For the same situation where you have a large number *N* of molecules in a box, the macrostate can be defined with comparatively simple and easy-to-measure quantities such as the pressure, temperature and volume, as well as the total energy of the system. This is clearly a much simpler way to characterize a system than looking at the individual molecules, and you can still use this information to predict the behavior of a system.
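As a minimal sketch of how few numbers a macrostate needs (this example is an addition, not from the original article), the ideal gas law *PV* = *nRT* links the macroscopic quantities directly: given any two of pressure, volume and temperature for a known amount of gas, the third is fixed, with no molecular detail required.

```python
R = 8.314  # molar gas constant, J / (mol K)

def pressure(n_moles, volume_m3, temperature_k):
    """Pressure (Pa) of an ideal gas from its macrostate variables."""
    return n_moles * R * temperature_k / volume_m3

# One mole of gas at room temperature in a 24.5-litre container:
p = pressure(1.0, 0.0245, 298.0)
print(round(p))  # roughly 101,000 Pa, about atmospheric pressure
```

Three easily measured numbers pin down the behavior of around 10^{23} molecules, which is exactly the economy the macrostate description buys you.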

There is also a famous postulate – the postulate of equal *a priori* probabilities – that states that a system has an equal probability of being in any microstate that is consistent with the current macrostate. This isn’t *strictly* true, but it’s accurate enough that it works well for many situations, and it can be a useful tool when considering the likelihood of microstates for a system given a specific macrostate.

## What Is the Significance of Microstates, Then?

Considering how complicated it is to measure or otherwise determine a microstate for a given system, you might wonder why microstates are even a useful concept for physicists. Microstates do have some important uses as a concept, however, and in particular, they are a key part of the definition of the *entropy* of a system.

Let’s call the total number of microstates for a given macrostate *Y*. When a system undergoes a change due to a thermodynamic process – such as isothermal expansion, for instance – the value of *Y* changes alongside it. This change can be used to gain information about the system and how much the change in state affected it. The second law of thermodynamics limits how *Y* can change, unless something outside the system interacts with it.
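To make the isothermal-expansion example concrete (a sketch added here, not part of the original article): when an ideal gas doubles in volume at constant temperature, each molecule has access to twice the space, so the number of positional microstates *Y* is multiplied by 2^{N}, and the entropy (defined below as *k* ln *Y*) rises by *Nk* ln 2.

```python
from math import log

k = 1.380649e-23    # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number, molecules per mole

# Doubling the volume multiplies Y by 2^N, so the entropy change is
# delta_S = k * ln(2^N) = N * k * ln(2).
N = N_A  # one mole of molecules
delta_S = N * k * log(2)
print(delta_S)  # about 5.76 J/K for one mole
```

Even though *Y* itself is unimaginably large, the *ratio* of microstate counts before and after the change is perfectly manageable, and that ratio is what the entropy change captures.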

## Entropy and the Second Law of Thermodynamics

The second law of thermodynamics states that the total entropy of an isolated system (often loosely called a closed system) never decreases, and in fact tends to increase over time. This law is often misunderstood, though, particularly because of the definition of entropy and what it means for a system to be “closed” or isolated.

The simplest part of this is what it means to say something is an isolated system. It simply means that the system doesn’t exchange any energy or matter with the surrounding environment, and so it is essentially cut off from the surrounding universe.

The definition of entropy is best given mathematically, where entropy is given the symbol *S*, *Y* is used for the number of microstates and *k* is Boltzmann’s constant (*k* = 1.38 × 10^{−23} J K^{−1}). Entropy is then defined by:

*S* = *k* ln (*Y*)

This tells you that the entropy depends on the natural logarithm of the number of microstates in the system, so systems with more possible microstates have higher entropy. You can understand what the law means if you think about it in these terms.

In the bead example from the introduction, the initial state of the system (a layer of white beads at the bottom with a layer of black ones on top) has very low entropy, because very few microstates correspond to this macrostate (i.e., beads neatly ordered by color).

In contrast, the state later, when the beads have been mixed, corresponds to a higher entropy because there are *loads* of microstates that would reproduce the macrostate (i.e., “mixed” beads). This is why the concept of entropy is often termed a measure of “disorder,” but in any case, it should make intuitive sense that in a closed system, the beads will only *increase* in entropy but never decrease.
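The bead counting can be made explicit with a small toy model (an illustrative addition, with made-up numbers): take 10 black and 10 white beads in 20 fixed slots. The “sorted” macrostate, with all white beads in the bottom ten slots, corresponds to exactly one color arrangement, while the fully “mixed” macrostate admits every possible choice of which slots hold white beads.

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann's constant, J/K

# Sorted macrostate: white beads fill the bottom 10 slots -- one arrangement.
sorted_microstates = 1

# Mixed macrostate: any of C(20, 10) ways to place the white beads.
mixed_microstates = comb(20, 10)

S_sorted = k * log(sorted_microstates)  # zero
S_mixed = k * log(mixed_microstates)

print(mixed_microstates)       # 184756
print(S_mixed > S_sorted)      # True
```

With 184,756 mixed arrangements against a single sorted one, random shaking is overwhelmingly likely to land on (and stay in) the high-entropy macrostate, and the imbalance only grows as the number of beads increases.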


About the Author

Lee Johnson is a freelance writer and science enthusiast, with a passion for distilling complex concepts into simple, digestible language. He's written about science for several websites including eHow UK and WiseGeek, mainly covering physics and astronomy. He was also a science blogger for Elements Behavioral Health's blog network for five years. He studied physics at the Open University and graduated in 2018.