
Entropy quantifies the "uncertainty" of a probability distribution, i.e. roughly how much information is needed, on average, to describe an event drawn from that distribution. With this view it is natural that the uniform distribution is the one that maximizes the entropy function. This is the information-theoretic point of view of entropy.
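Concretely, for a distribution $p = (p_1, \dots, p_n)$ over $n$ outcomes, the Shannon entropy satisfies (a standard bound, stated here for reference):

$$H(p) = -\sum_{i=1}^{n} p_i \log p_i \;\le\; \log n,$$

with equality if and only if $p_i = 1/n$ for every $i$, e.g. by Gibbs' inequality.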

On the other hand, in physics textbooks the second law of thermodynamics says that a system tends towards a macrostate that maximizes the entropy. For a finite system this means the macrostate whose corresponding microstates have the greatest multiplicity.
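The bridge between multiplicity and entropy is Boltzmann's formula: a macrostate compatible with $\Omega$ microstates has

$$S = k_B \ln \Omega,$$

so maximizing entropy over macrostates is the same as maximizing multiplicity.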

These two viewpoints should be in agreement, but I'm confused, as they almost seem to contradict one another. For example, in physics the multiplicity graphs usually have a very sharp, localized peak, which is almost as far from a uniform distribution as you can get. How does the physics view of entropy agree with the idea of getting as close to a uniform distribution as possible?

  • Well, the MaxEnt principle tells you that the equilibrium probability distribution is "as uniform" as possible, given the constraints you employ (e.g. fixed average energy for the canonical ensemble). There is no contradiction. – Commented Dec 12, 2024 at 9:31
  • The reason is the information-theoretic approach (= MaxEnt principle) applied to statistical mechanics. The system does not "want" anything. You seek a proper description of the situation at hand! The probability distribution assigns probabilities to certain microstates, yes, under macroscopic constraints. For example, putting no constraints (other than being a probability distribution) yields the microcanonical ensemble; fixing the average energy yields the canonical, etc. Let me recommend Jaynes' "Information Theory and Statistical Mechanics" I and II. – Commented Dec 12, 2024 at 20:31
  • No, stat. mech. does not assume this (and any text stating so is extremely sloppy). The MaxEnt principle is the natural generalization of Laplace's indifference argument you mention. Constraints obviously can change the probability distribution from uniform (no constraints) to non-uniform. I can only re-iterate: the MaxEnt principle yields, (very) roughly speaking, the probability distribution which is as uniform as possible while still obeying the constraints. After reading Jaynes' work (also other papers), you will understand :). His papers are gold. – Commented Dec 12, 2024 at 20:42
  • @TobiasFünke In Schroeder's An Introduction to Thermal Physics he says: "In an isolated system in thermal equilibrium, all accessible microstates are equally probable". This is what was throwing me off, but I was overlooking the "in an isolated system" part, which implies the microcanonical ensemble. Now I see where I was going wrong. I'm still not sure I fully appreciate why nature chooses the distribution that is as uniform as possible, but I'll continue studying and reading the references you gave. Thanks for all the help!
    – CBBAM
    Commented Dec 12, 2024 at 20:48
  • Yes, indeed. And again: it is not nature which chooses anything. It is you (or we), who want to "guess" suitable probability distributions, given some information (e.g. values of macroscopic quantities, and a certain knowledge about the underlying physics). From the information-theoretic perspective, stat. mech. is an inference problem. Good luck with your further studies, and enjoy the Jaynes papers :). See you around. – Commented Dec 12, 2024 at 20:52
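A minimal sketch of the MaxEnt construction the comments describe, assuming only standard Lagrange-multiplier calculus (textbook material following Jaynes, not specific to any one source): maximize $H(p) = -\sum_i p_i \ln p_i$ subject to $\sum_i p_i = 1$ and a fixed average energy $\sum_i p_i E_i = \langle E \rangle$. Stationarity of the Lagrangian gives

$$p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},$$

i.e. the canonical (Boltzmann) distribution, "as uniform as possible" given the energy constraint; dropping the energy constraint returns the uniform (microcanonical) distribution.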

1 Answer


The difference comes from the distinction between microstates and macrostates. It is the distribution over the microstates that is uniform. However, each macrostate encompasses many microstates, and the multiplicities vary enormously between macrostates. This explains the large discrepancy in their probabilities and why the distribution over macrostates is typically sharply peaked at the expected value, in accordance with the law of large numbers.

A typical illustration of this is the Ehrenfest urn model. Say you have $N$ particles distributed between two possible urns (or, abstractly, states). The $2^N$ detailed assignments of the $N$ particles to the two urns are the possible microstates. You can take these to be equiprobable (by the microcanonical ensemble, by appealing to MaxEnt, etc.), so each has probability $2^{-N}$. But there are only $N+1$ macrostates, describing the global proportion of particles between the urns: namely $M$ particles in one urn and $N-M$ in the other, for $M = 0, 1, \dots, N$. These macrostates are not equiprobable; their probability is binomial: $2^{-N}\binom NM$. As $N\to\infty$, this distribution becomes sharply peaked at $M=N/2$. You therefore have the two seemingly contradictory aspects at once: a uniform distribution over microstates, but a very peaked distribution over macrostates, as the numerical sketch below shows.
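To make the peak concrete, here is a minimal numerical sketch of the urn example, assuming nothing beyond the binomial formula above (the function name is illustrative):

    from math import comb

    def macrostate_probs(n):
        """Probability of each macrostate M (M particles in the first urn),
        assuming all 2^n microstates are equally likely."""
        return [comb(n, m) / 2**n for m in range(n + 1)]

    for n in (10, 100, 1000):
        probs = macrostate_probs(n)
        # Probability mass within 5% of the balanced macrostate M = n/2.
        lo, hi = int(0.45 * n), int(0.55 * n)
        central = sum(probs[lo:hi + 1])
        print(f"n={n:5d}  P(M=n/2)={probs[n // 2]:.4f}  "
              f"P(|M/n - 1/2| <= 0.05) = {central:.6f}")

Each microstate stays equally likely throughout, yet the probability mass of the macrostates concentrates near $M = N/2$ as $N$ grows.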

In fact, it turns out that for typical statistical models the distribution over microstates is not exactly uniform (canonical ensemble, grand canonical ensemble, etc.). However, in the thermodynamic limit you have asymptotic equipartition: the distribution is essentially supported on a subset on which, at leading logarithmic order, the probability is uniform. This information alone is enough to calculate the "peakedness" (mathematically, the large deviations) of the macrostates' distribution.
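For the urn example this peakedness can be made quantitative with Stirling's approximation (a standard large-deviation estimate, sketched here under the equiprobable-microstate assumption): writing $M = xN$ with $0 < x < 1$,

$$2^{-N}\binom{N}{xN} \approx e^{-N I(x)}, \qquad I(x) = \ln 2 + x\ln x + (1-x)\ln(1-x),$$

where $I(x) \ge 0$ vanishes only at $x = 1/2$, so every unbalanced macrostate is exponentially suppressed in $N$.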

  • Thanks! This helped clear up the misunderstanding I had. As a follow-up, I'm having a hard time understanding how microstates can be anything but uniform. Wouldn't this violate the big assumption in statistical mechanics that all microstates are equally probable? Also, is there any explanation as to why nature likes to be as uniform as possible?
    – CBBAM
    Commented Dec 12, 2024 at 20:36
