Science
I have to say that my mind was blown by a book this weekend.
Today my wife and I were killing time in a university library while waiting for a musical performance, and I wandered around the shelves looking for something that might be a fun read (since I don't have borrowing privileges in that library). I came across this book: Entropy. I pulled it off the shelf, and in a few hours went through the magnificent Introduction (the first chapter), cowritten by the editors, and the inspiring, mind-bending second chapter, Fundamental Concepts, by one of the authors, Ingo Mueller.
I only had three hours with the book, and then the time for the musical event came upon us, and I had to put it down.
After returning home, I downloaded the full text. I could spend the rest of my life with this book, dusting off withered concepts in mathematics about which I have not thought in a long time, but as a practical matter, unless I retire, which I do not want to do, I won't have time.
The book is a compilation of papers from a conference held in 2000, convened because of the realization among workers on entropy that the mathematical language used in specific disciplines was at least partially incomprehensible to workers in other disciplines.
From the Preface:
Chemists, chemical engineers, and many other kinds of engineers, and certainly physicists and other kinds of scientists all have a working knowledge of the subject of entropy, a fundamental concept in thermodynamics.
One can even get lazy when putting the concept to use, and become detached from the powerful sublimity of the concept and the remarkable intellectual history underlying it.
Fun things I learned in the three hours of reading: Boltzmann never wrote down the equation now inscribed on his tombstone, S = k ln(W). Instead, he derived his equation for the distribution function as a differential equation, the sum of two partial derivatives of that function, one in time and one in space, and from it derived conservation laws for momentum, energy, and mass. The "tombstone" equation falls out once one replaces the number of atoms in a cell of phase space with the number of atoms having particular momenta at particular points in space.
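For reference, here are the two equations in question as I would write them in modern notation (my transcription, not necessarily the book's; f(x, v, t) is the number density of atoms at position x with velocity v, F is an external force, and m is the atomic mass):

```latex
% The tombstone formula, and a standard modern form of the Boltzmann
% transport equation (modern notation; not the form Boltzmann himself wrote):
S = k \ln W
\qquad\qquad
\frac{\partial f}{\partial t}
  + \vec{v} \cdot \nabla_{\vec{x}} f
  + \frac{\vec{F}}{m} \cdot \nabla_{\vec{v}} f
  = \left( \frac{\partial f}{\partial t} \right)_{\text{coll}}
```

Taking velocity moments of f is what yields the conservation laws for mass, momentum, and energy mentioned above.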
The author makes the point that Gibbs, Maxwell, and Boltzmann should all have followed their concepts through to a realization and appreciation of the quantization of matter (quantum mechanics) by recognizing that space itself is quantized.
From Chapter 2 (Ingo Mueller):
Actually, 'hammered' is the right word. The main proponents went into a prolonged and acrimonious debate. Polemic was rife and good sense in abeyance. After 50 years, when the dust settled and the fog lifted, one had to realize that quantum mechanics could have started 30 years earlier, if only the founders of thermodynamics - not exempting Boltzmann, Maxwell and Gibbs - had put the facts into the right perspective...
Cool. I wish I wasn't running out of time in my life. Now, as an old man, I truly regret the time I wasted.
Have a nice day tomorrow.
byronius
(7,598 posts)
Thanks for the tip!
Permanut
(6,636 posts)
Hawking and Bekenstein had some fascinating ideas about entropy on the large scale, e.g., black holes. Del Maestro from U of Vermont has extrapolated to the atomic scale. As you say, this could be a very long discussion.
erronis
(16,827 posts)
I think he is at the University of Tennessee but works with UVM.
https://www.delmaestro.org/adrian/
FM123
(10,126 posts)
LudwigPastorius
(10,795 posts)
Are you just a Boltzmann brain that thinks it read it, and thinks it read this post, just before you freeze to death in the next instant in the vast void of a dying universe?
😆
erronis
(16,827 posts)
Bernardo de La Paz
(50,899 posts)
I'm having trouble with the concept of microstates. If a microstate is a designation of the position and momentum of a particle, are there not as many microstates as particles, since each particle's position and momentum are real numbers, different from the other particles' real numbers?
S = k ln (W)
Or is W (or Omega) somehow a probability distribution?
Or does it mean a container of gas has more entropy if it contains more gas?
https://en.wikipedia.org/wiki/Microstate_%28statistical_mechanics%29
Note: not anyone's problem except my own; just musing out loud.
caraher
(6,308 posts)
The number of particles is indeed related to the number of microstates accessible to a system with a given energy, but the number of microstates depends on the total energy of the system as well as the number of particles. Hyperphysics has a nice introduction. Indeed, the key concepts are really all on that one page.
Statistical mechanics textbooks typically call Omega/W the "multiplicity" and often begin by considering systems for which it is easy to count the ways a known, fixed total amount of energy can be distributed among the elements of the system. Most modern approaches to the subject therefore start with "Einstein solids," which have a number of features that simplify the treatment. The basic idea is that an Einstein solid (a solid modeled with atoms as identical balls connected by identical springs) consists of some number of quantum harmonic oscillators among which the total thermal energy of the system (simply the total energy of all the oscillators) can be distributed at random. A feature of the quantum harmonic oscillator is that the energy of one oscillator can only take one of a set of discrete values separated by hf (where h is a universal value known as Planck's constant and f is the frequency of the oscillator). (We can ignore the zero-point energy, which is intrinsic to each oscillator and cannot be "transferred" from one element of the system to another.)
So to give a concrete example, suppose you have 3 oscillators and a total energy of 1*hf. To find Omega, you simply count the number of ways to distribute the one unit of energy among the three oscillators. The answer is plainly 3 - the energy can be in oscillator 1 (that's one microstate), or in oscillator 2 (that's a second microstate), or in oscillator 3 (that's the third and final microstate). In this instance your intuition is basically correct - the number of microstates would equal the number of oscillators:
100 (1st microstate)
010 (2nd microstate)
001 (3rd microstate)
So Omega = 3.
But the picture changes... and this is all just combinatorics... if you have more energy. Suppose the total energy is instead 2*hf. There are three ways to distribute the energy in which one oscillator has all this energy and the other two have none, and there are three ways to have two oscillators have one unit of energy each with the third having zero, for a total of six possible microstates:
200
020
002
110
101
011
So here, Omega = 6.
Of course, listing and counting in this fashion becomes impracticable for systems of any significant size. Applying combinatorics to this results in a nice formula you can find on this Hyperphysics explanation page.
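To make the counting concrete, here's a little sketch of my own (not from any of the linked pages) that brute-force enumerates the microstates in the two examples above and checks the counts against the closed-form multiplicity formula:

```python
from itertools import product
from math import comb

def enumerate_microstates(n_oscillators, q_total):
    """Brute force: list every way q_total energy quanta can sit in n_oscillators."""
    return [state for state in product(range(q_total + 1), repeat=n_oscillators)
            if sum(state) == q_total]

# The two worked examples above: N = 3 oscillators with q = 1 and q = 2 quanta.
for q in (1, 2):
    states = enumerate_microstates(3, q)
    omega = comb(q + 3 - 1, q)  # Omega = (q + N - 1)! / (q! (N - 1)!)
    print(f"q = {q}: {len(states)} microstates enumerated, formula gives {omega}")
    # q = 1: 3 microstates enumerated, formula gives 3
    # q = 2: 6 microstates enumerated, formula gives 6
```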
One problem with multiplicity is that the numerical values you obtain from using it become insanely large rather quickly. Taking the natural log tames their behavior and has other side benefits for thermodynamic analysis. In particular, S, unlike Omega, behaves as you would expect an extensive property to behave when you look at combining two small systems into one larger system - the entropies of the subsystems simply add to give the total entropy of the system.
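To illustrate (again my own sketch): Omega overflows long before a system gets anywhere near macroscopic, while ln(Omega) stays manageable; and since the multiplicities of two independent subsystems multiply, the logs (and hence the entropies) add:

```python
from math import comb, lgamma

def ln_omega(N, q):
    """ln of the Einstein-solid multiplicity, via log-gamma, so the result
    stays finite even when Omega itself would overflow a float."""
    return lgamma(q + N) - lgamma(q + 1) - lgamma(N)

# Already at 300 oscillators sharing 200 quanta, Omega is enormous...
print(len(str(comb(200 + 300 - 1, 200))))  # number of digits in Omega (well over 100)
print(ln_omega(300, 200))                  # ...but ln(Omega) is only a few hundred

# Multiplicities multiply, entropies add: for subsystems A and B with fixed
# energies, Omega_total = Omega_A * Omega_B,
# so ln(Omega_total) = ln(Omega_A) + ln(Omega_B).
print(ln_omega(300, 200) + ln_omega(100, 50))
```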
Now a lot of your questions pertain to considering gases. The counting of microstates becomes much more challenging than for an Einstein solid, partly because there are subtleties associated with properly accounting for the quantum properties of the ideal gas in a box and partly because even for one particle, there are multiple degrees of freedom. Rather than energies associated with oscillation, the thermal energy of a gas is usually just the sum of all the mechanical energies of the molecules. If we limit ourselves to a monatomic gas (to evade thinking about molecular rotations and vibrations, which also contribute to thermal energy of a gas) we have three degrees of freedom, arising from the x, y, and z components of the atom's motion. If you had a one-atom "gas" there would then be essentially three ways to divide a given amount of kinetic energy, associated with each component of its velocity. I don't remember offhand exactly how the ideal gas model treats the division of that kinetic energy into countable "chunks" but conceptually one does something similar to the oscillator case, counting the number of ways the total thermal energy can be divided among the various atoms and the various directions of travel they might have.
The result is given by the somewhat complicated-looking Sackur-Tetrode Equation. You can see that there is an almost-direct proportionality with N, the number of atoms in the gas, but not quite (as N also occurs inside the natural log term). So yes, the entropy does depend on the number of gas molecules present, for a given container and total thermal energy, and rises with the number of molecules.
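For reference, here is the Sackur-Tetrode equation as I remember it (check it against the linked page); N is the number of atoms, V the volume, U the total thermal energy, m the atomic mass, and h Planck's constant:

```latex
S = k N \left[ \ln\!\left( \frac{V}{N}
      \left( \frac{4 \pi m U}{3 N h^2} \right)^{3/2} \right)
      + \frac{5}{2} \right]
```

You can see the N multiplying the whole bracket, and the extra N's hiding inside the logarithm.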
Finally, you ask whether Omega is somehow a probability distribution. It is not - it is just a number. But combined with a key postulate of statistical mechanics, it is linked to probability in important ways. The relevant postulate is that in equilibrium, every microstate is exactly as probable as any other microstate. Combined with the concept of a macrostate (explained below), this allows us to make predictions concerning the probability of observing a system in a given macrostate.
Returning to the Einstein solid, imagine two such systems, A and B, in thermal equilibrium with one another, such that the total energy of the complete system is U. A and B can, in general, have different numbers of oscillators, and the energy U can be split in many ways. Energy conservation demands only that U_A + U_B = U - that the thermal energy of the full system equals the sum of the thermal energies of A and B.
So let's think about solid A. It has some fixed number of oscillators N_A, but could have any amount of thermal energy ranging from 0 to U (in steps of hf). A "macrostate" in this example, for solid A, is any one of the allowed values of U_A, and given a value of U_A we can calculate Omega_A - the number of microstates associated with the macrostate U_A.
Since we have postulated that all microstates are equally probable, we should expect that the distribution of energy between solids A and B that is most probable will be the one that corresponds to the largest number of microstates (and thus maximum entropy). There's a nifty calculator at the Six Ideas web site that calculates tables and graphs of multiplicities and probabilities for small-ish Einstein solids. In the end what you'll consistently see is that the combination of macrostates for A and B that results in the largest number of microstates is always the one that most evenly distributes energy among the individual oscillators - which feels right intuitively for equilibrium. And that, in turn, is the configuration that maximizes entropy for a given amount of thermal energy.
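Here is a little script in the same spirit as that calculator (my own sketch with made-up system sizes, not the Six Ideas code): for two small Einstein solids sharing a fixed number of quanta, it computes the multiplicity of each way of splitting the energy and turns it into a probability:

```python
from math import comb

def omega(N, q):
    # Multiplicity of an Einstein solid: N oscillators, q energy quanta
    return comb(q + N - 1, q)

# Hypothetical sizes: solids A and B each have 30 oscillators, sharing 60 quanta.
N_A, N_B, q_total = 30, 30, 60

# One macrostate per allowed U_A; its statistical weight is Omega_A * Omega_B.
weights = [omega(N_A, q_A) * omega(N_B, q_total - q_A)
           for q_A in range(q_total + 1)]
total = sum(weights)

for q_A, w in enumerate(weights):
    print(f"U_A = {q_A:2d} hf   P = {w / total:.4f}")

# The probability peaks at q_A = 30 (the even split, i.e. equal energy per
# oscillator), which is the macrostate of maximum multiplicity and entropy.
```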
Bernardo de La Paz
(50,899 posts)
I was able to read about halfway through just by building your bricks up, and it is much clearer than it has ever been for me.
Now I need to do some more study, at the link you gave, and come back to the rest of your explanation.
Thanks for getting me started out by lifting me out of the quicksand and onto stable ground!
hunter
(38,930 posts)
... by Claude Shannon.
The modern high-speed internet is built upon these theoretical foundations.
The concept has also proven useful in ecological studies.
All that and more looks to be covered in part IV of this book.
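To connect it back to the formulas upthread, Shannon's entropy is the same logarithmic bookkeeping applied to probabilities; a minimal sketch (mine, not from the book):

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits per symbol; p = 0 terms contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits per toss

# Applied to species abundances instead of symbol probabilities, the same
# quantity is the Shannon diversity index used in ecological studies.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # four equally common species: 2.0
```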
caraher
(6,308 posts)
In a similar vein, Maxwell never wrote the equations of electromagnetism in the forms we know today (I believe Heaviside is credited with the modern notation). This causes some textbook authors to argue for calling them "the Maxwell equations" rather than the more common "Maxwell's Equations" on the basis that the former honors Maxwell's work without implying that the exact equations we use today were "his" equations.
I'm OK with either, personally, but the evolution of notation is fascinating (not to mention the way Maxwell arrived at them, which is very different from the reasoning used to derive/justify them in modern expositions).