
NNadir

(34,662 posts)
Sun Dec 10, 2023, 10:43 PM Dec 2023

I have to say that my mind was blown by a book this weekend.

Today my wife and I were killing time in a university library while waiting for a musical performance, and I wandered the shelves looking for something that might be a fun read (since I don't have privileges in that library). I came across this book: Entropy. I pulled it off the shelf, and in a few hours went through the magnificent Introduction (the first chapter), cowritten by the editors, and the inspiring, mind-bending second chapter, Fundamental Concepts, by one of the authors, Ingo Mueller.

I only had three hours with the book before the time for the musical event came upon us and I had to put it down.

After returning home, I downloaded the full text. I could spend the rest of my life with this book, dusting off withered concepts in mathematics about which I have not thought in a long time, but as a practical matter, unless I retire, which I do not want to do, I won't have time.

The book is a compilation of papers from a conference held in 2000, organized out of the realization among workers in entropy that the mathematical language used in each discipline was at least partially incomprehensible to workers in the other disciplines.

From the Preface:

Imagine the following scene. A probabilist working in stochastic processes, a physicist working in statistical mechanics, an information theorist and a statistician meet and exchange their views about entropy. They are talking about the same notion, and, although they may be using distinct technical variants of it, they will probably understand each other. Someone working in the ergodic theory of dynamical systems may join those four. He will probably have to explain to the others precisely what he means by entropy, but the language he uses is still that of probability (except if he is talking about topological entropy, etc.). In any case, the five of them all see the historical roots of their notion of entropy in the work of Boltzmann and Gibbs. In another group, including people working in thermodynamics, on conservation laws, in rational mechanics and in fluid dynamics, the talk is about entropy from their perspective. However, because they probably see the common root of their concept of entropy in the work of Carnot, Clausius and later Boltzmann, the views they exchange are rather different. Now imagine that both groups come together and each picks up parts of the discussion of the others: they will realize that entropy appears to be an important link between their respective fields of research. But very soon they realize that they have major difficulties in understanding each other - the gap between their notions of entropy seems too large, despite the closely related historical roots and their common interest in gaining a better understanding of complex systems...


Chemists, chemical engineers, and many other kinds of engineers, and certainly physicists and other kinds of scientists all have a working knowledge of the subject of entropy, a fundamental concept in thermodynamics.

One can even get lazy when putting the concept to use, and become detached from the powerful sublimity of the concept and the remarkable intellectual history underlying it.

Fun things I learned in the three hours of reading: Boltzmann never wrote down the equation now inscribed on his tombstone, S = k ln(W). Instead, he derived the Boltzmann distribution function from a differential equation, the sum of two partial derivatives of that function, one in space and the other in time, and from it derived conservation laws for momentum, energy and mass. The "tombstone Boltzmann equation" falls out once one replaces the number of atoms in a phase space with a consideration of the numbers of atoms having unique momenta at particular points in space.
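
For readers who want the notation: a standard modern form of the transport equation in question (my rendering, not necessarily the exact form in Mueller's chapter) is

\frac{\partial f}{\partial t} + \mathbf{v} \cdot \nabla_{\mathbf{x}} f = \left( \frac{\partial f}{\partial t} \right)_{\mathrm{coll}}

where f(\mathbf{x}, \mathbf{v}, t) is the number density of atoms near position \mathbf{x} with velocity near \mathbf{v}, and the right-hand side accounts for collisions. Taking moments of this equation against 1, m\mathbf{v} and \tfrac{1}{2}mv^2 yields the conservation laws for mass, momentum and energy.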

The author makes the point that Gibbs, Maxwell and Boltzmann should all have followed their concepts through to a realization and appreciation of the quantization of matter, that is, to quantum mechanics, by recognizing that space itself is quantized.

From Chapter 2 (Ingo Mueller):

...However, in the long and fierce debate that followed Boltzmann's interpretation of entropy, this nonsuggestive expression was eventually hammered into something suggestive and, moreover, something amenable to extrapolation away from gases.

Actually, 'hammered' is the right word. The main proponents went into a prolonged and acrimonious debate. Polemic was rife and good sense in abeyance. After 50 years, when the dust settled and the fog lifted, one had to realize that quantum mechanics could have started 30 years earlier, if only the founders of thermodynamics - not exempting Boltzmann, Maxwell and Gibbs - had put the facts into the right perspective...

Cool. I wish I weren't running out of time in my life. Now, as an old man, I truly regret the time I wasted.

Have a nice day tomorrow.

I have to say that my mind was blown by a book this weekend. (Original Post) NNadir Dec 2023 OP
Sounds like a powerful book. byronius Dec 2023 #1
That is truly mind blowing. Permanut Dec 2023 #2
Thanks for the reference to Del Maestro erronis Dec 2023 #6
Fascinating! FM123 Dec 2023 #3
Did you really read it, or... LudwigPastorius Dec 2023 #4
Looks great and dense. But at $137.00 I'll see if a library has a copy.... n/t erronis Dec 2023 #5
Hmm, was looking for background but got lost immediately Bernardo de La Paz Dec 2023 #7
It's all about counting - a physicist's take caraher Dec 2023 #9
You are a good teacher! Thank you Bernardo de La Paz Dec 2023 #11
In the mid twentieth century those same concepts were applied to electrical signals... hunter Dec 2023 #8
It's not unusual for "named" equations to take forms alien to the honoree caraher Dec 2023 #10

Permanut

(6,636 posts)
2. That is truly mind blowing.
Sun Dec 10, 2023, 11:06 PM
Dec 2023

Hawking and Bekenstein had some fascinating ideas about entropy on the large scale, e.g., black holes. Del Maestro from U of Vermont has extrapolated to the atomic scale. As you say, this could be a very long discussion.

LudwigPastorius

(10,795 posts)
4. Did you really read it, or...
Mon Dec 11, 2023, 01:34 AM
Dec 2023

are you just a Boltzmann brain that thinks it read it, and thinks it read this post, just before freezing to death in the next instant in the vast void of a dying universe?
😆

Bernardo de La Paz

(50,899 posts)
7. Hmm, was looking for background but got lost immediately
Mon Dec 11, 2023, 02:35 PM
Dec 2023
https://en.wikipedia.org/wiki/Boltzmann's_entropy_formula

I'm having trouble with the concept of microstates. If a microstate is a designation of the position and momentum of a particle, are there not as many microstates as particles, since each particle's position and momentum are real numbers, different from the other particles' real numbers?

In statistical mechanics, Boltzmann's equation (also known as the Boltzmann–Planck equation) is a probability equation relating the entropy S (also written as S_B) of an ideal gas to the multiplicity (commonly denoted as Ω or W), the number of real microstates corresponding to the gas's macrostate:

S = k ln(W)

Or is W (or Omega) somehow a probability distribution?

Or does it mean a container of gas has more entropy if it contains more gas?

https://en.wikipedia.org/wiki/Microstate_%28statistical_mechanics%29

Note: not anyone's problem except my own; just musing out loud.

caraher

(6,308 posts)
9. It's all about counting - a physicist's take
Sat Dec 16, 2023, 03:01 AM
Dec 2023

Last edited Sat Dec 16, 2023, 09:44 AM - Edit history (1)

The number of particles is indeed related to the number of microstates accessible to a system with a given energy, but the number of microstates depends on the total energy of the system as well as the number of particles. Hyperphysics has a nice introduction. Indeed, the key concepts are really all on that one page.

Statistical mechanics textbooks typically call Omega/W the "multiplicity" and often begin by considering systems for which it is easy to count the ways a known, fixed total amount of energy can be distributed among elements of the system. So most modern approaches to the subject tend to begin by studying "Einstein solids," which have a number of features that simplify the treatment. The basic idea is that an Einstein solid (a solid modeled with atoms as identical balls connected by identical springs) consists of some number of quantum harmonic oscillators among which the total thermal energy of the system (simply the sum of the energies of all the oscillators) can be distributed at random. A feature of the quantum harmonic oscillator is that the energy of one oscillator can only take one of a set of discrete values separated by hf (where h is a universal constant known as Planck's constant and f is the frequency of the oscillator). (We can ignore the zero-point energy, which is intrinsic to each oscillator and cannot be "transferred" from one element of the system to another.)

So to give a concrete example, suppose you have 3 oscillators and a total energy of 1*hf. To find Omega, you simply count the number of ways to distribute the one unit of energy among the three oscillators. The answer is plainly 3 - the energy can be in oscillator 1 (that's one microstate), or in oscillator 2 (that's a second microstate), or in oscillator 3 (that's the third and final microstate). In this instance your intuition is basically correct - the number of microstates would equal the number of oscillators:

100 (1st microstate)
010 (2nd microstate)
001 (3rd microstate)

So Omega = 3.

But the picture changes... and this is all just combinatorics... if you have more energy. Suppose the total energy is instead 2*hf. There are three ways to distribute the energy in which one oscillator has all this energy and the other two have none, and there are three ways to have two oscillators have one unit of energy each with the third having zero, for a total of six possible microstates:

200
020
002
110
101
011

So here, Omega=6

Of course, listing and counting in this fashion becomes impracticable for systems of any significant size. Applying combinatorics to this results in a nice formula you can find on this Hyperphysics explanation page.
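
As a quick check on the counting above (my own sketch, not code from the linked pages), one can compare a brute-force enumeration against the closed-form "stars and bars" multiplicity, Omega(N, q) = C(q + N - 1, q), for N oscillators sharing q units of energy:

from itertools import product
from math import comb

def multiplicity_brute_force(n_oscillators, q_units):
    # Enumerate every assignment of whole energy units to oscillators
    # and keep those whose total equals q_units.
    return sum(1 for state in product(range(q_units + 1), repeat=n_oscillators)
               if sum(state) == q_units)

def multiplicity_formula(n_oscillators, q_units):
    # Closed-form "stars and bars" count: C(q + N - 1, q).
    return comb(q_units + n_oscillators - 1, q_units)

print(multiplicity_brute_force(3, 1), multiplicity_formula(3, 1))  # 3 3
print(multiplicity_brute_force(3, 2), multiplicity_formula(3, 2))  # 6 6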

One problem with multiplicity is that the numerical values you obtain from using it become insanely large rather quickly. Taking the natural log tames their behavior and has other side benefits for thermodynamic analysis. In particular, S, unlike Omega, behaves as you would expect an extensive property to behave when you look at combining two small systems into one larger system - the entropies of the subsystems simply add to give the total entropy of the system.
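
To see that additivity in one line: a microstate of the combined system pairs one microstate of each subsystem, so multiplicities multiply, and the logarithm turns the product into a sum:

S_{\text{total}} = k \ln(\Omega_A \Omega_B) = k \ln \Omega_A + k \ln \Omega_B = S_A + S_B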

Now a lot of your questions pertain to considering gases. The counting of microstates becomes much more challenging than for an Einstein solid, partly because there are subtleties associated with properly accounting for the quantum properties of the ideal gas in a box and partly because, even for one particle, there are multiple degrees of freedom. Rather than energies associated with oscillation, the thermal energy of a gas is usually just the sum of all the mechanical energies of the molecules. If we limit ourselves to a monatomic gas (to evade thinking about molecular rotations and vibrations, which also contribute to the thermal energy of a gas), we have three degrees of freedom, arising from the x, y, and z components of the atom's motion. If you had a one-atom "gas" there would then be essentially three ways to divide a given amount of kinetic energy, one associated with each component of its velocity. I don't remember offhand exactly how the ideal gas model treats the division of that kinetic energy into countable "chunks" but conceptually one does something similar to the oscillator case, counting the number of ways the total thermal energy can be divided among the various atoms and the various directions of travel they might have.

The result is given by the somewhat complicated-looking Sackur-Tetrode Equation. You can see that there is an almost-direct proportionality with N, the number of atoms in the gas, but not quite (as N also occurs inside the natural log term). So yes, the entropy does depend on the number of gas molecules present, for a given container and total thermal energy, and rises with the number of molecules.
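
For reference, one standard form of that equation (quoted from a textbook form I trust, not checked against the Hyperphysics page) for N atoms of mass m in a volume V with total internal energy U is

S = N k \left[ \ln\!\left( \frac{V}{N} \left( \frac{4 \pi m U}{3 N h^2} \right)^{3/2} \right) + \frac{5}{2} \right]

and the N inside the logarithm is exactly what spoils the direct proportionality mentioned above.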

Finally, you ask whether Omega is somehow a probability distribution. It is not - it is just a number. But combined with a key postulate of statistical mechanics, it is linked to probability in important ways. The relevant postulate is that in equilibrium, every microstate is exactly as probable as any other microstate. Combined with the concept of a macrostate (explained below), this allows us to make predictions concerning the probability of observing a system in a given macrostate.

Returning to the Einstein solid, imagine two such systems, A and B, in thermal equilibrium with one another, such that the total energy of the complete system is U. A and B can, in general, have different numbers of oscillators, and the energy U can be split in many ways. Energy conservation demands only that U_A + U_B = U - that the thermal energy of the full system equals the sum of the thermal energies of A and B.

So let's think about solid A. It has some fixed number of oscillators N_A, but could have any amount of thermal energy ranging from 0 to U (in steps of hf). A "macrostate" in this example, for solid A, is any one of the allowed values of U_A, and given a value of U_A we can calculate Omega_A - the number of microstates associated with the macrostate U_A.

Since we have postulated that all microstates are equally probable, we should expect that the distribution of energy between solids A and B that is most probable will be the one that corresponds to the largest number of microstates (and thus maximum entropy). There's a nifty calculator at the Six Ideas web site that calculates tables and graphs of multiplicities and probabilities for small-ish Einstein solids. In the end what you'll consistently see is that the combination of macrostates for A and B that results in the largest number of microstates is always the one that most evenly distributes energy among the individual oscillators - which feels right intuitively for equilibrium. And that, in turn, is the configuration that maximizes entropy for a given amount of thermal energy.
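
A short script in the same spirit as that calculator (my own sketch, not the Six Ideas code) tabulates the multiplicity and probability of each macrostate for two small solids sharing a fixed total energy:

from math import comb

def multiplicity(n_oscillators, q_units):
    # Ways to distribute q_units of energy among n_oscillators.
    return comb(q_units + n_oscillators - 1, q_units)

N_A, N_B = 3, 3   # oscillators in solids A and B (small, made-up example)
U_total = 6       # total energy, in units of hf

# Combined multiplicity for each way of splitting the energy between A and B
table = [(u_a, multiplicity(N_A, u_a) * multiplicity(N_B, U_total - u_a))
         for u_a in range(U_total + 1)]
omega_total = sum(w for _, w in table)  # equals multiplicity(N_A + N_B, U_total)

for u_a, w in table:
    print(f"U_A = {u_a}*hf: Omega = {w:3d}, P = {w / omega_total:.3f}")

The probability peaks at the even split U_A = U_B = 3*hf, the macrostate of maximum entropy.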


Bernardo de La Paz

(50,899 posts)
11. You are a good teacher! Thank you
Sat Dec 16, 2023, 06:16 AM
Dec 2023

I was able to read about halfway through just by building your bricks up, and it is much clearer than it has ever been for me.

Now I need to do some more study, at the link you gave, and come back to the rest of your explanation.

Thanks for getting me started out by lifting me out of the quicksand and onto stable ground!

hunter

(38,930 posts)
8. In the mid twentieth century those same concepts were applied to electrical signals...
Mon Dec 11, 2023, 04:20 PM
Dec 2023

... by Claude Shannon.

The modern high speed internet is built upon these theoretical foundations.

The concept has also proven useful in ecological studies.
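
Shannon's entropy of a discrete distribution has the same -Σ p log p form (in bits when the log is base 2), and applied to species abundances it is the Shannon diversity index used in ecology. A minimal sketch (mine, not from the book):

from math import log2

def shannon_entropy(probabilities):
    # Entropy, in bits, of a discrete probability distribution.
    return -sum(p * log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin is maximally unpredictable
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin carries less information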

All that and more looks to be covered in part IV of this book.

caraher

(6,308 posts)
10. It's not unusual for "named" equations to take forms alien to the honoree
Sat Dec 16, 2023, 03:08 AM
Dec 2023
Fun things I learned in the three hours of reading. Boltzmann never wrote down the equation now inscribed on his tombstone S = k ln(W).


In a similar vein, Maxwell never wrote the equations of electromagnetism in the forms we know today (I believe Heaviside is credited with the modern notation). This causes some textbook authors to argue for calling them "the Maxwell equations" rather than the more common "Maxwell's Equations" on the basis that the former honors Maxwell's work without implying that the exact equations we use today were "his" equations.

I'm OK with either, personally, but the evolution of notation is fascinating (not to mention the way Maxwell arrived at them, which is very different from the reasoning used to derive/justify them in modern expositions).