
Science


NNadir

(34,662 posts)
Sun Dec 10, 2023, 10:43 PM

I have to say that my mind was blown by a book this weekend.

Today my wife and I were killing time in a university library while waiting for a musical performance, and I wandered around the shelves looking for something that might be a fun read (since I don't have privileges in that library). I came across this book: Entropy. I pulled it off the shelf, and in a few hours went through the magnificent Introduction (the first chapter), cowritten by the editors, and the inspiring, mind-bending second chapter, Fundamental Concepts, by one of the authors, Ingo Mueller.

I only had three hours with the book, and then the time for the musical event came upon us, and I had to put it down.

After returning home, I downloaded the full text. I could spend the rest of my life with this book, dusting off withered concepts in mathematics about which I have not thought in a long time, but as a practical matter, unless I retire, which I do not want to do, I won't have the time.

The book was a compilation of papers from a conference held in 2000, organized because of the realization among workers in entropy that the mathematical language used in each discipline was at least partially incomprehensible to workers in other disciplines.

From the Preface:

Imagine the following scene. A probabilist working in stochastic processes, a physicist working in statistical mechanics, an information theorist and a statistician meet and exchange their views about entropy. They are talking about the same notion, and, although they may be using distinct technical variants of it, they will probably understand each other. Someone working in the ergodic theory of dynamical systems may join those four. He will probably have to explain to the others precisely what he means by entropy, but the language he uses is still that of probability (except if he is talking about topological entropy, etc.). In any case, the five of them all see the historical roots of their notion of entropy in the work of Boltzmann and Gibbs. In another group, including people working in thermodynamics, on conservation laws, in rational mechanics and in fluid dynamics, the talk is about entropy from their perspective. However, because they probably see the common root of their concept of entropy in the work of Carnot, Clausius and later Boltzmann, the views they exchange are rather different. Now imagine that both groups come together and each picks up parts of the discussion of the others: they will realize that entropy appears to be an important link between their respective fields of research. But very soon they realize that they have major difficulties in understanding each other – the gap between their notions of entropy seems too large, despite the closely related historical roots and their common interest in gaining a better understanding of complex systems...


Chemists, chemical engineers, many other kinds of engineers, and certainly physicists and other scientists all have a working knowledge of entropy, a fundamental concept in thermodynamics.

One can even get lazy when putting the concept to use, becoming detached from its powerful sublimity and from the remarkable intellectual history underlying it.

Fun things I learned in the three hours of reading: Boltzmann never wrote down the equation now inscribed on his tombstone, S = k ln(W). Instead, he derived an equation for the distribution function, a differential equation built from the sum of two partial derivatives of that function, one in space and one in time, and from it derived conservation laws for momentum, energy and mass; the "tombstone" Boltzmann equation falls out once one recognizes that the number of atoms in a cell of phase space is just the number of atoms having particular momenta at particular points in space.
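
For readers who want the symbols, here is a minimal sketch in standard textbook notation (my rendering, not necessarily the notation used in the book): the transport equation for the distribution function f(x, p, t), whose left side is the sum of a time derivative and a spatial derivative term, with collisions collected on the right, and the tombstone relation, where k is Boltzmann's constant and W counts the ways of distributing the atoms over cells of phase space.

% Collision form of the Boltzmann transport equation (standard textbook form;
% the book's notation may differ): free streaming on the left, collisions on the right.
\[
  \frac{\partial f}{\partial t} + \frac{\mathbf{p}}{m}\cdot\nabla_{\mathbf{x}} f
  = \left(\frac{\partial f}{\partial t}\right)_{\mathrm{coll}}
\]
% The tombstone relation: k is Boltzmann's constant, W the number of microstates
% compatible with a given macrostate.
\[
  S = k \ln W
\]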

The author makes the point that Gibbs, Maxwell and Boltzmann should all have followed their concepts through to a realization and appreciation of the quantization of matter, that is, to quantum mechanics, by recognizing that space itself is quantized.

From Chapter 2 (Ingo Mueller):

...However, in the long and fierce debate that followed Boltzmann's interpretation of entropy, this nonsuggestive expression was eventually hammered into something suggestive and, moreover, something amenable to extrapolation away from gases.

Actually, 'hammered' is the right word. The main proponents went into a prolonged and acrimonious debate. Polemic was rife and good sense in abeyance. After 50 years, when the dust settled and the fog lifted, one had to realize that quantum mechanics could have started 30 years earlier, if only the founders of thermodynamics – not exempting Boltzmann, Maxwell and Gibbs – had put the facts into the right perspective...

Cool. I wish I weren't running out of time in my life. Now, as an old man, I truly regret the time I wasted.

Have a nice day tomorrow.
