The Nature of Thermodynamic Entropy
2011
Abstract
Thermodynamic entropy is a pathway for transitioning from the mechanical world of fundamental physics theory to the world of probabilities, statistics, microstates, and information theory as applied to analyses of the universe. This paper proposes to explain the physical meaning of thermodynamic entropy and its connection to Boltzmann's entropy. The inclusion of Boltzmann's constant in his definition of entropy establishes that connection. The physical basis of Boltzmann's constant and of his entropy is explained. These explanations first require new analyses of fundamental properties such as force and mass.
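For reference, the two definitions whose connection the paper addresses can be written side by side; this is the standard textbook correspondence, not the paper's own derivation:

```latex
% Clausius (macroscopic) and Boltzmann (microscopic) entropy:
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
\qquad\text{and}\qquad
S = k_B \ln W ,
% where W counts the microstates consistent with the macrostate and
% k_B \approx 1.380649 \times 10^{-23}\,\mathrm{J/K} supplies the
% thermodynamic unit (J/K) that ties the two definitions together.
```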
Related papers
The universal nature of Boltzmann statistical mechanics, generalized thermodynamics, quantum mechanics, spacetime, black hole mechanics, Shannon information theory, Faraday lines of force, and the Banach-Tarski paradox (BTP) is studied. The nature of matter and Dirac anti-matter is described in terms of states of compression and rarefaction of physical space, Aristotle's fifth element, or the Casimir vacuum identified as a compressible tachyonic fluid. The model is in harmony with the perceptions of Plato, who believed that the world was formed from a formless primordial medium that was initially in a state of total chaos or "Tohu Vavohu" (Sohrab, in Int J Mech 8:873-84, [1]). Hierarchies of statistical fields from photonic to cosmic scales lead to a universal scale-invariant Schrödinger equation, thus allowing for new perspectives regarding connections between classical mechanics, quantum mechanics, and chaos theory. The nature of external physical time and its connections to internal thermodynamic time and Rovelli thermal time are described. Finally, some implications of the renormalized Planck distribution function for economic systems are examined.
Keywords: Thermodynamics · Quantum mechanics · Anti-matter · Spacetime · Thermal time · Information theory · Faraday lines of force · Banach-Tarski paradox · T.O.E.
Entropy, 2014
Entropy is the most used and often abused concept in science, but also in philosophy and society. Further confusion is produced by attempts to generalize entropy to similar but not identical concepts in other disciplines. The physical meaning of phenomenological, thermodynamic entropy is reasoned and elaborated by generalizing Clausius's definition to include generated heat, since it is irrelevant whether entropy is changed by reversible heat transfer or by irreversible heat generation. Irreversible, caloric heat transfer is introduced as complementing reversible heat transfer. It is also reasoned, and thus proven, why entropy cannot be destroyed but is always generated (and thus overall increased), locally and globally, at every space and time scale, without exception. It is concluded that entropy is a thermal displacement (dynamic thermal-volume) of thermal energy due to absolute temperature as a thermal potential (dQ = T dS), and is thus associated with thermal heat and absolute temperature, i.e., with the distribution of thermal energy within thermal micro-particles in space. Entropy is an integral measure of (random) thermal energy redistribution (due to heat transfer and/or irreversible heat generation) within a material system's structure in space, per absolute temperature level: dS = dQ_sys/T = m c_sys dT/T, thus a logarithmic integral function with unit J/K. It may also be expressed as a measure of "thermal disorder", being related to the logarithm of the number W of all thermal, dynamic microstates (their positions and momenta), S = k_B ln W, or to the sum of their logarithmic probabilities, S = -k_B Σ_i p_i ln p_i, that correspond to, or are consistent with, the given thermodynamic macrostate. The number of thermal microstates W is correlated with the macro-properties temperature T and volume V for ideal gases. A system's form and/or functional order or disorder are not (thermal) energy order/disorder, and the former is not related to thermodynamic entropy. Expanding entropy to any type of disorder or information is a source of many misconceptions. Granted, there are certain benefits of simplified statistical descriptions to better comprehend the randomness of thermal motion and related physical quantities, but the limitations should be stated so that the generalizations are not overstretched and the real physics overlooked, or worse, discredited.
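A minimal numerical sketch of the expressions quoted above, with illustrative values (the mass, specific heat, temperatures, and microstate count below are assumptions for the example, not taken from the paper):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Clausius form quoted above: dS = dQ_sys/T = m*c_sys*dT/T, which
# integrates to a logarithm. Illustrative case: heating 1 kg of water
# (c ~ 4186 J/(kg*K)) from 300 K to 350 K.
m, c = 1.0, 4186.0
T1, T2 = 300.0, 350.0
dS_clausius = m * c * math.log(T2 / T1)  # J/K
print(f"Clausius dS = {dS_clausius:.1f} J/K")  # ~645.4 J/K

# Statistical forms quoted above: S = k_B*ln(W) and S = -k_B*sum(p_i*ln p_i).
# For W equiprobable microstates (p_i = 1/W) the two coincide exactly.
W = 10**6
p = [1.0 / W] * W
S_boltzmann = k_B * math.log(W)
S_gibbs = -k_B * sum(pi * math.log(pi) for pi in p)
print(f"Boltzmann S = {S_boltzmann:.3e} J/K, Gibbs S = {S_gibbs:.3e} J/K")
```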
Thermodynamics is a physical branch of science that governs the thermal behavior of dynamical systems from those as simple as refrigerators to those as complex as our expanding universe. The laws of thermodynamics involving conservation of energy and nonconservation of entropy are, without a doubt, two of the most useful and general laws in all sciences. The first law of thermodynamics, according to which energy cannot be created or destroyed, merely transformed from one form to another, and the second law of thermodynamics, according to which the usable energy in an adiabatically isolated dynamical system is always diminishing in spite of the fact that energy is conserved, have had an impact far beyond science and engineering. In this paper, we trace the history of thermodynamics from its classical to its postmodern forms, and present a tutorial and didactic exposition of thermodynamics as it pertains to some of the deepest secrets of the universe.
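As a one-line worked illustration of the second law's direction (standard textbook arithmetic, not from this paper): when heat Q leaks from a hot reservoir at T_h to a cold one at T_c, total entropy grows even though energy is conserved:

```latex
\Delta S_{\mathrm{total}} = -\frac{Q}{T_h} + \frac{Q}{T_c}
  = Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) > 0
  \quad (T_h > T_c);
% e.g. Q = 1000 J, T_h = 400 K, T_c = 300 K gives
% \Delta S_{\mathrm{total}} \approx 0.83\ \mathrm{J/K}:
% energy is conserved while usable energy diminishes.
```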
ChemTexts, 2015
In this work, a comprehensive meaning for entropy is provided on the basis of the foundations of information theory and statistical thermodynamics. For this purpose, the close relation between missing information and entropy is presented by emphasizing their probabilistic nature. Furthermore, the physical implications of the mathematical properties of the entropy function are exploited using elementary notions of differential and integral calculus. In particular, it is shown that the usual thermodynamic inequalities found in many textbooks of physical chemistry are direct consequences of the concavity of entropy. The aim of this work is to show that many concepts presented in textbooks of physical chemistry can be obtained in a simple and mathematically clear way.
Keywords: Entropy · Statistical thermodynamics · Information theory · Concave functions
Nomenclature: leading principal minor; C_p, specific heat at constant pressure; C_v, specific heat at constant volume; α_p, coefficient of thermal expansion; κ_T, isothermal compressibility coefficient
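The concavity property that the abstract makes central can be checked directly; a minimal sketch in which the distributions p and q are arbitrary illustrative choices:

```python
import math

def shannon_entropy(p):
    """Missing information H(p) = -sum p_i ln p_i (natural log, nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Two arbitrary probability distributions over 4 outcomes (illustrative).
p = [0.7, 0.1, 0.1, 0.1]
q = [0.25, 0.25, 0.25, 0.25]

# Concavity: H(lam*p + (1-lam)*q) >= lam*H(p) + (1-lam)*H(q), lam in [0,1].
for lam in (0.0, 0.25, 0.5, 0.75, 1.0):
    mix = [lam * pi + (1 - lam) * qi for pi, qi in zip(p, q)]
    lhs = shannon_entropy(mix)
    rhs = lam * shannon_entropy(p) + (1 - lam) * shannon_entropy(q)
    print(f"lam={lam:4.2f}  H(mix)={lhs:.4f} >= {rhs:.4f}  ({lhs >= rhs})")
```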
This study has demonstrated that entropy is not a physical quantity, that is, the physical quantity called entropy does not exist. If the efficiency of a heat engine is defined as η = W/W_1, and the reversible cycle is considered to be the Stirling cycle, then, given ∮dQ/T = 0, we can prove ∮dW/T = 0 and ∮dE/T = 0. If ∮dQ/T = 0, ∮dW/T = 0, and ∮dE/T = 0 are thought to define new system state variables, such definitions would be absurd. The fundamental error of entropy is that in any reversible process the polytropic process function Q is not a single-valued function of T; in thermodynamics, the P-V diagram should be a P-V-T diagram, and the key step from Σ(ΔQ/T) to ∫dQ/T does not hold. Similarly, ∮dQ/T = 0, ∮dW/T = 0, and ∮dE/T = 0 do not hold either. Since the absolute entropy of Boltzmann is used to explain Clausius entropy, and the unit (J/K) of the former is transferred from the latter, the non-existence of Clausius entropy simultaneously denies Boltzmann entropy.
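The cycle integrals manipulated above can be evaluated in closed form for an ideal-gas Stirling cycle; a sketch with assumed parameters (1 mol of monatomic ideal gas, illustrative temperatures and volumes):

```python
import math

# Ideal-gas Stirling cycle, branch by branch (assumed parameters):
#   1->2 isothermal expansion at T_h;  2->3 isochoric cooling to T_c;
#   3->4 isothermal compression at T_c;  4->1 isochoric heating to T_h.
n, R = 1.0, 8.314          # 1 mol ideal gas
Cv = 1.5 * R               # monatomic
T_h, T_c = 400.0, 300.0    # K, illustrative
V1, V2 = 1.0, 2.0          # arbitrary volume ratio

lnV = math.log(V2 / V1)
lnT = math.log(T_h / T_c)

# Closed-cycle integrals, exact per branch:
#   isotherms: dQ = dW = nRT dV/V, dE = 0;  isochores: dQ = dE = nCv dT, dW = 0.
int_dQ_over_T = n*R*lnV - n*Cv*lnT - n*R*lnV + n*Cv*lnT
int_dW_over_T = n*R*lnV + 0.0      - n*R*lnV + 0.0
int_dE_over_T = 0.0     - n*Cv*lnT + 0.0     + n*Cv*lnT

print(int_dQ_over_T, int_dW_over_T, int_dE_over_T)  # 0.0 0.0 0.0
```

Note that for an ideal gas dW/T = nR dV/V and dE/T = n C_v dT/T are exact differentials, so these integrals vanish for any ideal-gas cycle, not only the Stirling cycle; for non-ideal substances only ∮dQ/T = 0 holds in general.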
International Journal of Geomechanics, 2014
Some implications of a scale-invariant model of statistical mechanics for Boltzmann entropy in thermodynamics versus Shannon entropy in information theory are investigated. The objective versus subjective nature of entropy, as well as the fundamental significance of the choice of the Shannon measure K as the Boltzmann constant k, is described. In addition, the impact of the results on the Nernst-Planck statement of the third law of thermodynamics is discussed.
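The role of the constant K can be made concrete with the standard unit conversion between information-theoretic and thermodynamic entropy (a textbook relation, not specific to this paper):

```latex
S = -K \sum_i p_i \ln p_i ,
\qquad
K = k_B \;\Rightarrow\; S\ \text{in J/K},
\qquad
K = 1/\ln 2 \;\Rightarrow\; S\ \text{in bits};
% one bit of missing information thus corresponds to
% k_B \ln 2 \approx 9.57 \times 10^{-24}\ \mathrm{J/K}.
```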
Foundations of Physics
I will argue, pace a great many of my contemporaries, that there's something right about Boltzmann's attempt to ground the second law of thermodynamics in a suitably amended deterministic time-reversal invariant classical dynamics, and that in order to appreciate what's right about (what was at least at one time) Boltzmann's explanatory project, one has to fully apprehend the nature of microphysical causal structure, time-reversal invariance, and the relationship between Boltzmann entropy and the work of Rudolf Clausius.
WSEAS Transactions on Computers, 2021
A scale-invariant model of statistical mechanics is applied to a comparative study of Boltzmann's entropy in thermodynamics versus Shannon's entropy in information theory. The implications of the model for the objective versus subjective aspects of entropy, as well as for the Nernst-Planck statement of the third law of thermodynamics, are also discussed.
WIT Transactions on State-of-the-art in Science and Engineering, 2006
The laws of thermodynamics have a universality of relevance; they encompass widely diverse fields of study that include biology. Moreover, the concept of information-based entropy connects energy with complexity, the latter being of considerable current interest in science in general. In the companion chapter in Volume 1 of this series, the laws of thermodynamics are introduced and applied to parallel considerations of energy in engineering and biology. Here the second law and entropy are addressed more fully, focusing on the above issues. The thermodynamic property free energy/exergy is fully explained in the context of examples in science, engineering, and biology. Free energy, expressing the amount of energy which is usefully available to an organism, is seen to be a key concept in biology, and it appears throughout the chapter. A careful study is also made of the information-oriented 'Shannon entropy' concept, and it is seen that Shannon information may be more correctly interpreted as 'complexity' rather than 'entropy'. We find that Darwinian evolution is now being viewed as part of a general thermodynamics-based cosmic process: the history of the universe since the Big Bang, the evolution of the biosphere in general, and the evolution of biological species in particular are all subject to the operation of the second law of thermodynamics. Our conclusion is that, in contrast to the rather poor 19th-century relationship between thermodynamics and biology, a mainstream reconciliation of the two disciplines is now emerging.
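For reference, the "usefully available" energy discussed here is conventionally written as follows (standard definitions, with T_0 the environment temperature; the notation is assumed, not the chapter's own):

```latex
F = U - TS \ (\text{Helmholtz free energy}),
\qquad
G = H - TS \ (\text{Gibbs free energy}),
% and, relative to an environment at temperature T_0 and state (H_0, S_0):
\qquad
B = (H - H_0) - T_0\,(S - S_0) \ (\text{exergy}).
```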
American Journal of Physics, 2011
Discussions of the foundations of statistical mechanics, how they lead to thermodynamics, and the appropriate definition of entropy have occasioned many disagreements. I believe that some or all of these disagreements arise from differing, but unstated assumptions, which can make opposing opinions difficult to reconcile. To make these assumptions explicit, I discuss the principles that have guided my own thinking about the foundations of statistical mechanics, the microscopic origins of thermodynamics, and the definition of entropy. The purpose of this paper will be fulfilled if it paves the way to a final consensus, whether or not that consensus agrees with my point of view.
