Talk:Entropy/Archive 11


Entropy models

Today, after coming across another "dropping an egg on the floor" explanation of entropy, I began making a table (in progress) of the various oft-cited ‘entropy models’ used as teaching heuristics throughout history, listed chronologically:

Feel free to leave comment if you see any that I’ve left out or forgotten. --Libb Thims (talk) 17:12, 18 July 2010 (UTC)

Clausius coined the term entropy in 1865 (not 1850)

The date of coining and the description of the coining of the term entropy are both incorrect (citing the Online Etymology Dictionary over that of the original source):

I would suggest that someone fix this misinformation, which has been in the article for at least a year or two now. --Libb Thims (talk) 17:53, 18 July 2010 (UTC)

Lousy Opening Paragraph

This article should give a simple, clear definition of entropy in the opening paragraph so that the concept is easily understood by those unfamiliar with it. Rather than doing this, however, the article abruptly jumps into specialist terminology and calculus equations, thereby having the unfortunate effect of alienating audiences unfamiliar with advanced physics. Frankly, the opening paragraph needs a complete rewrite, as much for the sake of clarity and coherence as to make the information accessible to a wider audience. 64.222.110.155 (talk) 19:25, 22 February 2010 (UTC)

Editors, please don't jump into too formal a definition in the first line of the introduction. Though this may be academically correct, it does nothing for the general reader who just wants to know what entropy is. An encyclopedia needs to talk to all levels of understanding, especially in the introduction. I have added the opening of Simple Wikipedia as a start here. Lumos3 (talk) 11:06, 2 March 2010 (UTC)
I left out the word energy in your sentence. The entropy of an isolated system is merely the logarithm of the number of ways you can arrange a system. Then, of course, due to conservation of energy this also leads to the picture of entropy as the number of ways you can distribute energy over the degrees of freedom of a system, but this is a less precise statement and it is unnecessarily complicated for lay persons. Count Iblis (talk) 12:49, 2 March 2010 (UTC)
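A minimal concrete instance of the counting picture in the comment above, assuming the standard Boltzmann form S = k ln Ω (the two-state spins are purely illustrative): for N independent two-state spins with every arrangement equally likely,

    Ω = 2^N,   S = k ln Ω = N k ln 2,

so doubling the system doubles the entropy, which is the kind of additivity a lay reader can hold on to.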

I don't know what is the best way to introduce entropy in this article, but I do know the following:

  • There is no introductory paragraph that will ever explain the concept of entropy. All we can do is give the first time reader an inkling of what it means.
  • To Count Iblis I could say, no, the infinitesimal change in the entropy of a system is equal to the infinitesimal heat energy added to the system divided by the temperature, with everything else being held constant. By the process of integration, the change in entropy of a system can be found (a worked form of this is written out just after this list). This makes no reference to microstates, statistical mechanics, or "total entropy", and none is needed in order to form a consistent theory of thermodynamics.
  • Thermodynamics is a self-contained, self-consistent theory which does not require a statistical mechanical explanation for its validity. Statistical mechanics provides an explanation of thermodynamic entropy (change) in terms of the microstates and macrostates of a system. Entropy change was perfectly well defined before the microstate was ever thought of.
  • There is no such thing as "total entropy". Thermodynamically, only changes in entropy are dealt with. In statistical mechanics, the microstate is fairly well defined, but the macrostate is up for grabs. If we declare a set of thermodynamic variables to define a macrostate, then fine, thermodynamics and statistical mechanics will give us the "entropy" and everything will be consistent. If we discover or introduce a new variable that divides that macrostate into a number of different macrostates, then fine, the entropy provided by thermodynamics and statistical mechanics will change, but everything will still be consistent.
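For reference, the purely thermodynamic route in the second bullet, written out (a sketch assuming a reversible path; the constant-pressure example with temperature-independent C_p is only illustrative):

    dS = δQ_rev / T,   ΔS = ∫ δQ_rev / T   (integrated along the path from state 1 to state 2),

so, for instance, heating a body at constant pressure from T_1 to T_2 gives ΔS = C_p ln(T_2/T_1), with no reference to microstates.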

My question is - how long do we blur the distinction between the thermodynamic definition of entropy and statistical explanation of entropy in the name of simplicity? How long do we pretend that there is a number called "Entropy" that we can assign to a system that has some absolute meaning like temperature or pressure? I agree, this distinction need not be made immediately, but we do a disservice to carry on as if statistical mechanics and thermodynamics are two names for the same theory, and that total entropy is some rigorous physical quantity that, if we only knew enough, we could write down an absolute value for. PAR (talk) 16:29, 2 March 2010 (UTC)

I really must agree. I have a background which includes thermodynamics, and would say I have quite a reasonable grasp of basic entropy, which I actually use regularly. However, large parts of the introduction make no sense to me! Take this part (cut somewhat out of context):

If thermodynamic systems are described using thermal energy instead of temperature, then entropy is just a number by which the thermal energy in the system is multiplied. The resulting energy is an energy for which no information is available which would be required to convert the energy in technical systems from one form (e.g. electrical) into another form (e.g. mechanical).

I don't get the first sentence (then again, I can't remember using entropy in such a system, at least not since graduating. Maybe it would make sense if I picked up a textbook. But then, why would I need Wikipedia?), and the second is simply unintelligible. As I read it, the resulting energy is ... An energy, which could only be converted to another form, with the use of some information, which sadly is not available? Say what? The complete lack of punctuation might add to this, but I *really* don't get it, even though I've read it many times now. Thermal energy is continually converted between internal energy and radiation, that does not require any information to happen, and it does not necessarily change the entropy of an isolated system either.
The fact that the intro seems to be information-centric might not help me, as I primarily understand and use entropy as dS = dQ/T. Maybe if I had studied entropy as part of information theory, the words would ring a bell and make some sense. But it can't be a prerequisite that the issue be already understood in order to understand the introduction to an encyclopedic article on the subject! And that is forgetting for the moment that the subject of the article is NOT entropy in information theory, but rather thermodynamic entropy.
Even IF entropy fundamentally is a concept based in information theory (and I'm not agreeing here!), saying that an article on thermodynamic entropy thus has to start out in information theory is like saying that inertia and acceleration are fundamentally based in special relativity (and quantum mechanics, btw.), and that any article on inertia or Newton's laws will have to start out by defining the terms in a special-relativity framework. That's *way* beyond reasonable.
I'm not knocking using "randomness" or "number of possible states" to illustrate entropy; they are helpful to get a picture of what entropy is about. But talking about information content in a thermodynamic context does not help lay people. No one has an intuitive understanding of the information content of a cup of coffee, and you'd get random answers if you asked how it changes when you stir it. But everyone has an intuitive understanding of the coffee's temperature, and could tell you that stirring it requires work. Those concepts might not be rigorous, but neither are time and mass, and I bet they are used in a few places around here without mentioning that fact.
Another quote:

In technical applications, machines are basically energy conversion devices. Thus, such devices only can be driven by convertible energy. The same applies to biological organisms. The product of thermal energy (or the equivalents of thermal energy) and entropy is "already converted energy".

I'm just lost. I assume that "convertible energy" is not the energy felt driving an open car, and that the meaning was "...such devices can only be...", not "such devices only, can be...", but then I still stumble. For starters, what energy is then not convertible?
From a thermodynamic point of view, my best guess is that the writer thought that thermal energy, at the temperature of the lowest available reservoir, inside an isolated system, would be non-convertible? That's a lot of abstraction (and assumption on my part, but I'm trying hard here!) up front, but it's still not true. That energy may easily be converted (although it may not perform any work). As I read it, the sentence just doesn't make any sense, and had there not been such a heated talk page, I would have just deleted it.
Maybe someone with an information-theory background can explain the wording, and I'd be interested to hear, but of course in the end, it has to be understandable without explanation, by a layman.
I'm sorry this is all not very constructive. But given the record here, I'm reluctant to change anything without writing here first. Richard 188.180.111.132 (talk) 21:08, 8 April 2010 (UTC)

The quotation at the beginning is taken out of context. It is a statement that the discoverer of the covalent bond found hard to understand. If Lewis found that statement difficult to understand, why is it the opening passage of the page, without at least a warning that even experts find it hard to understand? That whole introductory section contains no useful overview information and should be removed. —Preceding unsigned comment added by 24.16.68.149 (talk) 21:32, 2 June 2010 (UTC)

I think it is time for another rewrite. To address the old comment by PAR above, that you can define entropy without invoking information-theoretical concepts, microstates, etc., I would say that this is not really true, because you then have to appeal to vague concepts like "disorder". Also, what is "temperature", what is "heat"? These concepts are not rigorously defined in classical thermodynamics; they are assumed to be given in an ad hoc way in practical situations (like heat engines).

I started a rewrite here. I rewrote the material up to the grand-canonical ensemble. But note that the explanation in the first paragraphs is a bit lacking; I need to invoke ergodicity, the fact that it is not proven, etc. As things stand there, the explanations are a bit misleading and a bit too simplistic. Count Iblis (talk) 15:06, 27 August 2010 (UTC)

You CAN define entropy without reference to information theory, microstates, disorder, etc. Clausius did it. But there is a difference between defining entropy experimentally and understanding what it means, and why it is important. In order to do this, information theory, microstates, etc. are indispensable. Pressure can be experimentally defined also as some mysterious force that acts on a surface (thermodynamics), but you don't really "get it" until you have the model of individual atoms or molecules randomly striking a surface and imparting momentum to that surface (statistical mechanics). Same sort of thing with entropy. Clausius defined it experimentally (thermodynamics), Boltzmann revealed the inner workings, the atomistic model that helps you "get it" (statistical mechanics). PAR (talk) 02:49, 28 August 2010 (UTC)

Coinage of the term entropy

Clausius coined the term entropy in 1865. The article gets this wrong, so I am correcting it. Nobleness of Mind (talk) 13:41, 19 July 2010 (UTC)

Mixture of energy and matter

To me entropy means that matter and energy (like the salt and pepper analogy) can never be completely separated. Is this a postulation upheld by the law of entropy or am I missing the point? —Preceding unsigned comment added by 165.212.189.187 (talk) 14:29, 17 August 2010 (UTC) By completely I mean all the energy in the universe and all the matter in the universe. —Preceding unsigned comment added by 165.212.189.187 (talk) 14:32, 17 August 2010 (UTC)

Section "Entropy versus heat and temperature"

I don't understand the meaning of this section. It should be incorporated into other parts of the article or entirely deleted.--Netheril96 (talk) 03:40, 5 October 2010 (UTC)

Entropy is not just increase of disorder

It's become a cliche to say that entropy is a measure of disorder, but that's not generally true. Imagine a large cloud of gas and dust collapsing by gravity into a planet. Imagine a crystal forming by molecular attraction. The fundamental change is that energy available to do work is being lost. You could theoretically harness infalling gas and dust to do work, but once it's formed into a tightly packed planet, that opportunity is gone. The idea of disorder increasing is associated with the theory of the ideal gas; it's not the general rule. DonPMitchell (talk) 16:41, 5 October 2010 (UTC)

I don't understand your point. The "disorder" thing is just a way to understand entropy, but it is not a formal definition, which is exactly how this article addresses the problem. In fact, "disorder" per se is not a well-defined concept.--Netheril96 (talk) 04:40, 6 October 2010 (UTC)
I don't think it's a good idea to stray too much into the realm of exergy either. We link to it in the article, but not particularly prominently. Perhaps the article could be jiggered a little. --Izno (talk) 06:17, 6 October 2010 (UTC)

Scope (and lede)

This isn't a "general article on entropy". That might look something like the article on Scholarpedia. What we have here is an article on entropy in thermodynamics. The treatment of entropy in information theory is cursory and minimal, the bare minimum relevant for context as an interpretation of thermodynamic entropy -- and that is as it should be, because we have had long discussions about this in the past.

If people want to know about Shannon entropy, they should be directed to Shannon entropy, where we have a real article on the subject. It does nobody any favours to present this article as an article on Shannon entropy, nor on entropy generally -- it just isn't what this article does.

The lede should present what this article is about, and where its focus is. Shannon entropy is not what this article is about; it certainly isn't where its focus is. It's hard enough to present a coherent summary of thermodynamic entropy in four paragraphs (it's defeated us for seven years, and what is there at the moment is a long way even from the best we've done). Putting in an uncontextualised and unmotivated line about Shannon entropy frankly doesn't help. Jheald (talk) 20:09, 26 October 2010 (UTC)

The article presents the information theory aspect in substantial fashion and this should be included in the lede to display the alternate use of the term as reflected in the article. Kbrose (talk) 20:41, 26 October 2010 (UTC)
No, the article doesn't, quite frankly. The idea is introduced half-heartedly in the very last paragraphs of the piece. As I keep saying, if you want an overview listing of entropy in all its forms and meanings, compare the Scholarpedia article. Our article is nothing like that; and the balance of our article is nothing like that; because this is an article on entropy in thermodynamics. Jheald (talk) 21:17, 26 October 2010 (UTC)
I agree with Jheald. We can not mention everything that is in the article in the lede. I also agree that this is very hard to get right. Entropy is understood in different ways by different people, and more importantly readers need to get an appreciation of it at a very wide range of levels. --Bduke (Discussion) 21:09, 26 October 2010 (UTC)

Maths A-class

I've closed the very old Wikipedia:WikiProject Mathematics/A-class rating/Entropy as no consensus due to age and lack of input in the discussion.--Salix (talk): 17:54, 6 November 2010 (UTC)

Entropy in chemical reactions

This article could do with a section discussing entropy changes in chemical reactions -- in particular, the relation between ΔS and ΔG (entropy of the system and entropy of the surroundings); the effect of temperature on the importance of the ΔS term, and therefore whether the reaction can go forward or not; and, especially, what makes a chemical reaction "entropically favourable" -- i.e. what sort of reactions have a positive ΔS.

This is material worth treating here in its own right; but I think the discussion of what makes ΔS positive would also, by introducing a concrete example, make the discussion of the microscopic picture of entropy much more real (i.e. why it is that releasing a mole of gas tends to make a reaction entropically favourable); and, also, usefully give another macroscopic point of contact with entropy, other than just heat engines.

A good section on "entropy changes in chemical reactions" would I think therefore add very valuably to the article.

I think it would be helpful in the lead too. Having introduced entropy via a heat engine view (relating to energy not available to do useful work), then the microscopic view, it seems to me that following those two with a third paragraph, introducing the idea of entropy change in chemical reaction, would round out the lead well, and make clearer the significance and meaning of entropy as an idea. Jheald (talk) 12:57, 10 December 2010 (UTC)
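For concreteness, the constant-temperature, constant-pressure bookkeeping behind the ΔS/ΔG relation described above (these are the standard textbook relations, not proposed article wording):

    ΔG = ΔH - T ΔS_sys,   ΔS_surr = -ΔH/T,   ΔS_total = ΔS_sys + ΔS_surr = -ΔG/T,

so a reaction is entropically favourable overall exactly when ΔG < 0, and a positive ΔS_sys (e.g. from a reaction that releases a mole of gas) counts for more as T increases.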

I agree that these things should be mentioned. I also think, though, that one has to put more weight on the information-theoretical foundations of entropy. Contrary to popular belief, thermodynamics (stuff like maximum work, etc.) is far easier to explain within the information-theoretical framework than in the classical thermodynamics framework. Count Iblis (talk) 00:54, 13 December 2010 (UTC)

Entropy from "nothing"? I don't think so....

The Wikipedia article on "Entropy" states in the first line:

"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines."

This statement is interesting because, as we know, entropy is not measured in units of energy, but rather as a change of energy divided by temperature.

I also observed the Fundamental thermodynamic relation:

∆U = T∆S - p∆V

I realized that the equation somehow implied, for the same change in internal energy ∆U, that the amount of p∆V was only limited by the value of T∆S. The mainstream idea is that increasing entropy somehow increases the amount of energy not available to the system. However, if this were true, then temperature must decrease faster than entropy increases, for by doing so, if S increased, then the sum of ∆U and p∆V would not have to increase. If ∆U is the change of internal energy, then it is easy to see that p∆V is the change in external energy.


Changing a pressure in a constant volume is like changing a force without changing the displacement—by itself, there is no work associated with such forces. Of course you cannot just change the pressure in a system of constant volume without changing the forces in that volume. Fundamentally, the change in pressure in a "constant" volume is really the result of the changes in the proximal distance between particles in that system. The hotter the particles in the volume are, the greater the variation there is in the distances between those particles, and because of the inverse square law and the fact that the root mean square of a set of unequal real values is always greater than the average, the higher the average forces between the particles, even if the average proximal distance between particles does not change. Therefore, in a sense, V∆p at one scale is actually p∆V at a smaller scale. V∆p is otherwise, implicitly, a part of internal energy, ∆U.


Thus, it is obvious that T∆S = ∆U + p∆V is the change of total energy.

If S is conserved between systems, then it is easy to deduce a direct, linear relationship between temperature and energy of a given system, which is exactly what one expects from the kinetic theory of heat:

∆U = T∆S - p∆V
∆U + p∆V = T∆S
(∆U + p∆V)/T = ∆S

Where ∆S is the entropy change of one system at the expense of another (i.e. an amount of entropy "in transit").

Also notice that if T decreases with time, for a given entropy "in transit" (∆S), the total energy "in transit" (∆U + p∆V) literally decreases, which corresponds directly with the requirement that such energy becomes unavailable to do work. Technically, this would make T∆S equivalent to exergy rather than energy. So there is no need to assume that the total entropy of a universe must increase to explain this. All that is required is that entropy is transferred from one domain to another domain of a lower temperature.

The first line of the article on fundamental thermodynamic relation states: "In thermodynamics, the fundamental thermodynamic relation expresses an infinitesimal change in internal energy in terms of infinitesimal changes in entropy, and volume for a closed system."

This makes it clear that the interpretation in mainstream science is that this "fundamental thermodynamic relation" is for closed systems, implying somehow that all the entropy change is manifested directly by the system. They have no concern for the possibility that entropy could flow into what they have instead considered as a "closed system" (which reads as "a system that is thermally isolated from the environment, in the sense of not being able to receive energy from the outside"). They've come to think that energy transfer is necessary for a transfer of entropy because they treat energy as the fundamental substance of matter. So to them, the entropy in these said "closed" systems arises solely due to the interactions of the energy already present, and of that, only the part which they know how to plug into the fundamental thermodynamic relation. Thus, they do not include entropy from external sources, nor even from subatomic particles themselves, for as of yet, the entropy of systems within atoms is not apparent to them. Additionally, they have not accepted the idea that entropy can flow into a system when that system is energetically isolated from the environment. Thus, it is no wonder that they think entropy can be created from nothing. Kmarinas86 (Expert Sectioneer of Wikipedia) 19+9+14 + karma = 19+9+14 + talk = 86 15:04, 27 January 2011 (UTC)

Inaccurate Statements

The article contains a number of inaccurate statements. Let me name a few:

"Historically, the concept of entropy evolved in order to explain why some processes are spontaneous and others are not..." Entropy, as defined by Clausius, is a conserved function in the Carnot cycle along with the internal energy. For efficiencies other than the Carnot efficiency the equality in the first formula in the section "Classical Thermodynamics" becomes an inequality.

"Thermodynamics is a non-conserved state function..." contradicts the first formula in the section "Classical Thermodynamics".

"For almost all practical purposes, this [Gibbs's formula for the entropy] can be taken as the fundamental definition of entropy since all other formulas for S can be derived from it, but not vice versa." This is definitely incorrect; just consider the entropy of degenerate gases which is proportional to the number of particles.Bernhlav (talk) 08:20, 2 April 2011 (UTC)


Hello and welcome to Wikipedia. I appreciate your help and encourage you to fix the flaws in the article - be bold. However, a quick browse through our policies and guidelines would be handy first - I'll drop you a welcome on your personal talk page. Zakhalesh (talk) 08:23, 2 April 2011 (UTC)
As long as you can cite your corrections to reliable sources, please make them. It is unlikely someone else will if you don't. InverseHypercube (talk) 08:43, 2 April 2011 (UTC)

Hello,

Thank you. I do not know how to make the changes because that will remove references that are cited. Let me give you an example of the first of the above statements.

The concept of entropy arose from Clausius's study of the Carnot cycle [1]. In a Carnot cycle, heat Q_1 is absorbed isothermally from a 'hot' reservoir at the higher temperature T_1, and heat Q_2 is given up isothermally to a 'cold' reservoir at a lower temperature T_2. Now according to Carnot's principle work can only be done when there is a drop in the temperature, and the work should be some function of the difference in temperature and the heat absorbed. Carnot did not distinguish between Q_1 and Q_2, since he was working under the hypothesis that caloric theory was valid and hence heat was conserved [2]. Through the efforts of Clausius and Kelvin, we know that the maximum work that can be done is the product of the Carnot efficiency and the heat absorbed at the hot reservoir: W = (1 - T_2/T_1) Q_1. In order to derive the Carnot efficiency, 1 - T_2/T_1, Kelvin had to evaluate the ratio of the work done to the heat absorbed in the isothermal expansion with the help of the Carnot-Clapeyron equation, which contained an unknown function known as the Carnot function. The fact that the Carnot function could be the temperature, measured from zero, was suggested by Joule in a letter to Kelvin, and this allowed Kelvin to establish his absolute temperature scale [3]. We also know that the work is the difference between the heat absorbed at the hot reservoir and that rejected at the cold one: W = Q_1 - Q_2. Since the latter is valid over the entire cycle, this gave Clausius the hint that at each stage of the cycle work and heat would not be equal, but rather their difference would be a state function that would vanish upon completion of the cycle. The state function was called the internal energy and it became the first law of thermodynamics [4].

Now equating the two expressions gives Q_1/T_1 - Q_2/T_2 = 0. If we allow Q_2 to incorporate the algebraic sign, this becomes a sum, Q_1/T_1 + Q_2/T_2 = 0, and implies that there is a function of state which is conserved over a complete cycle. Clausius called this state function entropy. This is the second law of thermodynamics.

Then Clausius asked what would happen if less work were done than that predicted by Carnot's principle. The right-hand side of the first equation would be the upper bound of the work, which would now be converted into an inequality, W < (1 - T_2/T_1) Q_1. When the second equation is used to express the work as a difference in heats, we get Q_1 - Q_2 < (1 - T_2/T_1) Q_1, or Q_2 > (T_2/T_1) Q_1. So more heat is given off to the cold reservoir than in the Carnot cycle. If we denote the entropies by S_i = Q_i/T_i for the two states, then the above inequality can be written as a decrease in the entropy, S_1 - S_2 < 0. The wasted heat implies that irreversible processes must have prevented the cycle from carrying out maximum work.
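A small numerical illustration of the bookkeeping in the derivation above (a sketch with made-up reservoir temperatures and heat input; the Python is only doing arithmetic):

    # Carnot-cycle bookkeeping for the derivation above (illustrative numbers only).
    T1, T2 = 600.0, 300.0         # hot and cold reservoir temperatures (K)
    Q1 = 1000.0                   # heat absorbed at the hot reservoir (J)

    eta = 1.0 - T2 / T1           # Carnot efficiency
    W = eta * Q1                  # maximum work over the cycle
    Q2 = Q1 - W                   # heat rejected to the cold reservoir

    print(Q1 / T1, Q2 / T2)       # both 1.666... J/K: entropy is conserved over the reversible cycle

    # A less efficient (irreversible) engine rejects more heat, so Q2/T2 exceeds Q1/T1:
    W_irr = 0.8 * W
    Q2_irr = Q1 - W_irr
    print(Q2_irr / T2 > Q1 / T1)  # True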

Statistical Thermodynamics

Of all the thermodynamic functions, entropy has the unique role of having one foot in the macroscopic world and the other in the microscopic world of atoms. According to Boltzmann, the entropy is a logarithmic measure of the number of micro-complexions that correspond to, or are indistinguishable from, a given macroscopic state. Since the latter is a very large number, instead of being a proper fraction, Planck referred to it as a 'thermodynamic' probability [5]. Boltzmann was always of the opinion that large numbers gave better assurance than mere fractions.

As an illustration of how Boltzmann determined his entropy, let us consider the distribution of N particles in k urns [6]. The a priori probability that a particle occupies the ith urn is p_i. The occupation numbers n_i are not all independent but must obey the condition Σ_i n_i = N, and since probability must be conserved, Σ_i p_i = 1. Then the probability that the first urn contains n_1 particles, the second urn contains n_2, and so on, is P = (N!/(n_1!⋯n_k!)) p_1^{n_1}⋯p_k^{n_k}, which is known as the multinomial distribution [7].

Now Boltzmann did not know what the a priori probabilities were, so he supposed that they were all equal, p_i = 1/k, for all the urns. According to Boltzmann we should not discriminate among the urns. [But when he attaches different energies to the different urns, he precisely does discriminate among them.] The probability P thus becomes a 'thermodynamic' probability whose logarithm he set proportional to the entropy S. The constant of proportionality, k = R/N_A, the universal gas constant divided by Avogadro's number for one mole of gas, was later determined by Planck in his study of black-body radiation, at the same time he derived his constant [8]. Thus, Boltzmann's assumption of a priori equal probabilities [9] converts the multinomial distribution into a thermodynamic probability, which, instead of being an actual probability, is a very big number.
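A quick numerical check of the step from the multinomial coefficient to a logarithmic entropy (a sketch; the occupation numbers are made up, and only the Stirling-approximation agreement is being illustrated):

    from math import lgamma, log

    def ln_thermodynamic_probability(ns):
        # ln W, where W = N! / (n_1! n_2! ... n_k!) is the 'thermodynamic' probability
        N = sum(ns)
        return lgamma(N + 1) - sum(lgamma(n + 1) for n in ns)

    ns = [3000, 2000, 1000]       # occupation numbers of k = 3 urns
    N = sum(ns)
    gibbs_form = -N * sum((n / N) * log(n / N) for n in ns)   # N times -sum_i f_i ln f_i

    print(ln_thermodynamic_probability(ns))   # ~6059
    print(gibbs_form)                         # ~6068: the two agree to leading order for large N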

Entropy is a measure of the error that we commit when we observe a value different from its true value, which we know to be the average value. When we specify which average we are considering, the entropy will be determined from Gauss's error law, which equates the average value with the most probable value. This implies that the entropy must be the same function of the observations that it is of the average value, and with that choice we obtain the probability distribution as the well-known Poisson distribution, under the condition that Stirling's approximation for the factorial is valid.

If we consider the independent probabilities of the occupation of the k different urns, we must consider the product of the probabilities, where we must now attach an index to the average value to show that it is the average value for the ith urn. The product then gives the multinomial distribution, where the a priori probabilities are now seen to be the average occupation fractions. Thus we see that this is the Gibbs entropy, which ensures that P is a true probability, less than one, based on its strict concavity. The strict concavity of the entropy is its defining property, and corresponds to the thermodynamic criteria of stability. Consequently, the selection of which average is the true value, together with Gauss's error law, determines the form of the entropy. The entropy is thus said to be the potential for the law of error, and this completes Boltzmann's principle [10].
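The strict-concavity claim can also be checked directly; writing the entropy in the usual Gibbs-Shannon form (a minimal derivation, independent of the error-law construction above):

    S(p) = -Σ_i p_i ln p_i,   ∂S/∂p_i = -(ln p_i + 1),   ∂²S/∂p_i∂p_j = -δ_ij / p_i,

so the Hessian is negative definite wherever all p_i > 0, and S is strictly concave on the probability simplex.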


I think this is clearer than what is found in the text. The same holds true for the other points mentioned.Bernhlav (talk) 16:15, 3 April 2011 (UTC)

Thank you! If you provide citations, I will try to incorporate it into the article text. InverseHypercube 17:51, 3 April 2011 (UTC)
I can certainly provide you with references. But if I told you where I took it from would that be considered self-publicity by Wikipedia? I can do the other points I have mentioned, and others that I haven't.Bernhlav (talk) 09:34, 4 April 2011 (UTC)
No, it wouldn't be considered self-publicity, unless it linked to your own site, and even then if it was of an academic nature it wouldn't be a problem (see WP:IRS for how to identify good sources for Wikipedia). Please write all your points with references (and in an encyclopedic manner), and I'll see what I can do about adding them. Thanks for your contributions! InverseHypercube 22:58, 4 April 2011 (UTC)
I have added the references and have included part of the statistical thermodynamics, but I can't see it when I click "show preview".
Note that the "Gibbs entropy" is the fundamental definition of entropy; it also applies to degenerate gases. Bernhlav (talk)
You forgot to put some closing reference tags; that's why you couldn't see it. Anyways, I fixed it now. The statistical thermodynamics section seems to be unsourced, so it can't be added as it is now (Wikipedia is very strict about citing sources). The first part seems all good, though, so I'll be adding it soon. Thanks again! InverseHypercube 23:34, 5 April 2011 (UTC)
I have added the references. There must be a mistake, I didn't say the Gibbs entropy applies to degenerate gases: it doesn't. Actually, this could have been the motivation for Planck's search for a logarithmic form of the entropy that would be valid in every frequency interval from ν to ν + dν in his theory of black-body radiation. Bernhlav (talk) 11:48, 6 April 2011 (UTC)
I wrote that. And it does apply, but you then have to take the probability distribution to be over the set of the entire N-particle states of the gas. If you work with the individual molecules, then it obviously does not work. In statistical physics, it is standard to let the "i" in "P_i" refer to the complete state of the gas, not to the state of a single molecule. Count Iblis (talk) 14:55, 6 April 2011 (UTC)
The entropy of a degenerate gas is proportional to N, the number of molecules, which is not conserved, and not to the logarithm of it. Relativistically, it also follows that S and N are relativistic invariants, and are proportional to one another.

I don't understand "the complete state of the gas" as the problem concerns the combinatorics, i.e. how to distribute a certain number of particles among given states.Bernhlav (talk) 16:47, 6 April 2011 (UTC)

What you are doing is giving a treatment based on some assumptions that allow you to do statistics with the molecules themselves. This only gives the correct result in special cases. As you point out, you don't get the correct result in the case of, e.g., degenerate gases. However, within statistical mechanics we can still treat degenerate gases using the Gibbs formula for entropy, but this works by considering an ensemble of systems. So, you consider a set consisting of M copies of the entire gas of N molecules that you are looking at (you can also make the number of molecules variable and fix it later using the chemical potential, which appears as a Lagrange multiplier when you maximize the probability while fixing the average internal energy and average number of particles), and the probability distribution is over the set of possible N-particle quantum states of the entire gas. Count Iblis (talk) 17:46, 6 April 2011 (UTC)
Either the particle number is constant and the chemical potential is a function of temperature or the particle number is a function of temperature and the chemical potential is constant, which can also be zero in the case of a boson gas. The constraint of constant internal energy on the maximization of the multinomial coefficient is contrary to the assumption that the a priori probabilities of occupation of the states are equal. The Gibbs entropy is a limiting form as the Renyi entropy clearly shows. If the Gibbs entropy would apply to degenerate gases, Planck wouldn't have bothered looking for another logarithmic entropy in each frequency interval. The Gibbs entropy corresponds to the Poisson distribution which is a limiting form of the negative binomial and binomial distributions. Hence it cannot account for FD or BE statistics.Bernhlav (talk) 07:09, 7 April 2011 (UTC)
You can simply put the whole system in a box and keep internal energy and particle number constant. Then you deal with the mathematical difficulties of treating the Fermi gas, using various tricks/approximations that become exact in the N to infinity limit. You then find the usual Fermi-Dirac distribution. This is not a surprise, because in the thermodynamic limit the microcanonical ensemble is equivalent to the canonical and grand canonical ensembles. Count Iblis (talk) 14:16, 7 April 2011 (UTC)
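For reference, the standard grand-canonical route being described here, sketched in the usual textbook form (nothing beyond the ordinary maximum-entropy derivation is assumed): maximize S = -k Σ_i p_i ln p_i over the N-particle states i of the whole gas, subject to Σ_i p_i = 1, Σ_i p_i E_i = <E> and Σ_i p_i N_i = <N>. The Lagrange conditions give

    p_i = exp[-β(E_i - μ N_i)] / Ξ,   Ξ = Σ_i exp[-β(E_i - μ N_i)],

and for non-interacting fermions the grand partition function factorizes over single-particle levels ε, giving the mean occupation <n_ε> = 1 / (exp[β(ε - μ)] + 1), i.e. the Fermi-Dirac distribution obtained from the Gibbs formula applied to whole-gas states.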
There is a terrible confusion in the literature about a seeming equivalence of ensembles. Nothing of the sort, not even in the so-called thermodynamic limit. It is impossible to define temperature in the microcanonical ensemble, and anyone who tries (and there have been many!) is wrong. Tricks or not, if the entropies are different for different probability distributions, which they are, not one entropy will give all three statistics. And since there are only two distributions (binomial and negative binomial) which tend to a limiting distribution (Poisson) in the high-temperature limit, there is no such thing as intermediate, or para-, statistics. Bernhlav (talk) 16:14, 7 April 2011 (UTC)

If a system can be divided into two independent parts A and B, then

S_{A+B} = S_A + S_B.

Thus the entropy is additive. The number of particles in the systems is also additive. If the substance is uniform, consisting of the same type of particles at the same temperature, then any portion of it can be thought of as a sum of identical pieces, each containing the same number of particles and the same entropy. Thus the portions' entropy and particle number should be proportional to each other. So what is the problem? JRSpriggs (talk) 00:42, 15 April 2011 (UTC)
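The verbal argument above can be written compactly (a sketch; the per-piece notation is introduced here only for illustration): for independent parts, S_{A+B} = S_A + S_B and N_{A+B} = N_A + N_B; for a uniform substance cut into m identical pieces, S = m s_piece and N = m n_piece, so S/N is the same for every portion. Equivalently, extensivity means

    S(λE, λV, λN) = λ S(E, V, N),   i.e.   S = N s(E/N, V/N)

for some function s of the per-particle energy and volume.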

Just because two quantities are additive does not mean that they are proportional to one another! If you take an ideal gas, its entropy will be [eqn (42.7), Landau and Lifshitz, Statistical Physics, 1969]

S = N ln(eV/N) - N f'(T),

where f(T) is some function of the temperature, and the prime means differentiation with respect to it. If I multiply all extensive variables by some constant λ, then it is true that the entropy will be increased λ times, but the entropy has the form given above, so that it is not a linear function of N.

However, the entropy of black body radiation,

S = (4/3) a V T^3

[which is eqn (60.13) in the same reference], where a is the radiation constant, is proportional to the number of photons. The entropy has lost its logarithmic form. Moreover, the Renyi entropy

S_α = (1/(1-α)) ln Σ_i p_i^α

for all α > 0 is also additive (and concave in the interval 0 < α ≤ 1). It becomes the Gibbs-Shannon entropy in the limit α → 1, so that the Gibbs-Shannon entropy is a limiting form of the Renyi entropy. Not to mention the non-additive entropies, to which the Havrda-Charvat entropy belongs,

S_α = (1/(α-1)) (1 - Σ_i p_i^α),

which is also mentioned in the text under the name of the Tsallis entropy.

There is still the more fundamental reason why the Gibbs-Shannon entropy cannot be the most general form of the entropy: It corresponds to the Poisson distribution, which is a limiting distribution in the limit where N → ∞ and p → 0 such that their product Np is constant. So the Gibbs-Shannon entropy is not the most general form of entropy from which all others can be derived. Bernhlav (talk) —Preceding unsigned comment added by 95.247.255.15 (talk) 10:27, 17 April 2011 (UTC)
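The α → 1 limit invoked above can be made explicit with one L'Hôpital step (same notation as the Renyi formula above): both ln Σ_i p_i^α and 1 - α vanish at α = 1, and

    d/dα [ln Σ_i p_i^α] = (Σ_i p_i^α ln p_i) / (Σ_i p_i^α)  →  Σ_i p_i ln p_i,   d/dα [1 - α] = -1,

so S_α → -Σ_i p_i ln p_i, the Gibbs-Shannon form.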

To 95.247.255.15: I was just trying to respond to the claim that degenerate gases were an exception. I did not say that additivity was sufficient by itself. I was talking about uniform substances which can be divided into many small identical parts.
If you want me to comment further on this, then please explain why your equation for the entropy of an ideal gas includes the factor N in the denominator inside the logarithm. JRSpriggs (talk) 21:42, 17 April 2011 (UTC)
Hi JRSpriggs: All you need are the equations of state PV=NRT and E=(3/2)NRT; then the Gibbs-Duhem equation

N d(μ/T) = E d(1/T) + V d(P/T)

can be integrated to give

S = N s_0 + N R ln[(E/N)^{3/2} (V/N)],

where s_0 is a constant having the dimensions of an entropy density. The exponents in the logarithm must be such that the entropy be an extensive quantity. This is explained in H. Callen, Thermodynamics, 2nd ed., pp. 67-68. The N in the denominator is nothing more than the inverse probabilities in the Gibbs-Shannon entropy,

S = k Σ_i p_i ln(1/p_i).
Bernhlav (talk) 07:49, 18 April 2011 (UTC)

To Bernhlav: When varying N (quantity of substance), extensive quantities like E (energy) and V (volume) should not be held constant. Rather, it is the ratios e=E/N and v=V/N which are constant. Thus your formula becomes

S = N [s_0 + R ln(e^{3/2} v)]

as required. JRSpriggs (talk) 10:06, 18 April 2011 (UTC)
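A numerical check of the point under discussion (a sketch using the integrated form quoted above, with an arbitrary value for the constant s_0 and loose units, since only the comparison matters):

    from math import log

    R = 8.314                     # gas constant, J/(mol K)
    s0 = 20.0                     # integration constant from the form above (arbitrary illustrative value)

    def S(E, V, N):
        # Monatomic ideal-gas entropy in the form discussed above.
        return N * (s0 + R * log((E / N) ** 1.5 * (V / N)))

    lam = 2.0
    print(S(lam * 1000.0, lam * 1.0, lam * 1.0), lam * S(1000.0, 1.0, 1.0))  # equal: S is extensive
    print(S(1000.0, 1.0, 2.0), 2.0 * S(1000.0, 1.0, 1.0))                    # unequal: S is not linear in N alone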

To JRSpriggs: If you do that then you must also write s = S/N, and the particle number disappears completely! You can do the same by dividing all extensive variables by V. This is because the entropy is extensive. 87.3.220.56 (talk) 06:08, 19 April 2011 (UTC)
To 87.3.220.56: Thanks for your support. Can I take it that you also agree with me that Gibbs' formula is the fundamental definition of entropy? JRSpriggs (talk) 08:40, 23 April 2011 (UTC)
The fundamental definition of thermodynamic entropy is given by the second law. It is entirely macroscopic, and has nothing to do with statistical mechanics. Statistical mechanics describes the entropy in microscopic terms, and develops a mathematical equation which is equivalent to the thermodynamic entropy. Statistical mechanics does not define nor redefine entropy, it reveals the underlying microscopic and statistical nature of the thermodynamic entropy. PAR (talk) 14:30, 23 April 2011 (UTC)
Thermodynamics defines entropy in terms of the extensive thermodynamic variables. Its statistical counterpart is defined in terms of probabilities. A complete set of probabilities is not a complete set of extensive variables. Isn't that a re-definition? Bernhlav (talk) 15:20, 23 April 2011 (UTC)
Entropy in statistical mechanics is an integral over a complete set of probabilities, which is not the same thing. Also, these probabilities are assumed to be equal for each microstate, not measured to be equal. The fact that the calculated statistical entropy can be identified with the thermodynamic entropy supports the assumption and the whole train of reasoning. It's like you are picking through possible microscopic explanations of thermodynamic entropy until you find the right one, and accept it as an explanation, but it is not a redefinition. If statistical mechanics were to yield an explanation of entropy which is at odds with the thermodynamic definition, statistical mechanics would be wrong, not thermodynamics. Thermodynamics, and entropy in particular, are based on macroscopic measurements. Statistical mechanics seeks to give a microscopic explanation of those measurements. In the realm of thermodynamics, statistical mechanics makes no measurements of its own, and therefore cannot operationally redefine any thermodynamic concept. PAR (talk) 18:10, 23 April 2011 (UTC)
You can still observe correlation functions, fluctuations etc. that are beyond the realm of thermodynamics. So, statistical mechanics is more fundamental than thermodynamics, you could compare it to the relation between quantum mechanics and classical mechanics. Count Iblis (talk) 18:41, 23 April 2011 (UTC)
The 2nd law can be proven using the Gibbs-Shannon/microscopic definition. Cannot be proven from classical theory. Ergo we adopt Gibbs-Shannon as fundamental. -- cheers, Michael C. Price talk 19:32, 23 April 2011 (UTC)
Count Iblis - absolutely true. That's why I started the last sentence with "In the realm of thermodynamics...". Statistical mechanics is so successful in predicting and explaining the experimental facts and laws of thermodynamics that it may be considered a valid theory. But the realm of its applicability goes beyond thermodynamics, where it is again successful.
Michael C. Price - The second law in classical theory is primary: it is an experimental fact. If the Gibbs-Shannon definition failed to predict the second law, it would be wrong, not the second law. The fact that the Gibbs-Shannon definition implies the second law is proof that the assumptions made in the Gibbs-Shannon definition are valid, not that the second law is valid. In this sense, it does not provide a "proof" of the second law, it provides an explanation. PAR (talk) 20:55, 23 April 2011 (UTC)
I said I would add the information provided by Bernhlav to the article. Do any of you have revisions to make to it? Thanks. InverseHypercube 03:56, 24 April 2011 (UTC)
PAR - the 2nd law, using Gibbs-Shannon, follows from unitarity. -- cheers, Michael C. Price talk 05:29, 24 April 2011 (UTC)
Thus proving the validity of the assumptions made in the Gibbs-Shannon definition of entropy. Not proving the second law. PAR (talk) 15:34, 24 April 2011 (UTC)
What assumptions would those be? I don't see them. -- cheers, Michael C. Price talk 16:17, 24 April 2011 (UTC)

I think the most important one is the assumption that each microstate is equally probable (equal a priori probability). This is not as simple as it sounds, because for a thermodynamic process, in which things are changing, you will have to deal with the time element, i.e. how collisions produce and maintain this equal a priori probability. PAR (talk) 17:07, 24 April 2011 (UTC)

You don't have to assume that each microstate is equally probable. You can derive that from dS = 0. And that is derived from unitarity. -- cheers, Michael C. Price talk 17:21, 24 April 2011 (UTC)
Could you give a short expansion on what you are saying or recommend a book or article that would expand on it? I'm having trouble following your argument. PAR (talk) 19:18, 24 April 2011 (UTC)
Equiprobability at equilibrium follows from varying Gibbs-Shannon S(P_i) and using conservation of probability to constrain the dP_i. dS vanishes when P_i = constant. (For the unitarity argument see Second_law_of_thermodynamics#General_derivation_from_unitarity_of_quantum_mechanics.)-- cheers, Michael C. Price talk 20:21, 24 April 2011 (UTC)
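Spelling out the variation referred to in the previous comment (a minimal sketch; only the normalization constraint is imposed): vary L = -k Σ_i P_i ln P_i - λ(Σ_i P_i - 1), so that

    ∂L/∂P_i = -k(ln P_i + 1) - λ = 0   ⇒   P_i = exp(-1 - λ/k),

the same value for every i; normalization then gives P_i = 1/W, with W the number of accessible microstates.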
To PAR: You have made two incorrect statements: (1) "these probabilities are assumed to be equal for each microstate" This is not an assumption, it follows from Liouville's theorem (Hamiltonian). (2) "The second law ... is an experimental fact" No, it is only approximately true. I believe that experiments have been done which show very small systems changing from a higher entropy state to a lower entropy state. JRSpriggs (talk) 01:46, 25 April 2011 (UTC)
How do equiprobabilities follow from Liouville's theorem? -- cheers, Michael C. Price talk 02:19, 25 April 2011 (UTC)
At Liouville's theorem (Hamiltonian)#Symplectic geometry, it says "The theorem then states that the natural volume form on a symplectic manifold is invariant under the Hamiltonian flows.". In other words, the volume of a set of states in phase space does not change as time passes, merely its location and shape. So if we identify a micro-state as a certain volume in phase space (perhaps h^{3N}), then each micro-state will have the same probability in the long run (assuming perturbations cause mixing across orbits). JRSpriggs (talk) 02:39, 25 April 2011 (UTC)
  • To Michael C. Price - ok, I will read up on this. I have never read the Everett thesis, but it looks excellent.
  • To JRSpriggs - Yes, the further you are from the thermodynamic limit (infinite number of particles) the more error you encounter in the direct statement of the second law. I always take the second law to mean "in the thermodynamic limit...", so that's the experimental fact I am referring to. If you do not take this meaning for the second law, then our disagreement is only semantic. Regarding Liouville's theorem, I think your last statement "assuming perturbations..." is what I was referring to when I said "you have to deal with the time element...". Speaking classically, simply because the probability density of states in phase space remains constant (Liouville) without collisions does not imply that the probability becomes evenly distributed with collisions. PAR (talk) 03:15, 25 April 2011 (UTC)
  • To Michael C. Price - Reading the Everett thesis ([1]) - please tell me if you agree with this - define the information "entropy" of a pure wave function with respect to some operator as S = -Σ_i P_i ln P_i, where the P_i are the usual squared moduli of the projections of the wave function on the eigenvectors of the operator. Then a measurement will collapse the wave function, yielding minimum (zero) entropy - one P_i will be unity, the others zero. Unitary propagation of the wave function forward in time (using e.g. the Schroedinger equation) will cause these P_i to change, increasing the "entropy". This part is not obvious to me, but possible - For a system with many degrees of freedom, the "entropy" will increase until it reaches a maximum - i.e. all P_i are roughly equal (the larger the system, the less roughly). PAR (talk) 16:00, 25 April 2011 (UTC)
  • Everett's derivation of the increase in entropy (see pages 29/30, and c.127 in his thesis), doesn't rely on starting from a collapsed state. His proof is interpretation independent, so works (I think) whether you collapse the wf or not. It doesn't rely on the equiprobability assumption, which is something I just threw in because the topic surfaced here. -- cheers, Michael C. Price talk 16:22, 25 April 2011 (UTC)

I don't think that the traditional foundations of statistical mechanics are still being taken very seriously, see e.g. here. The reason why statistical mechanics works is still being vigorously debated. What I find interesting is the Eigenstate Thermalization Hypothesis (ETH), which basically boils down to assuming that a randomly chosen eigenstate of an isolated system with a large number of degrees of freedom will look like a thermal state. See here for an outline and here for a paper by Mark Srednicki on this topic. Count Iblis (talk) 15:44, 25 April 2011 (UTC)

The first (opening) paragraph is completely WRONG

"Entropy is a thermodynamic property that is a measure of the energy not available for useful work in a thermodynamic process, such as in energy conversion devices, engines, or machines. Such devices can only be driven by convertible energy, and have a theoretical maximum efficiency when converting energy to work. During this work entropy accumulates in the system, but has to be removed by dissipation in the form of waste heat"

The above first "Entropy" article (opening) paragraph is completely WRONG! Entropy is NOT a “measure of the energy not available for useful work” since entropy is an equilibrium property and work is energy in transfer during a process. Actually, the higher the system temperature, the higher the entropy (everything else the same) and the more potential for work with reference to the same surroundings (so work is not a property but a process variable). The second sentence is confusing and ambiguous, thus not accurate. Maximum efficiency is NOT “when converting energy to work,” but when work is obtained in reversible processes (like the ideal Carnot cycle). The last (third) sentence is confusing and plain WRONG: “During this(?) work entropy accumulates in the system” - the entropy does NOT accumulate (what system, where?) since entropy is not associated with work but with thermal energy per unit of absolute temperature. Actually, during the ideal Carnot cycle (max work efficiency) the entropy is not generated, but conserved: the entropy rate into the cycle is equal to the entropy rate out of the cycle. Not a good idea to start the Entropy article with confusing and ambiguous, thus inaccurate, statements. See entropy definition at: http://www.kostic.niu.edu/Fundamental_Nature_Laws_n_Definitions.htm#DEFINITION_of_ENTROPY —Preceding unsigned comment added by 24.14.178.97 (talk) 04:25, 14 February 2011 (UTC)

I am confused by the opening paragraph too. I agree with the above post. According to the stated definition of entropy, in a reversible thermodynamic process (0 change in entropy) 0% of the energy would not be available for useful work. So for a Carnot engine operating between two reservoirs at any two different temperatures (0 change in entropy) there would be 0% energy not available for useful work. That strikes me as being incorrect, because that would mean the engine was 100% efficient: i.e. 100% of the input energy would be available for useful work, which, of course, is not even theoretically possible (unless Tc = absolute zero).
Also: what is "convertible energy"? That is a term I am not familiar with. In thermodynamics there is heat flow (Q), internal energy (U) and work (W). Which of those three categories of energy does "convertible energy" fall into?
The author(s) of the opening paragraph appear to be confusing entropy with heat flow that is unavailable to do work. They are two very different things. As is correctly pointed out above, even in an idealized reversible thermodynamic process (entropy change = 0) there is heat flow that is unavailable to do work.
This is most unfortunate. How many thermodynamics students are being confused by this incorrect definition of Entropy?! AMSask (talk) 23:16, 11 April 2011 (UTC)
Count me as one. Worse yet, as someone that works with a lot of information and computer science, the definition of "randomness" used in the article is incredibly vague and useless, and impossible to relate to anything I already understand. "Random" to me means that multiple items in a series are mutually independent, so that any item cannot be used to determine the state of another item. That definition obviously has no meaning whatsoever in the context of the provided paragraph. Now I'm aware that there is a definition of entropy that relates to the number of available states in a system, and that "randomness" is sometimes used as a shorthand for that concept, and I believe that is where the author was trying to go, but he stops well short of clarity. To sum the problem up in a sentence, I think both the paragraph regarding "convertible energy" and the "randomness" bit can be replaced with the sentence "entropy is that thing, you know what it is, with the heat and the randomness and the time's arrow, yeah that" without any loss of information or clarity. Fortunately, I found this excellent write-up googling, which may be useful for other frustrated students looking at the talk page for the same reason I am, http://www.tim-thompson.com/entropy1.html -- Anonymous frustrated student. — Preceding unsigned comment added by 173.29.64.199 (talk) 21:26, 31 May 2011 (UTC)

I agree with this criticism, but a previous rewrite initiative to fix this failed, because most editors here want to edit from the classical thermodynamics perspective, and then you can't give a good definition of entropy. Perhaps we'll need to try again. Count Iblis (talk) 21:39, 31 May 2011 (UTC)

The disambiguation page for entropy shows there is a whole family of articles with the word entropy in the title. This article (Entropy) is the original one, and is still devoted almost exclusively to the classical thermodynamic application of entropy, even though there is a separate article titled Entropy (classical thermodynamics). The solution to the problem described above by Count Iblis would appear to be to convert Entropy into a brief introduction to the word to explain its many applications, and then direct readers to the various articles on entropy and allow the readers to choose which application they want to investigate. Everything in Entropy that is dedicated to classical thermodynamics should be merged into Entropy (classical thermodynamics) (or Entropy (statistical thermodynamics)).
Some examples of those various articles are:
Dolphin (t) 23:44, 31 May 2011 (UTC)

In general

The terms "in general" and "generally" are often ambiguous. For example, the section on The second law of thermodynamics begins:

"The second law of thermodynamics states that in general the total entropy of any system will not decrease other than by increasing the entropy of some other system." [italics added]

The last part of this section indicates (without reference from the above quote) an apparent exception:

"Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in a closed system. Although this is possible, such an event has a small probability of occurring, making it unlikely. Even if such event were to occur, it would result in a transient decrease that would affect only a limited number of particles in the system."

However, it is not made clear whether this is the only exception.

Also, any exceptions noted here should be made harmonious with the Second law of thermodynamics article.

The terms "in general" and "generally" appear elsewhere in the article, but this is arguably the one most in need of clarification.

Dorkenergy (talk) 07:26, 4 July 2011 (UTC)

I agree that use of the terms in general and generally can better be understood by recognizing that they are cliches than by looking for any precise meaning they might convey. On Wikipedia, good writers provide good content and better writers improve the prose by removing cliches and other speed humps. There are many science-oriented topics in which a universal truth must be carefully qualified. For example, the principle of conservation of mechanical energy is a universal principle but it must be qualified by saying "providing the only forces at work are conservative forces." It would be unacceptable for Wikipedia to take a short cut and say "In general, mechanical energy is conserved." Wikipedia articles must say what they mean. There are no short cuts. Dolphin (t) 07:55, 4 July 2011 (UTC)
How unlikely does an event have to be before one is permitted to say that it can never happen? This is the kind of issue one will get into if one tries to be more specific than just saying that entropy generally does not decrease. JRSpriggs (talk) 08:23, 4 July 2011 (UTC)
Greater specificity is certainly required where there is substantial controversy over the implications and applications of the concepts. In this instance, the religious vs. scientific debate on implications of the Second Law requires that greater specificity. Also, readers might reasonably question the relationship to virtual particles or other quantum mechanical considerations. Dorkenergy (talk) 14:06, 4 July 2011 (UTC)
I think you make some interesting points. Do you have a proposal for how the sentence using these terms can be reworded or what should be added to help clarify for the reader? Currently I think I understand what is meant, but I am certainly open to even better explanations. § Music Sorter § (talk) 23:17, 4 July 2011 (UTC)
If, in a system isolated from its environment, the entropy is S1 at time t1 and S2 < S1 and t1 < t2, then the probability of finding the system to have entropy S2 at time t2 is less than or equal to exp((S2 − S1)/k), where k is Boltzmann's constant (by my OR). Given how small Boltzmann's constant is, this probability might as well be zero for any noticeable reduction in entropy. JRSpriggs (talk) 08:33, 5 July 2011 (UTC)
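To give a feel for how prohibitive that bound is, here is a rough numerical sketch in Python (illustrative numbers only, using the exponential bound quoted above):

 import math

 k = 1.380649e-23   # Boltzmann's constant, J/K
 dS = -1e-12        # an entropy drop of only one picojoule per kelvin (illustrative)

 exponent = dS / k            # about -7.2e10
 print(exponent)
 print(math.exp(exponent))    # underflows to 0.0 in double precision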

It is impossible for us to cause a decrease in total entropy. However, very small decreases occur spontaneously, but very rarely and very briefly. If the system is not already at equilibrium, this tendency would be overwhelmed by the increase in entropy due to dissipation of free energy. Any attempt to use the spontaneous decreases to do useful work is doomed to failure because more work would be expended to identify and capture them than could thereby be generated. JRSpriggs (talk) 09:07, 14 July 2011 (UTC)

I now see that we have an article on this subject. It is fluctuation theorem. JRSpriggs (talk) 20:59, 25 August 2011 (UTC)
Also interesting. You can consider large fluctuations to a lower entropy state and then ask what the most likely time evolution that leads to such a fluctuation looks like. The answer is rather simple: it's simply the time reversal of the way the system would relax back to equilibrium, starting from the low entropy state. For large objects, say a house, that approach equilibrium by slowly falling apart bit by bit, you are led to conclude that the fluctuation that leads to its reappearance consists of a large number of fluctuations that conspire to build it up from dust. Count Iblis (talk) 01:32, 26 August 2011 (UTC)

Kinetic energy of a piston is not thermal energy. Thermal energy is not conserved.

"Thus, entropy is also a measure of the tendency of a process, such as a chemical reaction, to be entropically favored, or to proceed in a particular direction. It determines that thermal energy always flows spontaneously from regions of higher temperature to regions of lower temperature,"

The reality is that energy does not always remain thermal. In fact, a car's internal combustion engine converts some amount of thermal energy into vehicle motion. In no sense can a temperature be defined for a vehicle as a function of vehicle speed, tire speed, crankshaft speed, or any other speed of any rotating and/or reciprocating car part. It doesn't take long to realize that energy that leaves the system, or any energy that remains in the system in some form other than thermal energy, will contribute to TdS. All TdS is somehow accounted for in this way. I italicized the latter to emphasize the fact that not all TdS constitutes unusable "waste heat". The quantity TdS also encapsulates thermal energy that was lost due to expansion itself (which is in fact a mechanism capable of expansion work that requires no heat transfer whatsoever from any reservoir to any other reservoir for thermal energy to be lost; look it up at adiabatic process). Recovering any work done regeneratively (kinetic energy of pistons, vehicles, etc.), such as by running electric motors in reverse (e.g. regenerative braking), can reduce the rate at which entropy increases. This recovered non-thermal energy can indeed flow spontaneously from a lower to a higher temperature, because the direction in which non-thermal energy flows is not principally determined by temperature, but rather by inertia and by forces external to the body, which are largely unrelated to temperature, such as the force that pulls a cannonball to the Earth. Entropy generally rises because recovery of kinetic energy that was previously thermal energy derived from waste heat supplies only a tiny fraction of our energy resources. However, the idea that energy in general cannot flow from cold to hot spontaneously is clearly flawed. A windmill milling grain is a clear example of this fact, wherein only non-thermal energy plays a role as far as work done is concerned. Ask yourself, "What temperature of wind and what temperature of grain is relevant to the mere fact that a windmill operates?" That limitation only exists for thermal energy. And there are some exceptions to the rule even concerning thermal energy. (Example: shine a warm light onto a much brighter light, rather than the other way around, by simply inserting a wavelength-specific filter in between; though such an effect is of limited utility at this time, it does constitute an example where thermal energy can flow spontaneously from cold to hot. Again it's unlikely, but it's still spontaneous.) siNkarma86—Expert Sectioneer of Wikipedia
86 = 19+9+14 + karma = 19+9+14 + talk
23:19, 22 August 2011 (UTC)

Entropy as an apparent effect of conservation of information.

If entropy is considered an equilibrium property as in energy physics, then it conflicts with the conservation of information. But the second law of thermodynamics may simply be an apparent effect of the conservation of information; that is, entropy is really the amount of information it takes to describe a system: each reaction creates new information, but information cannot be destroyed. That means the second law of thermodynamics is not an independent law of physics at all, but just an apparent effect of the fact that information can be created but not destroyed. The arrow of time is thus not about destruction, but about the continuous creation of information. This explains how the same laws of physics can cause self-organization. Organized systems are not in any way less chaotic than non-organized systems, and the spill heat life produces can, in an information-physical sense, be considered as beings eliminated by evolution rather than a step towards equilibrium. It is possible that overload of information will cause the arrow of time to extract vacuum energy into usefulness rather than heat death. 217.28.207.226 (talk) 10:49, 23 August 2011 (UTC)Martin J Sallberg

You could reinterpret Loschmidt's paradox to say that the total information entropy of, let's say, a gas must remain constant due to time reversal symmetry. The paradox is that the thermodynamic entropy increases. The paradox is resolved by realizing that the total information entropy is equal to the marginal entropy (proportional to the thermodynamic entropy), in which correlations in velocity and position between particles are ignored, plus the mutual entropy (the entropy due to correlations), the sum of which is constant - as the thermodynamic (marginal) entropy increases, the mutual entropy decreases. So there is no creation of information entropy, and total information entropy is conserved. See Mutual information#Applications of mutual information. PAR (talk) 15:07, 23 August 2011 (UTC)
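A toy numerical check of that bookkeeping (an illustrative sketch, not taken from any source): for two correlated variables, the joint entropy equals the sum of the marginal entropies minus the mutual information, so the marginals can grow while the conserved total stays fixed, with the correlation term absorbing the difference.

 import math

 # Illustrative joint distribution p(x, y) for two correlated binary variables
 p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

 def H(dist):
     # Shannon entropy in bits
     return -sum(q * math.log2(q) for q in dist.values() if q > 0)

 px = {0: 0.5, 1: 0.5}   # marginal of x
 py = {0: 0.5, 1: 0.5}   # marginal of y

 Hxy = H(p)                # joint ("total") entropy, ~1.72 bits
 I = H(px) + H(py) - Hxy   # mutual information, ~0.28 bits

 print(H(px) + H(py), "=", Hxy, "+", I)   # marginal sum = joint + mutual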

I actually meant that each state and relative position the particles or waves or strings or whatever have been in is information, so the total amount of information increases over time, since information can never be destroyed. You falsely confused it with "information entropy"; what I mean is that entropy is not an independent property at all, but just an apparent effect of the fact that information is created but never destroyed. 95.209.181.217 (talk) 18:28, 23 August 2011 (UTC)Martin J Sallberg

psychological entropy is bullshit — Preceding unsigned comment added by 68.51.78.62 (talk) 01:42, 23 November 2011 (UTC)

Energy not available for work!?

This is nonsense. The Wikipedia article on energy defines it as "the ability a physical system has to do work on other physical systems". Therefore by definition, something which is not available for work is not energy.

Gcsnelgar (talk) 23:39, 22 December 2011 (UTC)

You are not quoting the article accurately. What it actually says is that entropy can be used to determine the energy not available for work in a thermodynamic process.
It is true that the article used to contain the inaccuracy you described, but that inaccuracy was criticised long and loud - see The first (opening) paragraph is completely WRONG. Despite the criticism nothing happened, so I eventually changed it myself. See my diff. Dolphin (t) 01:02, 23 December 2011 (UTC)

Area is not Entropy

Entropy is necessarily a concave function of the extensive variables. Saying that entropy is proportional to the area is the same as saying that it is a convex function of the (Schwarzschild) radius, which is proportional to the mass. The putative second law leads to incorrect inequalities (cf. arXiv:1110.5322). As was brought out in the black hole thermodynamics article, black hole 'evaporation' would lead to a violation of the supposed second law, which says that the area of a black hole can only increase. In fact, black body radiation cannot be used as a mechanism for black hole evaporation, since a heat source is necessary to keep the walls of the cavity at a given temperature, which produces thermal radiation. Thermal radiation is an equilibrium process, as Einstein showed back in 1917, and does not lead to irreversible processes such as those that would occur in processes related to evaporation.

According to black hole thermodynamics, the temperature would decrease with energy, and this does violate the second law (cf. http://www.youtube.com/watch?v=dpzbDfqcZSw).
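For reference, a minimal numerical sketch of the area-entropy relation under dispute, assuming the standard Bekenstein-Hawking formula S = k*c^3*A/(4*G*hbar) with A = 16*pi*G^2*M^2/c^4 (illustrative values only; it exhibits just the S proportional to M^2 scaling, the convexity in mass that the comment above objects to, and takes no position on the conclusions drawn from it):

 import math

 G, hbar, c, k = 6.674e-11, 1.055e-34, 2.998e8, 1.381e-23   # SI values

 def S_bh(M):
     # Bekenstein-Hawking entropy: S = k*c^3*A/(4*G*hbar), A = 16*pi*G^2*M^2/c^4
     return 4 * math.pi * G * k * M**2 / (hbar * c)

 M_sun = 1.989e30                        # solar mass, kg
 print(S_bh(M_sun))                      # ~1.5e54 J/K
 print(S_bh(2 * M_sun) / S_bh(M_sun))    # exactly 4: S grows as M^2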

Since when does a personal opinion - one which, as noted, is not even referenced - constitute a scientific publication? This section, together with reference 60, should be deleted from an otherwise respectable Wikipedia article. Bernhlav (talk) 23:01, 28 December 2011 (UTC)bernhlav.

Additions to Entropy and life

Entropy and life

For nearly a century and a half, beginning with Clausius' 1863 memoir "On the Concentration of Rays of Heat and Light, and on the Limits of its Action", much writing and research has been devoted to the relationship between thermodynamic entropy and the evolution of life. The argument that life feeds on negative entropy or negentropy was asserted by physicist Erwin Schrödinger in his 1944 book What is Life?. He posed: "How does the living organism avoid decay?" The obvious answer is: "By eating, drinking, breathing and (in the case of plants) assimilating." Recent writings have used the concept of Gibbs free energy to elaborate on this issue.[11] While energy from nutrients is necessary to sustain an organism's order, there is also the Schrödinger prescience: "An organism's astonishing gift of concentrating a stream of order on itself and thus escaping the decay into atomic chaos – of drinking orderliness from a suitable environment – seems to be connected with the presence of the aperiodic solids…" (What is Life?). We now know that the 'aperiodic' crystal is DNA and that the irregular arrangement is a form of information. "The DNA in the cell nucleus contains the master copy of the software, in duplicate." This software seems to control by "specifying an algorithm, or set of instructions, for creating and maintaining the entire organism containing the cell."[12] DNA and other macromolecules determine an organism's life cycle: birth, growth, maturity, decline, and death. Nutrition is necessary but not sufficient to account for growth in size, as genetics is the governing factor. At some point, organisms normally decline and die even while remaining in environments that contain sufficient nutrients to sustain life. The controlling factor must be internal, not nutrients or sunlight acting as causal exogenous variables. Organisms inherit the ability to create unique and complex biological structures; it is unlikely for those capabilities to be reinvented or to be taught each generation. Therefore DNA must be operative as the prime cause in this characteristic as well. Applying Boltzmann's perspective of the second law, the change of state from a more probable, less ordered, high-entropy arrangement to one of less probability, more order, and lower entropy, as seen in biological ordering, calls for a function like that known of DNA. DNA's apparent information-processing function provides a resolution of the paradox posed by life and the entropy requirement of the second law.[13] LEBOLTZMANN2 (talk) 15:51, 9 February 2012 (UTC)

Entropy is not Disorder!

Strenuous efforts are being made to correct the assumption that entropy equates to a degree of order (even in Schrodinger, as quoted above). Yet this article directly perpetuates that error (search 'order' in the main article). As WP is, for better or worse, likely to be the most-read source on entropy for non-physicists/chemists, can we please address this issue and assist the efforts to correct this popular misconception? Rather than being perpetuated in the article, I feel a paragraph directly addressing the misconception would be in order. I don't feel suitably qualified to make the correction myself, but a fundamental observation would be that the reagents in an exothermic chemical reaction are NOT (necessarily) more 'ordered' than the products + the heat energy emitted. Likewise, the heat-emitting crystallisation (increase in 'order') taking place in certain brands of hand-warmer is not likely to be a violation of the 2nd Law of Thermodynamics! Allangmiller (talk) 09:10, 29 April 2012 (UTC)

Entropy as disorder is a common way to attempt to impart an intuitive understanding of entropy. The problem is: what is the definition of order and disorder? This is never clearly explained when making the analogy, so the whole idea is rather vague, but useful as a start, realizing that eventually it falls apart. I agree, a discussion of the concept and its failings is in order, and your counterexamples would be useful in the "falls apart" section. I don't think we should declare it to be total trash, just a stepping stone to a clearer intuitive understanding of entropy. PAR (talk) 17:17, 29 April 2012 (UTC)
In the section entitled "Approaches to understanding entropy" the first sub-section is indeed called "Order and Disorder" - but the subsections which follow after that do try to explain entropy in other terms. --DLMcN (talk) 17:55, 29 April 2012 (UTC)

Difficulties to understand

I can't understand what entropy is from the article. I suggest that the article begin with a definition. What it begins with now I understand as a statement of what entropy can be used for, not a definition.

Then there are multiple definitions further down but after reading them I still unfortunately have no idea what entropy is. — Preceding unsigned comment added by 77.58.143.68 (talk) 14:08, 30 April 2012 (UTC)

I agree, but giving the only correct definition (i.e. that entropy is the amount of information needed to specify the exact physical state the system is in, given its macroscopic specification) has till now been found unacceptable by many of the editors here. Count Iblis (talk) 16:04, 30 April 2012 (UTC)
Count Iblis, if that's true, then it's sad. To be fair, though, it's not that simple. There is the artificiality of measuring temperature in units other than energy, which gives rise to the artificial constant of proportionality called Boltzmann's constant. This is not easily grasped by a new reader, so I don't think the new reader should be hit with that from the beginning. To the original question, entropy and its meaning in the context of thermodynamics is not easy to grasp intuitively. I've been studying it off and on for years, and I still don't completely get it. Taken alone, as information entropy, it's easier to understand, but it's the connection between information and thermodynamics that is mathematically clear, but not intuitively clear to me. Yet. PAR (talk) 00:19, 1 May 2012 (UTC)
Count Iblis and I have commented on these criticisms several times in the past. My most recent suggestions were about 11 months ago - see my diff. This is from a thread that is now available at Talk:Entropy/Archive11#The first (opening) paragraph is completely WRONG. Dolphin (t) 01:18, 1 May 2012 (UTC)

Decrease in "disorder" even in closed system

An example in nature would be helpful to understanding this concept better: "Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in a closed system. Although this is possible, such an event has a small probability of occurring, making it unlikely. Even if such an event were to occur, it would result in a transient decrease that would affect only a limited number of particles in the system." Thanks 165.212.189.187 (talk) 18:56, 23 July 2012 (UTC)

The second sentence is incorrect and I have removed it from the article. Entropy-decreasing events are more likely to affect a limited number of particles, but it is not impossible for a system-wide entropy-reducing event to occur. The Fluctuation theorem quantifies the occurrence of entropy-reducing events. An example would be statistical pressure fluctuations in a gas. Maximum entropy would imply a constant pressure. PAR (talk) 01:59, 15 September 2012 (UTC)
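A toy simulation of the kind of fluctuation described above (an illustrative sketch only): let particles of a "gas" hop at random between the two halves of a box; small imbalances in the count (local pressure fluctuations) occur constantly, but the largest relative imbalance shrinks rapidly as the number of particles grows.

 import random

 def max_imbalance(N, steps=10000):
     # Ehrenfest-style toy model: N particles hop between the halves of a box.
     left = N // 2
     worst = 0
     for _ in range(steps):
         if random.random() < left / N:   # a randomly chosen particle is on the left
             left -= 1                    # it hops to the right half
         else:
             left += 1                    # it hops to the left half
         worst = max(worst, abs(left - N // 2))
     return worst

 for N in (10, 100, 10000):
     print(N, max_imbalance(N))
 # The largest absolute imbalance grows far more slowly than N,
 # so the relative fluctuation worst/N shrinks as N increases.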

I don't like how it compares entropy to time. Time CAN theoretically move in more than one direction - like space - according to Einstein. Entropy can't.

2.216.69.181 (talk) 09:43, 21 July 2012 (UTC)

Evolution can be seen as reverse entropy. Patrickwooldridge (talk) 23:45, 13 September 2012 (UTC) ... Yes, an interesting point and suggestion, although I would be inclined to describe the development of Life (and its evolution) as a type of "compensation" rather than a reversal. --DLMcN (talk) 05:39, 14 September 2012 (UTC)

I was using the definition of "reverse" as "opposite, primarily in direction" to refer to the overall physical, thermodynamic process of evolution (and here I should qualify that I do not credit "natural selection" as the driving factor in evolution), rather than stating that a previously entropic process was reversing itself. Life appears to be simultaneously governed by entropy and the counter-entropic principle of evolution. I am curious why you suggest the term "compensation". Also, as I understand it, Professor Einstein asserted that time can be reversed in most equations without altering the outcomes, but not that time can be reversed in the observable universe. Patrickwooldridge (talk) 22:53, 14 September 2012 (UTC) ...> If we focus on heat transfer as the principal feature defining entropy, then we have to acknowledge that the evolution of life (and the development of progressively more complex forms) does not really involve heat exchange (or perhaps I am wrong? - I will give the matter more thought!) In any event, that was why "compensation" seemed to me to be an appropriate word. If, on the other hand, we focus on the "increase in disorder" which is sometimes associated with entropy, then maybe we could argue that the unfolding of new Life-forms is a 'flow in the other direction'? - could we? --DLMcN (talk) 15:23, 15 September 2012 (UTC)
You can actually think of heat transfer and increase in disorder to be the same thing. As an example, think of the Sun, which is the main energy source for nearly all of known life. Heat energy is dispersed from the Sun in the form of radiation. If you can picture a single moment of this radiation as a sphere of photons, growing in size as it leaves the Sun, you can see that the energy goes from a state of highly concentrated photons to a less concentrated state very quickly. This transmission of heat can be seen as going from a highly ordered state to a more and more disordered state. When this shell of energy reaches the Earth, a tiny fraction of it is absorbed by the cooler Earth, where it is then dispersed again, from hotter areas to cooler areas, or as longer wavelength radiation (IR) back out into space.
Now, if, say, a birch tree happens to be growing on the surface of the Earth, its leaves will absorb a small fraction of the Sun's energy, and that energy is turned into sugar. The heat energy has gone from a dispersing, disordered state and has been locked into an ordered state. This can be seen as a reversal of entropy. However, this energy that is now in the form of sugar is energy that is no longer being used to heat the planet. Therefore, the Earth itself is now partially shaded, providing a cooler area for the rest of its heat to go. The entropy of Earth is increased as a direct result of the decrease in entropy caused by the plant, so that the balance demanded by the laws of thermodynamics is conserved. Zaereth (talk) 20:07, 15 September 2012 (UTC)
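A rough way to attach numbers to that picture (illustrative temperatures, using the simple Clausius ratio Q/T rather than the full radiation-entropy formula): a joule of heat leaving the Sun carries far less entropy than the same joule re-radiated by the much cooler Earth, so the overall transfer increases entropy even if a plant locks part of the energy into an ordered form.

 Q = 1.0           # one joule of radiant heat (illustrative)
 T_sun = 5800.0    # approximate solar surface temperature, K
 T_earth = 288.0   # approximate Earth surface temperature, K

 S_out = Q / T_sun     # entropy leaving the Sun with that joule, ~1.7e-4 J/K
 S_in = Q / T_earth    # entropy when Earth re-radiates the same joule, ~3.5e-3 J/K

 print(S_in - S_out)   # net entropy increase, ~3.3e-3 J/K per joule transferred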
The main increase of entropy is in the sun. The nuclear reactions in the sun result in an increase of entropy there, more than offsetting any entropy decrease due to the growth of life. PAR (talk) 20:22, 15 September 2012 (UTC)
Thanks ... If, however, we are looking at the birth of a completely new life-form, i.e., a new species - then we would probably need to admit that we do not really know enough about the mechanisms involved, to be able to comment on what heat-transfer is involved? --DLMcN (talk) 21:23, 15 September 2012 (UTC)
All of this is highly theoretical, and it is important to note that many other theories exist, with the main constant being the laws of thermodynamics. I tend to cling to the theory that evolution occurs due to a need to adapt to the constantly increasing entropy, that is, to the ever-changing universe in which we live. Here's where you have to bring DNA into the discussion, and things get extremely complex. DNA used to be thought of as static and unchanging, and evolution as occurring due to anomalies, or "happy accidents," which occur in combining the different DNAs of the parents. Some have theorized that inbreeding has played a crucial role in this, and the huge variation in dogs, which were created by selective breeding by humans, seems to support this. A researcher in Hawaii came up with a similar conclusion, studying isolated patches of forest that had been cut off from each other by lava flows, similar to Darwin's finches, isolated on different, small islands. However, new research is emerging which seems to indicate that DNA may be more plastic than we had previously thought; more like a computer chip, which is full of switches that can be turned on and off, and that, perhaps, there is some conscious need also involved with evolution. In other words, how were those finches able to evolve such a variety of sophisticated, highly specialized beaks? Was it a happy accident, or was some other conscious need involved? And now we get into the theory of consciousness, and are getting way off topic for this article. So, the simple answer is, yes, there are definitely a lot more unknowns than knowns. Zaereth (talk) 01:09, 16 September 2012 (UTC)
Schrodinger (among many others) also shared this view, although he explained it as merely a local reversal in direction, like an eddy in a river. According to him, life itself depends on an increase in entropy in the surrounding universe, so, while it may be reversed locally, the overall entropy of the universe is thus increased to avoid any breaking of the second law of thermodynamics. You may find this book interesting, Evolution as Entropy by Daniel R. Brooks and E. O. Wiley, which seems to be available on Google Books. Zaereth (talk) 23:20, 14 September 2012 (UTC)
Yes, it's important to realize that life is not "counter entropic" and that the second law is not violated. In physics there are "deterministic" equations which are reversible in time, and there are statistical equations which may not be. Statistics means you are dealing with a situation where you do not have full information about the system, and entropy is ultimately a concept introduced to quantify that missing information and how to deal with it. If you have a collection of particles (e.g. a gas) and you have full information about the gas, then you can use deterministic equations to describe its behavior in time. Entropy never enters the picture, and the equations are reversible. If you describe the gas using less than full information (e.g. using only temperature, pressure, volume, etc.), then you have to describe the gas statistically. Entropy is then introduced as a concept to deal with this missing information, and the resulting equations are not time-reversible. (This is a "classical" description - when a quantum description is introduced, the situation becomes more subtle, but does not contradict the classical description when it is applicable.) PAR (talk) 01:51, 15 September 2012 (UTC)

This is a great discussion. I would encourage everyone reading or participating to rigorously distinguish between "Life" (whether individual organisms, species, or life in general) and "Evolution" (the physical force or principle which has led life from simpler forms to more complex, more ordered forms). Patrickwooldridge (talk) 23:58, 23 September 2012 (UTC)

"Entropy" for general readers

I don't want to disturb what appears to be a stable article of long standing, so I will make a suggestion for an addition to the lead section here first. The lead should be optimally accessible to a general reader. Entropy can be explained to a general reader, even in elementary school, with examples that require no technical or semi-technical language. I first heard about it with an example of a balloon full of smoke popping, whereby the gas spread out and got less well ordered. "Don't cry over spilled milk" and "you can't take the cream out of the coffee" both mean that there is a direction in time by which you can't put the milk back, because of entropy, or in more extreme perspectives, that the direction of time is entropy (lots of WP:RS on that, Reichenbach, Sklar, etc.). Putting WP:MOS Plain English examples like this in the lead would allow a general reader to quickly get the idea, but would change the technical tone of the existing stable article. I will add something like it unless there is an objection here, and put it up top because of MOS reasoning on the most accessible stuff being up top. 64.134.221.228 (talk) 12:56, 16 September 2012 (UTC)

First sentence

Are people opposed to editing the first sentence? It looks like it has been stable for a long time, but, as far as I can tell, it's not even grammatically correct. It doesn't really say anything. Any thoughts? Sirsparksalot (talk) 14:38, 18 September 2012 (UTC)

Personally, I agree. The sentence is very busy, with far too many slashes, and it requires too much prior knowledge of the terminology to be useful to the average reader. I think expanding it to become an opening paragraph of its own could be helpful. Here's how it reads now:
Entropy is the thermodynamic property toward equilibrium/average/homogenization/dissipation: hotter, more dynamic areas of a system lose heat/energy while cooler areas (e.g., space) get warmer / gain energy; molecules of a solvent or gas tend to evenly distribute; material objects wear out; organisms die; the universe is cooling down.
Here's what I would probably change to make it clearer:
Entropy is the thermodynamic property toward equilibrium. Entropy is the property that produces average, creates homogenization, and causes dissipation. Entropy arises from the second law of thermodynamics, which says that uneven temperatures will attempt to equalize. Although, according to the first law of thermodynamics, energy cannot be created or destroyed, the hotter, more dynamic areas of a system will lose heat or energy, causing cooler areas to get warmer or gain energy. Examples of entropy include thermal conduction, in which heat transmits from hotter areas to cooler areas, until thermal equilibrium is reached. It also includes diffusion, where molecules of solute in a solvent, or mixtures of gases or liquids, tend to evenly and irreversibly distribute. Material objects wear out, organisms die, and the universe is cooling down.
Does that seem to read any better? Zaereth (talk) 20:18, 19 September 2012 (UTC)
Well done Zaereth ! ... One minor typo: "causing cooler areas to get warmer" [or does your original conform with the rules of standard American English?] ... And a more significant point: perhaps replace your "and the universe is cooling down" [at the very end] with a reference to Wikipedia's Heat Death of the Universe. I know what you mean, but as you know, as long as hydrogen atoms still exist, they have the potential to fuse into helium (and so on up the scale) - releasing more heat, which (by the First Law) must go somewhere, even if some of it ends up locked inside black holes? ... (which is, in any event, a somewhat complicated concept?) --DLMcN (talk) 09:39, 20 September 2012 (UTC)
Yes, that was a typo. (I'm famous for my typos, "fat-fingers," and dyslexisms.) Thanks for catching that. I really like this type of collaboration, because ... you can actually look at the building of an encyclopedia as a reversal of entropy. As Dr. Ian Malcolm (Jurassic Park) said, "Life will find a way." I corrected that typo, so thanks again.
I would love to discuss black holes, and gravity in particular, but this isn't really the place for it. Black holes also seem to defy entropy, and many theories exist about that as well. I have no objection to adding a link to heat-death of the universe. However, unlike thermal conduction or diffusion, (which are knowns), heat death is still a theory, one which the theory of dark-energy may topple. Perhaps if we add a clause at the end of the sentence ... something like, "...and the universe is cooling down, possibly leading to the heat-death of the universe." That way we can present the view without necessarily endorsing it. Does that sound ok? Zaereth (talk) 13:08, 20 September 2012 (UTC)
But on average (i.e., taking it as a whole) I do not think we can really say that the universe is 'cooling down'. Individual stars, yes - as they become White Dwarfs and then burn out completely - but the heat which they progressively lose must go elsewhere. --DLMcN (talk) 13:46, 20 September 2012 (UTC)
When talking about the universe cooling, we're really referring to the cosmic microwave background radiation. In the beginning, this blackbody radiation was supposedly very hot and energetic, like visible light or even shorter wavelengths. As the universe spreads farther and farther into infinity, it becomes more and more spread out and dispersed, which equates to cooling. Now this background radiation has stretched from very hot light into microwaves, which indicates that the universe itself is within just a few degrees of absolute zero. Zaereth (talk) 17:27, 20 September 2012 (UTC)
Let's give this a cursory once-over:

Entropy is the thermodynamic property toward equilibrium.

  • That isn't even English. Words need to be added if it is to have even the potential to be meaningful.

Entropy is the property that produces average, creates homogenization, and causes dissipation.

  • Incorrect. The cumulative effect of umpteen-many random quantum state jumps (if one is describing reality using an approximation which allows them) is what tends to create homogenisation and cause dissipation. Entropy is (or rather can be) a measure of that homogenisation, or extent to which dissipation has occurred.
  • "Average" is a statistical measure used by analysts of the system. It is not something "produced" by entropy.

Entropy arises from the second law of thermodynamics,...

  • Err, no. The Second Law is a law about entropy. But entropy can sometimes still be defined in circumstances where the Second Law is hard to apply.

... which says that uneven temperatures will attempt to equalize.

  • That is not a standard statement of the Second Law. It would be hard to apply the statement, for example, to a domestic refrigerator -- a standard case for the application of the Second Law.

Examples of entropy include...

  • The examples which follow are all examples of entropy increase. They are not examples of entropy, which is a measure of a particular aspect of a system, or of part of a system, at a moment in time.
Sorry if it may seem as if I'm being pedantic, but part of the challenge of the lead section of an article is that above all that section needs to be clear and to be accurate. Jheald (talk) 14:56, 20 September 2012 (UTC)
... a challenge which I do not disagree that the present lead fails badly too. Jheald (talk) 15:00, 20 September 2012 (UTC)
I agree that it's not perfect, and there are many challenges in defining entropy. I also agree that the lede should be clear and accurate. However, the lede, by nature, is going to be somewhat vague and simplistic. Most of what is written there now adheres to the commonly accepted definition that entropy is: "The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity." One great problem is that there are many definitions of entropy, depending on what science you prefer. Macroscopic entropy is defined somewhat differently from microscopic entropy. Some consider it to be a mysterious driving force, while others consider it to be a measure of that mysterious force. Some scientists believe that there will never be a satisfactory definition, but I don't think the lede is the place for all of that confusion. Here is where we need to start with a certain, albeit vague, definition, so that the inexperienced reader will have some sort of context for understanding the body of the text. So, my question to you is, what would you change to improve the first sentence? Zaereth (talk) 16:50, 20 September 2012 (UTC)
Part of what people want from WP is to cut through vagueness and mystery -- where it's uninformed or unnecessary -- and instead get a good steer on what the scientific understanding of the term actually is.
So to start with, entropy is not a "force" or a "tendency". It is a measure of a particular aspect of a system, at a particular moment in time.
It is the universe which shows a tendency towards increased uniformity (for reasons which actually can be quite readily understood). Entropy is a numerical measure, that can be calculated, that quantifies that uniformity.
Straightening this confusion out is the most pressing thing that needs to be fixed in the proposed text.
More precisely, entropy quantifies the number of microscopic states of the system which are compatible with a particular macroscopic description. More uniform macroscopic states are associated with larger numbers of microscopic states. As the system gets jostled about, and gets bounced about from its initial microscopic state to "nearby" microscopic states, it is more likely to find itself in a macroscopic state associated with more microscopic states than fewer microscopic states. So the effect of that random evolution is to move the system to a state which macroscopically seems more homogeneous, closer to uniformity -- and makes any spontaneous move to a macroscopic state of less uniformity very unlikely. That's the heart of the idea of entropy, which may be worth setting out in the paragraph immediately after the fold -- with citations to informal introductions setting out the concept in this way. Jheald (talk) 17:33, 20 September 2012 (UTC)
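A small counting sketch of that last point (a toy model, not drawn from the article): for particles free to sit in either half of a box, the even split is compatible with vastly more microscopic arrangements than any lopsided one, so S = k ln(Omega) is largest for the most uniform macrostate.

 import math

 k = 1.380649e-23   # Boltzmann's constant, J/K
 N = 100            # toy system: 100 particles, each in the left or right half of a box

 def S(n_left):
     omega = math.comb(N, n_left)   # microstates with n_left particles on the left
     return k * math.log(omega)     # Boltzmann's S = k ln(omega)

 for n_left in (100, 90, 75, 50):
     print(n_left, S(n_left))
 # S climbs steadily toward the 50/50 macrostate, which is why random jostling
 # overwhelmingly drives the system toward uniformity and keeps it there.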
I've occasionally offered re-writes to clarify bits of the lead in the past, eg diff, or further back this, which I still think is quite a nice balanced intro. They mostly haven't stuck, so if it's all right with you, I'm quite happy to let somebody else have a go, and just try to offer encouragement and what are intended to be constructive comments from the sidelines. Jheald (talk) 17:46, 20 September 2012 (UTC)
Yes, but that type of statistical interpretation will be very difficult for the newcomer to understand. The actuality is that entropy is a mathematical model which serves to describe some property of a system which cannot be expressed simply in some other way. For example, here is a quote from the book The Entropy Principle: Thermodynamics for the Unsatisfied, by André Thess: "A simple example shows us this is not the case. When a stone falls into a well, the energy of the water increases by the same amount as the decrease in potential energy of the stone. Can this process run spontaneously in the reverse direction? It would certainly not violate the first law of thermodynamics if the water would spontaneously cool down a little bit and conspire to throw the stone into the sky. Nevertheless, our experience shows us that such a process doesn't ever occur. Our physical intuition suggests that "something" has been lost in the system consisting of the stone and the water. In what follows we will see that this "something" can be accurately described in mathematical terms and leads us to the concept of entropy."
The book Entropy and Information Theory gives yet another definition, and Thermal Engineering yet another. Chaos theory has yet another use for entropy. And then we come to "entropic forces", such as osmotic pressure or depletion. In considering all of this, we still have the problem of presenting the information in the best way that can be understood by a general audience. This is typically done in a non-linear fashion, with the simplistic, vague, elementary-school definitions coming first, and then expanding, from there on, in a pyramid fashion. You can find this spelled out in books like On Writing Well: The Classic Guide to Writing Nonfiction or Reading and Writing Nonfiction Genres. So how can we incorporate all of this into one sentence, which will give an accurate definition of entropy? Zaereth (talk) 18:41, 20 September 2012 (UTC)
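To attach rough numbers to Thess's stone-and-well example (illustrative figures only, using the fluctuation bound discussed earlier on this page):

 m, g, h = 1.0, 9.81, 10.0   # a 1 kg stone dropped 10 m
 T = 300.0                    # water temperature, K
 k = 1.380649e-23             # Boltzmann's constant, J/K

 Q = m * g * h     # ~98 J of potential energy ends up as heat in the water
 dS = Q / T        # ~0.33 J/K of entropy produced

 # The reverse process (the water cooling slightly and throwing the stone out)
 # would have to undo dS spontaneously; its probability is of order exp(-dS/k),
 # i.e. exp(-2.4e22) -- which quantifies the "something lost" in the quote.
 print(dS, dS / k)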
Note that a lead isn't necessarily meant to be an introduction. Per WP:LEDE, it is supposed to be a summarisation of the article -- how much of the content of the article can you summarise, restricted to four paragraphs (eg for Google, or someone else that just reproduces the lead, throwing all the rest of the article away). The article should certainly build up in as introductory and accessible a way as possible, but the place for that introduction to start is below the fold. The lead is supposed to summarise the whole article.
With regard to entropy, there is long-standing pedagogical discussion as to whether it is better to start from a macroscopic basis -- since the macroscopic idea of entropy can be developed entirely in macroscopic terms, with a microscopic picture only added later; or whether it is better to start from the microscopic picture, being more explicit, concrete and tangible, and therefore leading to a much more physically motivated, understandable sense of what is going on, and why this is how things work. Myself, I would tend to the latter; or at any rate bringing up the microscopic picture very quickly to explain why the macroscopic state function has the properties it does.
The macroscopic and microscopic views of entropy are fairly compatible. The chaos theory use of the term is rather different, and doesn't have much overlap. "Entropic force" follows fairly quickly given a good grasp of thermodynamic entropy, being the derivative of the free energy with respect to a parameter, the derivative being dominated by changes in the entropy-containing term in the free energy. Jheald (talk) 19:12, 20 September 2012 (UTC)
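As a concrete instance of that last remark about entropic forces (a sketch under the usual ideal-gas assumptions, not a statement about the article's content): taking only the volume-dependent part of an ideal gas's entropy, S(V) = nR ln(V) + const, the "force" T*dS/dV reproduces the ideal-gas pressure.

 import math

 R = 8.314      # gas constant, J/(mol*K)
 T = 300.0      # temperature, K
 n = 1.0        # one mole of gas (illustrative)

 def S(V):
     return n * R * math.log(V)   # volume-dependent part of the entropy only

 V, dV = 0.0246, 1e-8             # ~molar volume at 300 K and 1 atm, m^3
 P_entropic = T * (S(V + dV) - S(V)) / dV   # numerical T * dS/dV
 print(P_entropic)                # ~1.0e5 Pa, i.e. the ideal-gas pressure nRT/V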
Fair enough. However, this is not about the lede; it's about the first sentence. The question it needs to answer is: What is entropy? Forget, if you will, my attempt to expand upon what the sentence already says (in order to make it readable) just look at that way it's currently written. Now, imagine you're in front of an audience of people ranging from elementary-school students to college students to senior citizens; some know all about science and others know nothing. Now imagine that you've been assigned the task of defining entropy, but must do it in only one sentence. What would you say? Zaereth (talk) 20:31, 20 September 2012 (UTC)
(Personally, in that position I'd say FU, I'm not gettin' paid enough for this. :-) But seriously, this is the most important sentence; the sentence which determines whether the majority of an audience "reads on or just moves on." Does anybody have any ideas, because, as it reads now, it's borderline gibberish. As Jheald pointed out, it's not even a complete sentence. Anybody? "Entropy is..." Zaereth (talk) 01:18, 21 September 2012 (UTC)
I've had a look at the diffs, but would rather see a simpler definition in the lede, saving the actual math for the body. I do appreciate your help with this difficult problem. Perhaps someone else will come along and help us out. Zaereth (talk) 18:41, 20 September 2012 (UTC)

The only correct definition is this: "The entropy of a system is the amount of information needed to specify the exact physical state it is in." You can then decide in what units to measure this information, which defines the prefactor k in S = k Log(Omega). But it should be clear that if a system can be in Omega number of states, then Log(Omega) is proportional to the number of bits of information needed to point out exactly which of the Omega states the system actually is in.

With this fundamental definition of entropy it is also easy to make contact with thermodynamics, and then you also have a rigorous definition of heat, work and temperature in terms of fundamental concepts. The second law follows naturally from all of this. Count Iblis (talk) 02:23, 21 September 2012 (UTC)
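A minimal sketch of the unit bookkeeping in that definition (illustrative numbers): if a system could be in any of 2^80 microstates, pinpointing the actual one takes 80 bits, and the same count expressed as S = k ln(Omega) is a few times 1e-22 J/K.

 import math

 k = 1.380649e-23    # Boltzmann's constant, J/K
 omega = 2 ** 80     # suppose the system could be in 2^80 microstates (illustrative)

 S_bits = math.log2(omega)        # 80 bits of missing information
 S_thermo = k * math.log(omega)   # the same quantity in thermodynamic units, J/K

 print(S_bits)                          # 80.0
 print(S_thermo)                        # ~7.7e-22 J/K
 print(S_thermo / (k * math.log(2)))    # converting back recovers the 80 bits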

My suggestion:
Entropy is an artificially constructed quantity obtained by dividing the amount of heat supplied to an object (or emitted from one), by its absolute temperature. It has proved to be very useful in the development of thermodynamic theory. In layman's terms, it can be regarded as an indicator of the usefulness of a particular packet of energy. --DLMcN (talk) 05:30, 21 September 2012 (UTC)
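A worked instance of that "heat divided by absolute temperature" reading (illustrative numbers, treating the transfers as reversible):

 Q = 1000.0        # 1000 J of heat supplied
 T_cold = 300.0    # to an object held at 300 K
 T_hot = 600.0     # or to an object held at 600 K

 print(Q / T_cold)   # +3.33 J/K: the same heat adds more entropy to a cold object
 print(Q / T_hot)    # +1.67 J/K: than it adds to a hot one
 # This asymmetry is why heat flowing from hot to cold gives a net entropy gain,
 # and why the ratio can serve as an indicator of how useful a packet of energy is.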
Well that definition works, I suppose, for open systems, where energy is being added or taken away. I'm curious about Count Iblis' idea of starting with Shannon's entropy rather than Clausius'. This is not compatible with the way that the rest of the lede is written, but, if it is actually easier to tie into both macroscopic and microscopic entropy, then perhaps this is a better way. Such elucidation would need to come rather quickly, so I'm curious as to how the Count would tie this all together. Personally, if I was to start from scratch, rather than trying to expand what is already there, I might try to start with something like: "Entropy is a measure of the randomness of a system, the distribution of energy within a system, or the measure of uncertainty of the system. In thermodynamics, a closed system of uneven temperatures has low entropy, while a closed system at thermal equilibrium is at maximum entropy." or something like that. Then, perhaps a brief description of how it applies to statistical entropy, and finally information entropy, all in the first paragraph. In the next paragraph I would probably go into increasing entropy, how it applies to the second law, etc... Anyhow, I'll be gone for the next week or so, so I'll let you all work out a better definition. (It's really nice to be a part of this type of collaboration, for a change, and I thank you all for your responses.) Zaereth (talk) 06:10, 21 September 2012 (UTC)
I'm glad to see that someone took the initiative and changed the first sentence while I was away. It reads much better this way. It still doesn't cover every aspect, but doing so is trickier than I had originally thought. Perhaps I'll sleep on it some more, and try out some ideas, here, sometime in the future. Zaereth (talk) 23:41, 27 September 2012 (UTC)
@Zaereth: Sorry to have gone off-air on you last week. I'd started writing a reply, but then some quite pressing things IRL had to get dealt with.
I would agree that the new first sentence is a marked improvement on what was there before; and it may well be close to the best way forward. On the other hand, the way the definition (the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work) is true is perhaps a bit more subtle than the average reader first encountering the sentence might suspect, so I'm a bit concerned that if we're going to use this as our headline statement of entropy, without misleading or confusing people, then the statement needs to be unpacked a bit, and the sense in which it is true needs to be explained somewhere -- perhaps the first paragraph below the fold; or an explanatory footnote, perhaps.
To be a bit clearer about what I'm getting at: the statement we now give is true, but only when two rather non-obvious points are understood. First, the temperature in "per unit temperature" is not the system temperature, but is the temperature of the coldest accessible heat sink the system can dump heat into. Secondly, the "thermal energy unavailable to do work" is also not a state variable of the system: rather, it too changes, depending on how cold a heat sink can be found. Contrary then to what might be first impressions, neither of these quantities is therefore a state variable, but it is the surprise proposition of classical thermodynamics that their ratio S is a state variable.
"A measure of the energy unavailable to do useful work" is a standard first-line definition of entropy in dictionaries of physics and dictionaries of science. As a form of words, it always makes me uncomfortable though, because entropy is absolutely not an energy. For that reason, I do think our new "unavailable energy per unit temperature" is better. But if we're going to run with that, I do think we need to make clear: temperature of what; and that (unlike the entropy) the availability/unavailability is not a state property depending only on the system state itself.
The other thing I'd say is that there does seem now to be a huge amount of repetition and duplication in the article as it now stands, without (to me) any very clear sense of shape or structure. It seems to me that a few years back the article was much more streamlined and clearly structured, and all-in-all in better shape. Since then it looks as if people may have copied and pasted in various headline sections from various other articles, without much sense of the structure and shape of what was already here. So some judicious pruning and shaping and trimming would (I think) be very welcome. Jheald (talk) 07:23, 29 September 2012 (UTC)
One other tiny thing that for me just slightly jarred is the phrase per unit of temperature. I'm happy to talk about "per unit" of some extensive quantity like volume or mass, that can be parcelled up into separate pieces; or even "per unit" of a temperature range, where the change in temperature can be parcelled out degree by degree. But here the resonance seems just slightly wrong, because we're not thinking of either the system temperature or the reservoir temperature changing, so any association with stepping the temperature unit by unit seems just slightly deceptive. Jheald (talk) 10:56, 30 September 2012 (UTC)
Hi Jheald. Thanks for your response. I understand about real life because it often gets in my way too, so no need to apologize. In fact, I'm rarely online on weekends, and only happened to stop by my computer today.
You make a valid point, although I might also point out that the main noun in the sentence is "measure," which leads to the question, "A measure of what?" So, as written, the sentence doesn't say that entropy equals energy, but that entropy is equivalent to a measure of energy states. (The difference between energy states, perhaps?) My typical M.O. is to let it stew in my subconscious for a while, and come back when an idea of how to concisely word something comes to mind. I'll think about your suggestions while I'm gone for the next few days, and perhaps I may be able to come back with an idea or two we can toss around. Thanks again for your input and any ideas you can think of. Zaereth (talk) 22:17, 29 September 2012 (UTC)
Np. I'm aware that I never got back to you with what my ideal clean and succinct "one line" explanation of entropy would be; indeed I'm not sure I have one. For a slightly longer pitch, I think I'd probably still cleave to something pretty close to the diff I suggested before, that you weren't so impressed with, which to me combines a number of standard one-line definitions -- "it's a state function"; "it's a measure of unusable energy"; "it's a measure of disorder/mixedupness"; "it's the amount of uncertainty remaining given a macroscopic specification" -- in a way that I hope presents them understandably and as compatible and related. There's probably a bit more that should be added or amplified, that may make the lead feel slightly dissatisfying until it is there; but this I still think is my best shot, so I'm happy to leave it for others to take up and take further or not as they wish.
Taking a look at other snapshots of the article in (say) six-monthly jumps may also throw up some other alternative approaches that might be possibilities to re-consider, going forward. (As well as perhaps being of interest in its own right, as a case-study of how a text that particular people feel isn't quite right yet can end up getting blown about, hither and yon). Jheald (talk) 11:28, 30 September 2012 (UTC)

I've been thinking about this since my last post here. I've read through lots of books and websites alike. I thought it would be easy enough to come up with one sentence which could sum up what entropy means, but, it turns out, there are as many different definitions as there are people giving them. I'm beginning to agree with Swinburne, in that perhaps we still lack the vocabulary, or even the fundamental understanding of the concept, to put it eloquently in words. (In fact, check this out if you want to see what this exact discussion looked like a hundred years ago.)

In all of this, however, there appears to be one definition that all branches of science can agree upon. It almost seems too obvious, yet it may be like the forest that cannot be seen through all of the trees. Perhaps it may be best to start by saying that entropy is a ratio (joules per kelvin) or a rate measurement, similar to miles per hour (velocity), cents per dollar (profit margin), or ounces per gallon (mixing rate). Maybe, if we start with a very simple statement like that, the rest of the lede and, hopefully, the article will make more sense to the newcomer. (Actually, I think the analogy to profit margin is a rather good one for negentropy, because it's easy for anyone to visualize, whereas entropy is more like the expense margin.) Zaereth (talk) 01:28, 4 January 2013 (UTC)

Saying entropy is a ratio says nothing, really. I'm in favor of trying to draw on people's intuitive notion of entropy in the lede. If you look at a movie running forward and backward (sound off), you can tell when it's running forward and when it's running backward. Your intuitive understanding of entropy allows you to make that decision. No law of physics is broken in a backward-running movie except the second law of thermodynamics, which is essentially a definition of entropy. Broken glass reassembles, beer jumps out of a glass into a pitcher, melted ice unmelts, etc. PAR (talk) 21:56, 4 January 2013 (UTC)
I understand that, and I agree. However, we have to start somewhere. The question is, how do we get from here to there? Nearly every book on non-fiction writing recommends the same thing: Start with a sentence that is all-encompassing, even though it may be rather vague, and then immediately expand on that definition. Since everyone can seem to agree on the mathematical definition, perhaps it is best to start with that. Just as it is meaningless to describe speed in terms of distance only, the reader should be made aware, right off the bat, that it is meaningless to describe entropy in terms of temperature or energy alone. Entropy is a concept that requires a comparison of the two.
I should note that I'm only thinking about adding a sentence to what we already have, in order to clarify what the next sentence (which is currently the first sentence) means. Of course, this "pointing out the obvious" is mainly for the newcomer, but it may help give them a point of reference which can help elucidate the rest of the text. Zaereth (talk) 22:53, 4 January 2013 (UTC)
  1. ^ B. H. Lavenda, "A New Perspective on Thermodynamics", Springer, 2009, sec. 2.3.4.
  2. ^ S. Carnot, "Reflexions on the Motive Power of Fire", translated and annotated by R. Fox, Manchester University Press, 1986, p. 26; C. Truesdell, "The Tragicomical History of Thermodynamics", Springer, 1980, pp. 78-85.
  3. ^ J. Clerk-Maxwell, "Theory of Heat", 10th ed., Longmans, Green and Co., 1891, pp. 155-158.
  4. ^ R. Clausius, "The Mechanical Theory of Heat", translated by T. Archer Hirst, van Voorst, 1867, p. 28.
  5. ^ M. Planck, "Theory of Heat", translated by H. L. Brose, Macmillan, 1932, part 4, chapter 1.
  6. ^ L. Boltzmann, "Lectures on Gas Theory", translated by S. G. Brush, Univ. California Press, 1964, part 1, sec. 6; P. Ehrenfest and T. Ehrenfest, "The Conceptual Foundations of the Statistical Approach in Mechanics", translated by M. J. Moravcsik, Cornell Univ. Press, 1959, sec. 12.
  7. ^ W. A. Whitworth, "Chance and Choice", 3rd ed., Deighton Bell, 1886.
  8. ^ M. Planck, "The Theory of Heat Radiation", translated by M. Masius, American Institute of Physics, 1988, p. 145.
  9. ^ R. C. Tolman, "The Principles of Statistical Mechanics", Oxford Univ. Press, 1938, sec. 23.
  10. ^ B. H. Lavenda, "Statistical Physics: A Probabilistic Approach", Wiley-Interscience, 1991, chapter 1.
  11. ^ Higgs, P. G., & Pudritz, R. E. (2009). "A thermodynamic basis for prebiotic amino acid synthesis and the nature of the first genetic code". Accepted for publication in Astrobiology.
  12. ^ Nelson, P. (2004). "Biological Physics: Energy, Information, Life". W. H. Freeman and Company. ISBN 0716743728.
  13. ^ Peterson, Jacob, "Understanding the Thermodynamics of Biological Order", The American Biology Teacher, vol. 74, no. 1, January 2012, pp. 22-24.