Talk:Fluctuation theorem

From Wikipedia, the free encyclopedia

Untitled

I tagged this page "too technical" because, even though I don't have a physics degree, it should still be able to inform me as to why Loschmidt's paradox (which I could understand) isn't a problem. ~~ N (t/c) 04:21, 6 November 2005 (UTC)

I hardly understand anything either. How is entropy defined in this article? What does an average over entropy production mean, since entropy is already defined by summing over the whole distribution of states? It seems that this article somehow associates an entropy to every microstate, or something like that. ThorinMuglindir 19:54, 6 November 2005 (UTC)

Theorem Assumptions: Time Reversal Symmetry

The list of assumptions required to prove the Fluctuation Theorem is very interesting (part of the "Summary" section toward the end of the article). There was one thing that was unclear to me. Quoting the article, "In regard to the [assumption of time reversal symmetry], all the equations of motion for either classical or quantum dynamics are in fact time reversible." If I remember my physics correctly, to reverse the trajectory of a charged particle in a magnetic field, not just its velocity but also its charge must be reversed. Does that mean that a collection of charged particles in an external magnetic field will not obey the Fluctuation Theorem (and by extension the Second Law of Thermodynamics) unless they are capable of charge reversal? Or is the assumption of time reversal symmetry somehow independent of (lack of) charge reversal? Compbiowes 00:38, 4 October 2006 (UTC)

About the magnetic field

Well, I think if you treat the magnetic field as one generated by current flow or moving charges, then time reversal will also imply reversing the direction of current flow, and thus reversing the magnetic field too.

That's a good point. What I wonder about, though, is the time scale. If we mix classical and quantum ideas then it seems that the "current flow" can be "stuck" in a particular direction. More concretely, a permanent magnet can maintain its field for a very long time. Is it relevant to the Fluctuation Theorem that a system of charged particles in an external field generated by a permanent magnet would have to wait a long time for the field to reverse? If the system of charged particles could violate the Second Law and bring about a decrease in entropy as long as the field lasted, could this decrease in entropy then be used to regenerate the field? Compbiowes 00:50, 19 October 2006 (UTC)
As I tried to explain in an edit, reversibility means that for any system you choose, it would be possible to create a different system with different initial conditions whose evolution over time would look like a reversed movie of the first system. And the magnetic field of a permanent magnet is generated by the "spin" of the electrons; if you picture the spin in classical terms, then a backwards movie would mean reversing the direction of all the spins. Even though quantum mechanics doesn't actually allow you to think of spin as an electron spinning on its axis or orbiting the nucleus, I think that substituting -t for t in the equations would mean reversing all the spins in QM as well, so the second system would look like the first but with the field pointing in the opposite direction. Hypnosifl 07:10, 20 October 2006 (UTC)
I can't add any more to this discussion myself without further study. I did email Denis Evans, though, and his unofficial answer was "We have only done a little on [the Fluctuation Theorem for systems in external magnetic fields]. What happens is that you need something slightly more complex than the time reversal map. It will all work but the mappings may be slightly different." Compbiowes 20:15, 20 October 2006 (UTC)
I've seen it stated by numerous physicists that both the laws of classical electromagnetism and the laws of quantum electrodynamics exhibit time-symmetry, meaning the fundamental equations of motion are unchanged by a reversal of which time direction you label positive and which you label negative. If you email Denis Evans about this precise question I'm sure he'll confirm it; the quote above may be talking about issues related to some specific experimental setup, but I doubt he's arguing that the fundamental equations governing the situation fail to exhibit time-symmetry. Hypnosifl 20:40, 20 October 2006 (UTC)
A phase difference between oscillations at small time and distance scales allows those oscillations to interfere, producing a collective oscillation at larger time and distance scales.
If the universe exhibits cyclical behavior at larger time scales and distances than currently understood, time-reversible laws may create an illusion of irreversibility. If the universe here and the universe 10 billion light years away have vastly different densities (which would disprove the cosmological principle), then it is possible that a region with a density very close to that of a black hole event horizon (i.e. to the corresponding Schwarzschild radius) could serve as a sink for entropy flux density (W K⁻¹ m⁻²), and thereby recover otherwise "missing" potential to do work. siNkarma86 (talk) 00:39, 2 April 2013 (UTC)

Is "time-averaged irreversible entropy production" just (change in entropy)/(time)?

I would think it's just equal to delta-S (the change in entropy) divided by delta-t (the time interval), but I'd like some confirmation from an expert. If this is true, then it might help make the article slightly more accessible if it were mentioned somewhere. This would also mean that the fluctuation theorem could be restated in terms of changes in entropy, so that for any given time interval, the ratio between (probability that entropy change is +delta-S) and (probability that entropy change is -delta-S) would be e^(delta-S). Hypnosifl 18:04, 20 October 2006 (UTC)

Yep, it seems so. "Entropy production", I think, is misleading; it would be better called the entropy production rate. I found another paper that refers to A as the entropy creation rate.
This suggestion would allow rewriting the first paragraph to make it less technical and more accessible. I suppose the current text (implicitly) normalizes entropy by dividing by the Boltzmann constant? Benjamin.friedrich (talk) —Preceding undated comment added 08:20, 19 December 2018 (UTC)
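If that restatement is right, the ratio P(ΔS = +a)/P(ΔS = −a) = e^a (entropy measured in units of the Boltzmann constant) can be illustrated with a few numbers. A minimal sketch, with the values of a chosen arbitrarily:

```python
import math

# Fluctuation theorem restated in terms of entropy change over a fixed
# time interval, with entropy in units of k_B:
#     P(dS = +a) / P(dS = -a) = e^a
for a in (0.1, 1.0, 10.0, 100.0):
    ratio = math.exp(a)
    print(f"dS = {a:6.1f} k_B  ->  forward/backward probability ratio = {ratio:.3e}")

# A macroscopic entropy decrease has a of order 10^23, so the suppression
# factor is astronomically large; only small systems over short times show
# measurable "violations".
```

This also makes the point in the thread above concrete: the asymmetry is mild for entropy changes of a few k_B but overwhelming at thermodynamic scales.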

About the relation between entropy and information

Hi. I've drawn this graphic. I'd like to know your comments about the idea it describes; thank you very much.

--Faustnh (talk) 00:07, 29 March 2009 (UTC)

(Also posted at Entropy and information talk page). --Faustnh (talk) 18:14, 29 March 2009 (UTC)

It looks like -∫ u(x) log(u(x)) dx is more negative in the lower picture, so yes, I agree. There's less entropy in the lower picture. Oddly, that means the upper picture represents more states! 89.217.26.52 (talk) 22:40, 2 February 2015 (UTC)
"Water" implies H2O, which implies Avogadro's-number-scale entropies. A surface wave adds dozens of bits of information. The scales are incomparable. It's "true" in some sense, but misleading. 67.198.37.16 (talk) 05:54, 24 March 2024 (UTC)
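The integral discussed above, -∫ u(x) log u(x) dx, can be checked numerically for two densities of different spread. A minimal sketch, where the Gaussian profiles are my own stand-ins for the two pictures, not taken from the graphic:

```python
import numpy as np

# Differential entropy H[u] = -integral of u(x) log u(x) dx, computed on a
# grid for a broad density and a narrow (more concentrated) one.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]

def gaussian(x, sigma):
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def diff_entropy(u):
    nz = u > 0  # skip zeros: u log u -> 0 there anyway
    return -np.sum(u[nz] * np.log(u[nz])) * dx

H_broad = diff_entropy(gaussian(x, 2.0))    # flatter profile
H_narrow = diff_entropy(gaussian(x, 0.5))   # concentrated profile
print(H_broad, H_narrow)  # the broader density has the larger entropy
```

For a Gaussian the exact value is (1/2) log(2πe σ²), so the numerical result can be checked against it; the concentrated profile indeed comes out with less differential entropy, matching the comparison made in the thread.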

Is this Crooks' or Gallavotti-Cohen? No mention of Onsager?

I was trying to get background for another article, http://en.wikipedia.org/wiki/Law_of_Maximum_Entropy_Production, for which I continue to solicit input, but is this FT the same as Crooks' or Gallavotti-Cohen? Nerdseeksblonde (talk) 12:39, 26 September 2009 (UTC)

Backstepping of biological nanomachines

The article states that biological machines such as molecular motors can occasionally run in reverse. A reader (like me) might then think that a backstep and a reversed forward step are the same. However, I fear this simple view is wrong: it is not at all clear that a motor produces ATP during a backward step. This is probably true for F1-ATPase, but not for kinesin. It would be good to have a clarifying sentence on this in the article.

Some Literature:

Nishiyama et al. Nat Cell Bio. 2002 "Chemomechanical coupling of the forward and backward steps of single kinesin molecules"

Taniguchi et al. Nat Chem Bio 2005 "Entropy rectifies the Brownian steps of kinesin"

Carter et al. Nature 2005 "Mechanics of the kinesin step"

I've modified the text; I hope it is clearer now. Conjugado (talk) 14:29, 3 April 2013 (UTC)

How is the term "entropy" defined in this case?

I have seen many statistical-mechanical definitions of entropy, which are not all equivalent, or which have subtle distinctions in meaning. All definitions are obviously designed to respect thermodynamics in the thermodynamic limit; however, they differ significantly for microscopic systems. I would imagine that the precise definition of the quantity "entropy" is extremely important for the fluctuation theorem, yet it is not given anywhere in this article. --Nanite (talk) 15:10, 14 September 2013 (UTC)

Heh. I've read journal articles where authors accuse each other of being off-by-one in their definition of entropy, leading to erroneous conclusions of "negative entropy" in low-temperature ordered systems. The subtleties are subtle. Yes, point taken. 67.198.37.16 (talk) 05:59, 24 March 2024 (UTC)

The often-mentioned problem with defining entropy

What is the "averaged entropy" in this article? Since entropy is already an average, I agree with other editors that a very precise definition of entropy is needed to make this meaningful.

Here is an attempt.

The microstates (say 10^10^10 of them) are grouped into collections called states (say 10^10), characterized by thermodynamic variables such as p, V, T, S, U. A microstate follows an exact physical trajectory a(t), and a state obeys macroscopic (thermodynamic) laws that prescribe a trajectory M(t). (Or that merely restrict the allowable trajectories M(t).)

Then: for any fixed s, t, all but a tiny fraction of microstates "obey thermodynamics", that is, if a(s) ∈ M(s) then a(t) ∈ M(t).

A tiny number of microstate trajectories "jump state", that is, they shift from one macroscopic trajectory M(.) to another one M'(.) during the time interval [s,t]. The macroscopic variables of a(.) change in a way that is impossible according to the macroscopic laws. So the macroscopic laws get broken. An example is all the gas going into one corner of the room. 89.217.26.52 (talk) 22:34, 2 February 2015 (UTC)
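How tiny "a tiny number" is can be estimated with the standard toy count: for N non-interacting molecules, the fraction of equal-weight microstates with every molecule in one half of the box is 2^(-N). A minimal sketch, with the values of N chosen arbitrarily:

```python
import math

# Fraction of equal-probability microstates with all N molecules in the
# left half of the box: (1/2)^N.  The corresponding entropy drop is
# dS = -N * k_B * ln 2.
for N in (10, 100, 10**23):
    log10_fraction = -N * math.log10(2)
    print(f"N = {N:.1e}: fraction of such microstates = 10^({log10_fraction:.4g})")

# N = 10 gives about 1 in 1000: such fluctuations are seen in small systems.
# N = 10^23 gives 10^(-3e22): never seen on any macroscopic timescale.
```

This is the quantitative content of "all but a tiny fraction of microstates obey thermodynamics": the state-jumping trajectories exist, but their measure shrinks exponentially in particle number.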

Confusing or mistaken viscoelastic example

It is important to understand what the Second Law Inequality does not imply. It does not imply that the ensemble averaged entropy production is non-negative at all times. This is untrue, as consideration of the entropy production in a viscoelastic fluid subject to a sinusoidal time dependent shear rate shows.[clarification needed] In this example the ensemble average of the time integral of the entropy production is however non negative - as expected from the Second Law Inequality.

This seems to contradict the immediately preceding paragraph, which states that the averaged entropy is nondecreasing.

Indeed, the viscoelastic fluid example seems like a red herring. It misinterprets the claim of the article: the claim is that the ensemble-averaged, instantaneous rate of entropy production is nonnegative for all times, not just integrated over user-selected time windows.

If the thesis in this article is wrong, it needs an attack in principle (from published sources, of course). It won't be disproved or "qualified" by a fly-by example with no explanation OR citation. In any case I suspect the example is not a closed system! It sounds like a driven system. But like most any reader, I can't tell from the amount of detail given. The example should simply be removed for now. 89.217.26.52 (talk) 22:34, 2 February 2015 (UTC)
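Setting aside whether the viscoelastic example belongs in the article, the purely mathematical distinction it turns on, an instantaneous rate that dips negative while its time integral stays non-negative, is easy to sketch with a toy entropy-production rate. The sinusoidal form below is my own illustration, not taken from any source on viscoelastic fluids:

```python
import numpy as np

# Toy "ensemble-averaged entropy production rate" under a sinusoidal drive:
# sigma(t) = a + b*sin(w*t) with b > a > 0.  The rate goes negative during
# part of each cycle, yet its integral over whole periods is positive.
a, b, w = 1.0, 2.0, 2 * np.pi        # b > a, so sigma(t) < 0 at some times
t = np.linspace(0.0, 3.0, 300001)    # three full periods of the drive
sigma = a + b * np.sin(w * t)

dt = t[1] - t[0]
integral = np.sum(0.5 * (sigma[:-1] + sigma[1:])) * dt  # trapezoid rule

print(sigma.min())   # negative: the instantaneous rate dips below zero
print(integral)      # about a * 3 = 3: the time integral stays positive
```

Whether a physical driven system can actually realize a negative ensemble-averaged instantaneous rate is exactly the point disputed above; the sketch only shows that the two statements ("non-negative at all times" versus "non-negative time integral") are logically distinct.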