Friday, 2 January 2015

statistical mechanics - Isn't the second law of thermodynamics just the law of large numbers in disguise?


Loosely speaking,





  • The second law of thermodynamics (SLT) says that as an isolated system evolves, its macrostate tends toward the one realized by the greatest possible number of (macroscopically indistinguishable) microstates.




  • The law of large numbers (LLN) says that as the number of observations of a random event increases, the average of the observed values tends toward their expected value... which is (asymptotically) $1/n$ of the sum value that can be obtained in the greatest possible number of ways from $n$ observations.¹ (See the sketch right after this list.)
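To make the parallel concrete, here is a minimal sketch in Python (the fair-coin model and everything in it are my illustrative assumptions, not from the question): a microstate is a particular sequence of $n$ coin flips, a macrostate is the number of heads $k$, and the multiplicity of a macrostate is $\binom{n}{k}$, which peaks at $k = n/2$, i.e., at exactly $n$ times the expected value of a single flip.

```python
# Toy model (an assumption for illustration): microstate = one sequence of
# n fair coin flips, macrostate = number of heads k, multiplicity = C(n, k).
from math import comb
import random

n = 1000

# SLT side: the macrostate with the most microstates is k = n // 2.
max_k = max(range(n + 1), key=lambda k: comb(n, k))
print(max_k, n // 2)  # both 500: the modal macrostate sits at n * E[X]

# LLN side: the empirical average of n flips concentrates near E[X] = 0.5.
flips = [random.random() < 0.5 for _ in range(n)]
print(sum(flips) / n)  # close to 0.5 for large n
```

The same $n$ plays both roles here: more time steps for the physical system, more samples for the LLN.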




The way I see it, if you realize that "evolution" (the passage of time, a physical concept) is really just "having more opportunities for observations" (observing more samples, a mathematical concept), then these two statements are saying the same thing.


Or, to put it another way, the SLT is implied by the LLN. Or to be completely blunt:
It is mathematically impossible for the 2nd law of thermodynamics not to hold.
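To hedge the word "impossible" a little (this quantitative bound is a standard fact, Hoeffding's inequality, not something from the original question): for i.i.d. observations $X_1, \dots, X_n$ taking values in $[a, b]$ with mean $\mu$,

$$P\left( \left| \frac{1}{n} \sum_{i=1}^{n} X_i - \mu \right| \geq \epsilon \right) \leq 2 \exp\left( -\frac{2 n \epsilon^2}{(b - a)^2} \right),$$

so macroscopic deviations from the modal macrostate are not literally impossible, just exponentially unlikely in $n$.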



Based on this, my questions are:




  1. Am I correct in my reasoning/intuition here?




  2. If yes: Why is the 2nd law of thermodynamics even considered a physical law at all? Provable statements aren't physical laws (nobody calls the Pythagorean theorem a physical law), so what sense does it make to regard this as one?


    If no: Why not?







Edit:


I just remembered there's also the data-processing inequality (DPI), which states that if $X \to Y \to Z$ is a Markov chain (formally, $X$ and $Z$ are conditionally independent given $Y$; informally, "$X$ only influences $Z$ via the intermediate $Y$"), then $H(Z \vert Y) \leq H(Z \vert X)$: the remaining uncertainty about $Z$ is lower if we already know $Y$ than if we only know $X$. I feel like this might help with the above interpretation, but it's not entirely obvious to me exactly how.
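As a sanity check on that inequality, here is a small numerical sketch (the specific chain and transition probabilities are arbitrary choices of mine, purely for illustration): build the joint distribution $p(x, y, z) = p(x)\,p(y \vert x)\,p(z \vert y)$ and compare the two conditional entropies directly.

```python
# Numerical check of H(Z|Y) <= H(Z|X) for an (illustrative, assumed)
# binary Markov chain X -> Y -> Z.
import itertools
import math

p_x = [0.3, 0.7]                        # p(x)
p_y_given_x = [[0.9, 0.1], [0.2, 0.8]]  # p(y|x)
p_z_given_y = [[0.6, 0.4], [0.3, 0.7]]  # p(z|y)

# Joint distribution over (x, y, z), using the Markov factorization.
joint = {(x, y, z): p_x[x] * p_y_given_x[x][y] * p_z_given_y[y][z]
         for x, y, z in itertools.product(range(2), repeat=3)}

def cond_entropy(joint, target, given):
    """H(target | given) in bits, from a joint dict keyed by (x, y, z)."""
    marg = {}   # p(given)
    pair = {}   # p(given, target)
    for key, p in joint.items():
        g, t = key[given], key[target]
        marg[g] = marg.get(g, 0.0) + p
        pair[(g, t)] = pair.get((g, t), 0.0) + p
    return -sum(p * math.log2(p / marg[g])
                for (g, t), p in pair.items() if p > 0)

# Indices into the joint keys: 0 = X, 1 = Y, 2 = Z.
print(cond_entropy(joint, target=2, given=1))  # H(Z|Y), ~0.92 bits
print(cond_entropy(joint, target=2, given=0))  # H(Z|X), ~0.96 bits, larger
```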


¹ For what it's worth, the asymptotic equipartition property (AEP) in information theory is the analog of the LLN in that field. Given that it deals directly with entropy, just like the SLT, it might make more sense for those familiar with it to use the AEP as the comparison point; I'll focus on the LLN here since it's more accessible (including to myself!).
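For the curious, a tiny sketch of the AEP itself (the Bernoulli source and its parameter are my illustrative assumptions): for i.i.d. draws, $-\frac{1}{n} \log_2 p(X_1, \dots, X_n)$ converges to the entropy $H(X)$.

```python
# AEP for an (assumed) Bernoulli source: the per-symbol log-likelihood of a
# long i.i.d. sample converges to the source entropy H(X).
import math
import random

p = 0.2                                              # P(X = 1)
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)   # entropy, ~0.722 bits

n = 100_000
xs = [random.random() < p for _ in range(n)]
log_prob = sum(math.log2(p if x else 1 - p) for x in xs)

print(-log_prob / n)  # empirical per-symbol value, close to H
print(H)
```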



