I saw this curve on a website, but I didn't understand it. What is the lowest point of entropy, and what are the points A, B, and C? Thanks!
Answer
For reference, the diagram is below.
First, I'm just going to give a quick explanation of entropy. Before Boltzmann, people knew about entropy, they just didn't explain it correctly; namely, they thought of it as a measure of how useless an arrangement of gas was for doing work. As an example, they thought that if you had a box, and all of the gas molecules in the box were on one side of that box, that was low entropy, because you could extract useful work by letting the gas expand into the other side of the box. However, if the gas was spread uniformly throughout the box, that was high entropy, because there was no longer any way to extract useful work from it.
Boltzmann, on the other hand, thought (correctly) that entropy measures the number of ways a system could be rearranged without anyone noticing. To put this succinctly: let's say Bob and Joe have an apartment, and Joe comes to Bob, obviously upset, saying "We've been ransacked!" Bob thinks this is nonsense, and as evidence points to his room: "There's a couple of t-shirts on the floor, a couple of crushed soda cans, the sheets rumpled. Nothing's different!" But Joe says, "No, no, come to my room! See, the Shakespeare plays are out of alphabetical order, my musical collection is all messed up, and the bed sheets are unmade instead of made! Ransacked!"
Now, think about Bob's room. There are probably a ton of different ways you could rearrange it without Bob noticing, right? You could throw an extra shirt onto the ground, you could leave three shirts inside out instead of two, you could shift one a little bit to the left. You could move the sheets on the bed back a few centimeters. The number of different ways you could leave the room different without Bob noticing is a measure of its entropy. There are very few - maybe even zero - ways to rearrange Joe's room without him noticing, so his room is low entropy. But since there are many ways you could rearrange Bob's room so that it is different but he will not notice, it is high entropy.
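Boltzmann made this counting quantitative: the entropy is proportional to the logarithm of the number of indistinguishable microstates, S = k_B ln W. Here is a minimal Python sketch of that formula; the microstate counts for the two rooms are made-up numbers purely for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def boltzmann_entropy(W):
    """Entropy of a system with W equally likely microstates: S = k_B * ln(W)."""
    return K_B * math.log(W)

joes_room = 1       # essentially one acceptable arrangement -> zero entropy
bobs_room = 10**6   # made-up count of arrangements Bob can't tell apart

print(boltzmann_entropy(joes_room))  # 0.0
print(boltzmann_entropy(bobs_room))  # small but nonzero
```

Note the logarithm: a room with a million indistinguishable arrangements has only about fourteen times the entropy of a room with ten, not a hundred thousand times.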
To translate this to real life, let's say you have a tea kettle, and it's giving off a bunch of steam. Pass your hand through the steam molecules, and it still looks the same, right? The system is high entropy. Now, stack some wooden blocks. Knock them over. You notice, right? The system is low entropy. Going back to the box analogy, there are far fewer ways for the atoms of gas to arrange themselves on one side of the box than to spread out throughout the box. If the atoms are all on one side of the box, it is low entropy. If they are spread throughout, it is high entropy.
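You can make the box example concrete by counting arrangements. Assuming each molecule independently sits in either the left or the right half of the box, a short Python sketch (with a toy molecule count; a real box holds on the order of 10^23):

```python
from math import comb

N = 100  # toy number of molecules; a real gas has ~10**23

# Number of microstates with exactly k of the N molecules in the left half
# is comb(N, k), the binomial coefficient:
ways_all_left = comb(N, N)        # exactly 1 way to have all on the left
ways_half_half = comb(N, N // 2)  # enormously many 50/50 arrangements

print(ways_all_left)   # 1
print(ways_half_half)  # about 1.0e29

# Probability that random motion puts every molecule in the left half:
print(0.5 ** N)        # about 7.9e-31
```

Even with only 100 molecules, the all-on-one-side arrangement is outnumbered by the evenly spread ones by a factor of about 10^29; with 10^23 molecules the imbalance is beyond astronomical.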
With this new understanding of entropy, Boltzmann was able to derive the Second Law of Thermodynamics in a statistical sense. To explain this simply, there are far more ways for a system to be high entropy than low entropy, so it is no wonder systems naturally increase in entropy but do not naturally decrease in entropy. This led to a rather unexpected consequence - namely, his definition explained why entropy tended to increase, but not why entropy was so low in the first place. This was now a problem for cosmologists - why did the early universe have such low entropy? Now, Boltzmann proposed a solution to this problem, but first, it is important to point out that Boltzmann's definition of entropy only holds statistically. Going back, once again, to the box, it is not certain that the molecules of gas will spread out through the box. There is a very low probability that random motions of the molecules of gas will bring them all to one side of the box (so low that you would expect to wait far longer than the age of the observable universe before seeing it happen).
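To see the statistical Second Law in action, here is a small simulation of the Ehrenfest urn model, a standard toy model of the H-curve (a stand-in, not Boltzmann's original calculation): at each step one randomly chosen molecule hops to the other half of the box, and the entropy, taken here as the log of the number of microstates, climbs toward its maximum and then merely fluctuates around it:

```python
import math
import random

random.seed(0)  # reproducible toy run

N = 200   # molecules in the box (toy number)
left = N  # start in a very low-entropy state: every molecule on the left

def entropy(k):
    """Log of the number of microstates with k of the N molecules on the left."""
    return math.log(math.comb(N, k))

# Ehrenfest urn model: each step, pick one molecule uniformly at random and
# move it to the other half. A left-side molecule is picked with chance left/N.
history = [entropy(left)]
for _ in range(2000):
    if random.random() < left / N:  # the chosen molecule was on the left
        left -= 1
    else:
        left += 1
    history.append(entropy(left))

print(history[0])    # 0.0 -- only one microstate has every molecule on the left
print(max(history))  # close to log(comb(N, N//2)), the equilibrium value
```

Plot `history` and you get exactly the shape of Boltzmann's curve: a rapid rise to equilibrium followed by small random wiggles, with large dips possible in principle but fantastically rare.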
Boltzmann decided to solve the puzzle of the early universe's low entropy by taking advantage of this statistics-only limitation. Instead of the box of gas, think of the entire universe (in a box, if you like). Imagine that it is in thermal equilibrium - in other words, the highest state of entropy possible. Now, since it's the highest state of entropy possible, the entropy cannot possibly increase, so it'll stay steady...except for fluctuations. We can calculate how likely fluctuations are. As you might expect, larger fluctuations are exponentially less likely than small ones, but given enough time, every type of fluctuation will eventually happen. In other words, maybe our universe is in a state of fluctuation away from its normal equilibrium. The low entropy of the early universe, according to this idea, is a "statistical accident". Okay, now for the graph: we think that we live at either point A or point B. A and B are utterly indistinguishable - people living in A consider the direction to the left the "past" and people living in B consider the direction to the right the "past".
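The exponential suppression of large fluctuations is easy to check in the toy box: assuming each of N molecules independently lands in the left or right half, the probability that the left-half count strays at least d molecules from the 50/50 equilibrium falls off roughly like exp(-2d^2/N). A small Python sketch:

```python
from math import comb

N = 100  # toy number of molecules

def prob_left_count(k):
    """Probability that exactly k of the N molecules are in the left half."""
    return comb(N, k) / 2 ** N

def prob_fluctuation(d):
    """Probability of being at least d molecules away from the 50/50 split."""
    return sum(prob_left_count(k) for k in range(N + 1)
               if abs(k - N // 2) >= d)

for d in (5, 15, 30):
    print(d, prob_fluctuation(d))  # bigger d -> exponentially smaller probability
```

Small wiggles around equilibrium happen constantly; a fluctuation deep enough to look like a low-entropy early universe is suppressed by an exponent proportional to its size squared, which is why Boltzmann needed essentially unlimited time for his proposal to work.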
Okay, now it gets a little more complicated. After this, Boltzmann used anthropic reasoning to explain why we're in the fluctuation regions as opposed to the vast, vast majority of the universe's time, which is spent in a state of thermal equilibrium (anthropic reasoning is based on the anthropic principle, which you can learn more about here). The problem with this is that you might as well say, "That's just the way it is." Instead, we have to think: okay, if this is true, what experimental results should we see? What properties should we expect to measure? People have done this, and, well, let's just say there are problems.
The most basic problem (and the one that's relevant to the diagram) is called the Boltzmann Brain problem. So, the fluctuations that we are talking about, the low entropy fluctuations, are very rare (the lower the entropy goes, the rarer they are). Points like C on the diagram are much more common than points like A or B. So if we find ourselves explaining the low entropy of the early universe with the anthropic principle, we should expect to be in the smallest-entropy fluctuation that allows for our existence. That minimum fluctuation is a Boltzmann Brain...i.e., a fluctuation that produces just a conscious brain with enough sensory inputs to look around and recognize that it exists before going out of existence. These fluctuations are rare, but they are much, much less rare than the type of fluctuation we are in.
So...why are we in the type of fluctuation we are in? We have no idea. Honestly, none.
Note: I'm including this paragraph for completeness. Feel free to skip, as it is not related to your graph. Okay, so Josiah Willard Gibbs (1839-1903; Boltzmann lived from 1844 to 1906) developed a different formulation of entropy. You can learn more about his ideas here and here. I'll be expanding this section soon with an explanation.
Now, for your final question. The low point of entropy between A and B is, as far as I can tell, supposed to be the Big Bang (or the "beginning of our universe" equivalent). As per my comment, no one is really sure whether or not there was time before the Big Bang - it depends on who you ask. In the graph, Boltzmann was kind of assuming our universe is in a "mother universe" but obviously, since there are problems with his idea, scientists aren't anywhere near sure. Some scientists invoke multiverses and some of them existing before ours to explain the fine-tuning of our universe, but there isn't any evidence for multiverses. In our universe, at least, it is impossible for time to exist before the Big Bang...there might have been something, just not time.
Hope this helps!
Resources
The graph first appeared in a paper by Boltzmann that I am trying to find. He did write several papers about (or with mention of) his "H-curve". I have found three online, for free. One is called Ueber die sogenannte H-Curve, or On the So-Called H-Curve. The copy of the paper I have linked to is in German; I could not find an English translation. For reference, this paper was written in 1897. The second is called On Certain Questions of the Theory of Gases, and the section relevant to the H-curve is on the second page of the pdf, in the second column, in the paragraph where the first sentence reads "Let us now take a given rigid vessel..."; the pdf is here.
The third is called On Zermelo's Paper "On the Mechanical Explanation of Irreversible Processes". This one is in English; however, the section on his H-curve is very limited. It starts on the seventh page of the pdf, below Figure 1 in the Appendix. Here is the pdf. This paper was a response to a paper by Ernst Zermelo called On the Mechanical Explanation of Irreversible Processes; this paper can be read here. There were two more papers between Boltzmann and Zermelo in 1896 that I can't find for free online.
You can find more information on this website. There is also a very interesting book called Ludwig Boltzmann: The Man Who Trusted Atoms by Carlo Cercignani, which I found a pdf of here. The book has many quotations from Boltzmann's writings. The relevant section is on page 148 (the beginning of Section 6.4, The So-Called H-Curve). A very interesting summary of the Zermelo-Boltzmann papers with extensive quotations is in the Ernst Zermelo - Collected Works, Vol. 2; the link to the Google book sample is available here. The relevant section starts on page 203; it is called "The Zermelo-Boltzmann Controversy". A paper that may be of use (as it is an overview of Boltzmann's work) called Rereading Ludwig Boltzmann can be found here. Last paper: Boltzmann's H-theorem, its limitations, and the birth of (fully) statistical mechanics by Harvey Brown and Wayne Myrvold can be found here.