Comments on "What is Information? (Part 4: Information!)" (Spherical Harmonics, a blog by Chris Adami)

The trivial explanation would be Liouville's theorem in classical mechanics, or unitarity in quantum mechanics. In short, the process must be reversible. The fact that no two microstates evolve to the same microstate means that the system cannot do any concentrating for you; you have to do it yourself.

Jacob Thoennes (2014-06-09)

The reason I gave the example of the absorbing Markov chain is to illustrate that the statement "if you don't measure anything, then all that can happen to you as a system equilibrates is that you lose information" is not trivial. It holds for the kinds of processes one studies in thermodynamics (ergodic, obeying detailed balance), but not for general Markov processes, and it is not trivial to explain why. Or is it?

Gustav Delius (2014-04-23)

Hi Gustav,

the entropy of a Markov process (certainly a well-defined thing, which I have also written about) is not the same thing as the thermodynamic entropy of a system. In fact, the 2nd law has no chance of governing the entropy of a Markov chain, because that entropy depends (as you correctly point out) on the dynamics of the process, and with an absorbing boundary condition it could even vanish. A thermodynamic system is different: in a sense, the transition rules of the Markov process describing the physical system are very special. The reason why the 2nd law is trivial when written in terms of a conditional entropy is that if you don't measure anything, then all that can happen to you as a system equilibrates is that you lose information. And that makes the conditional entropy go up. As I said, I should write an entire post about all that. I wrote about it a little bit in http://arxiv.org/abs/1112.1941

Chris Adami (2014-04-22)

Furthermore, I find it confusing that you say that the unconditional entropy must stay constant. The unconditional entropy H(X(t)) of a stochastic process X(t) does not usually stay constant. It also appears that you use the term "conditional entropy" differently from others, in that you call the entropy H(X) of any random variable X "conditional" unless X is uniformly distributed. Is that correct?
Do you not find that confusing?

Gustav Delius (2014-04-22)

Chris, why do you say that it is a triviality that the conditional entropy increases? As the example of the absorbing Markov chain seems to show, in some systems the conditional entropy decreases with time. The conditional entropy I am thinking of here is H(X(t)|X(0)), where X(t) describes the state of the chain at time t. If it is an absorbing chain with a single absorbing state, the probability distribution of X(t) becomes more and more concentrated on that one state as time progresses, and hence the entropy decreases with time.

Gustav Delius (2014-04-22)

OK, I see that I may have to devote an entire blog post to the relationship between thermodynamic and Shannon entropy, as I get asked about this a lot. So, in anticipation of that post, here's the short version. The quantity that increases when non-equilibrium systems "equilibrate themselves" is a conditional entropy. The unconditional entropy must stay constant. That is why the 2nd law does not exist. And the fact that the conditional entropy increases (in just the way that Boltzmann observed) is just a triviality, whereas in standard thermodynamics it was a mystery. Well, it was a mystery because it was wrong.

Chris Adami (2014-04-21)
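The absorbing-chain behaviour debated in this exchange is easy to check numerically. A minimal sketch (the particular three-state chain below is an illustrative choice, not taken from the post or the thread): the Shannon entropy of the state distribution shrinks toward zero as probability drains into the absorbing state.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a probability vector, in bits."""
    p = p[p > 0]                     # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

# Column-stochastic transition matrix with state 0 absorbing:
# T[i, j] = P(next state = i | current state = j).
T = np.array([[1.0, 0.3, 0.1],
              [0.0, 0.5, 0.3],
              [0.0, 0.2, 0.6]])

p = np.array([1.0, 1.0, 1.0]) / 3    # start maximally uncertain
entropies = []
for _ in range(50):
    entropies.append(shannon_entropy(p))
    p = T @ p                        # one step of the chain

# Probability piles up on the absorbing state, so H(X(t)) falls
# monotonically toward zero -- no measurement required.
assert all(a > b for a, b in zip(entropies, entropies[1:]))
```

Replacing T with an ergodic, detailed-balance chain makes the same loop drive the entropy up toward its maximum of log2(3) instead, which is exactly the class of processes the thermodynamic argument assumes.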

Chris, I would also be interested in hearing more about what you mean when you say that the second law does not exist. It is a fact, is it not, that whether entropy increases or decreases depends on the particular dynamics that govern a system. For example, in a system described by an absorbing Markov chain with an absorbing state, entropy decreases with time as the probability becomes more and more concentrated in that state. The second law of thermodynamics states that nature is not described by such a system, but by one in which entropy increases. In what sense does that law not exist?

Gustav Delius (2014-04-21)

Is this to say that entropy in the context of information is a different beast from entropy in the context of thermodynamics? They seem extremely analogous, but strain to meet at the ends.

Also, I'm curious if you could expand on this:

"The 2nd law is one of the silliest 'laws' of all of physics. It does not exist. It is a consequence of not understanding information theory."

See, I have my thermodynamics exam coming up, and this sort of shakes my ground a bit.

Thank you

Unknown (2014-04-18)

Chris, it is nice to hear someone say that so bluntly.

Gustav Delius (2013-07-04)

The way you describe how most physicists think about the 2nd law is, I think, precisely right. They DO think that something happens, when in fact they just lose information. You can push this to the extreme, by the way, if you believe that the universe has a wavefunction: then you replace the 2nd law by the statement that the entropy of the universe is constant (and zero) for all time, but that the local (apparent) entropy increases, mostly due to the fact that the universe expands. The 2nd law is one of the silliest "laws" in all of physics. It does not exist. It is a consequence of not understanding information theory. Can't blame Boltzmann, though. But Feynman should have figured it out. He knew Shannon, but I don't think he read his work.

Chris Adami (2013-07-03)

I second that motion.

Gustav Delius (2013-07-02)

PS: I move (in the parliamentary sense) that we just abolish the use of the term "entropy" outside the realm of thermodynamics (as in S).
"Uncertainty" seem to cause a lot less confusion.Travchttps://www.blogger.com/profile/12790548845692414891noreply@blogger.comtag:blogger.com,1999:blog-1938680703532237797.post-86188222915715645132013-07-02T17:37:54.668-07:002013-07-02T17:37:54.668-07:00I think you flipped eq (1) and (2) in the sentence...I think you flipped eq (1) and (2) in the sentence: "(1) is information, but I may not be aware how I got into possession of that information. (2) tells me exactly the source of my information: the variable Y."<br /><br />As for the distinction... I sort-of disagree. I don't think (2) is really all that necessary, though of course I could be wrong. What I assert is that:<br />*The information content of a system is not defined except with respect to some other system.*<br /><br />Essentially, I'm happy just avoiding the need for a definition of non-conditional information. Yeah, it is logically there... It is just the information of X conditional with everything else you know about in the universe, but that doesn't seem particularly useful to me.Travchttps://www.blogger.com/profile/12790548845692414891noreply@blogger.comtag:blogger.com,1999:blog-1938680703532237797.post-18277450253227682762013-06-30T02:58:31.789-07:002013-06-30T02:58:31.789-07:00Your concept of entropy is natural to a probabilit...Your concept of entropy is natural to a probability theorist, but not so much to most physicist, I would guess. Most physicists appear to assign real physical significance to the concept of entropy. When a physicist thinks about the second law of thermodynamics, which states that entropy in a closed system increases with time, she does not just think that this is a restatement of the fact that we loose information about the state of the system as time goes on, but rather she thinks that this says something about the actual physical system itself. 
For example, physicists talk about the "heat death" of the universe: the idea that once the universe reaches maximum entropy, no physical processes can take place any more. Am I right that you would say that this idea is misguided? That the fact that we have lost all information about a system does not mean that the system itself has changed its behaviour in any way? That the fact that we, due to our lack of information, can no longer extract work from the system does not imply that the system itself dies?

Gustav Delius (2013-06-30)
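The claim that equilibration, absent measurement, can only lose information can also be illustrated numerically. In the sketch below (again with a made-up symmetric chain, so detailed balance holds), the conditional entropy H(X(t)|X(0)) only rises, the marginal entropy H(X(t)) stays constant, and therefore the mutual information I(X(0);X(t)) = H(X(t)) - H(X(t)|X(0)), the information the initial state still carries about the current one, only decays.

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a probability vector, in bits."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Symmetric transition matrix: detailed balance holds and the uniform
# distribution is stationary (an "equilibrating" system).
T = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

p0 = np.ones(3) / 3              # X(0) uniform
cond_H, uncond_H = [], []
Tt = np.eye(3)                   # Tt[i, j] = P(X(t) = i | X(0) = j)
for _ in range(30):
    # H(X(t)|X(0)): column entropies of T^t, averaged over X(0)
    cond_H.append(sum(p0[j] * entropy(Tt[:, j]) for j in range(3)))
    # H(X(t)): entropy of the marginal distribution at time t
    uncond_H.append(entropy(Tt @ p0))
    Tt = T @ Tt

mutual_info = [u - c for u, c in zip(uncond_H, cond_H)]
# Conditional entropy rises, marginal entropy stays at log2(3),
# so the information about the initial state can only decay.
assert all(a <= b + 1e-12 for a, b in zip(cond_H, cond_H[1:]))
assert all(abs(u - np.log2(3)) < 1e-12 for u in uncond_H)
assert all(a >= b - 1e-12 for a, b in zip(mutual_info, mutual_info[1:]))
```

This is the trivial-in-hindsight reading of the 2nd law discussed above: nothing "happens" to the marginal distribution of an equilibrated system; only the correlation with what was once known about it fades.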