Comments on Spherical Harmonics: "What is Information? (Part I: The Eye of the Beholder)" (blog by Chris Adami)

Jürgen De Blonde (2014-04-15):

One year later. Recently, while listening to a friend talk about entropy and information and black holes and all, I couldn't help deducing that, indeed, entropy is in the eye of the beholder. However, not being a mathematician or a physicist, merely a layman with a certain interest in such matters, I considered this to be mental gibberish. But it stuck with me, popping up in my mind every so often, so I decided to look into the statement. That is how I ended up here.

Thank you for clarifying this.

Frank Hecker (2013-06-09):

Chris, thanks for your response. I take your point to be that if we consider the uncertainty to be a logarithmic function of the number of states, then it follows naturally that if we have two systems S1 and S2, our uncertainty regarding the joint system of S1 and S2 is the sum of our uncertainties regarding the individual systems.
More formally, if our uncertainty regarding a system S is log(N), where N is the number of states of S, and S1 has N1 states and S2 has N2 states with uncertainties log(N1) and log(N2) respectively, then the joint system S3 has N1 x N2 states, so our uncertainty regarding S3 is log(N1 x N2) = log(N1) + log(N2).

I take Travc's comment to be coming from the opposite direction: if we find it natural that our uncertainty regarding a system is a function of the number of possible states of the system, and that the uncertainty regarding a joint system is the sum of our uncertainties regarding the individual subsystems making it up, then it is natural to define the uncertainty as the logarithm of the number of states.

More formally, we are trying to find an expression H(N) for our uncertainty regarding a system with N states. It's natural to conclude that H is a monotonically increasing function of N, with H(N1) > H(N2) if N1 > N2 (in other words, the more states, the more uncertainty). It's also natural to conclude that H(1) = 0 (if a system can have only one state as measured by us, our uncertainty regarding it is zero).

Based on the example of the coin, for which we can measure either heads/tails or the orientation in one of four directions, it's also natural to conclude that H(8) = H(2) + H(4): we can treat this either as a measurement of a single system with 8 states, or as a measurement of a composite system consisting of two subsystems with 2 and 4 states respectively, with our overall uncertainty the same in either case and the uncertainty regarding the composite system being the sum of our uncertainties regarding the subsystems.
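The coin arithmetic in this example is easy to check directly; here is a minimal Python sketch (the helper name H and the choice of base 2 are illustrative assumptions, not something from the original post):

```python
import math

def H(n: int) -> float:
    # Uncertainty, in bits, of a system with n equally likely states
    return math.log2(n)

# One measurement of a single 8-state system vs. a composite
# measurement of a 2-state and a 4-state subsystem:
# H(8) = H(2) + H(4), i.e. 3 bits = 1 bit + 2 bits
assert H(8) == H(2) + H(4)
print(H(8), H(2), H(4))  # 3.0 1.0 2.0
```

Because 2, 4, and 8 are exact powers of two, the base-2 logarithms come out as exact floats and the equality holds without any rounding tolerance.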
Generalizing this argument, we'd conclude that H(N1 x N2) = H(N1) + H(N2) for all N1, N2.

Taking this last condition together with the other conditions on H, we'd conclude that H(N) = log_b(N) for some base b. Since the argument didn't force a choice of base, the actual base can be chosen as whatever value we consider most convenient, for example b = 2.

So the bottom line is that I think I'm unconfused now :-)

Travc (2013-05-29):

The "our uncertainty adds" bit seemed a bit of a jump (or assertion) to me too. It might be better to say that we "really want uncertainties to add (and subtract)". The log formulation leads to easier and more intuitive equations later on, but I think everything could actually be reformulated without the logs.

Maybe using your coin side-plus-orientation example would help:

If we care about (and measure) the side and orientation of a single coin, it can be in 2 x 4 states. OK, N1 = 8.

But what if we measure the heads/tails of one coin and the orientation of a second coin? Then N2 = 2 and N3 = 4.

If we want the uncertainty in the first case (H1) to equal the sum of the uncertainties in the second case (H2 + H3), then we just define H = log(N). So:

H1 = H2 + H3
log(8) = log(2) + log(4)

Taking logs base 2, that is just 3 = 1 + 2.

Chris Adami (2013-05-26):

Dear Frank,
You are quite right, it isn't trivially obvious. But think of it this way. Say I have two systems with N states each. Then the total number of states that the combined system can be in is N squared: N states of the second for each of the N states of the first. The uncertainty of the combined system is the logarithm of the total number of states, N squared. Since the log of N squared is twice the log of N, you can see that in this case of two variables with the same uncertainty, the uncertainty of the joint system is the sum of the uncertainty of each.

Frank Hecker (2013-04-27):

Thanks for doing this series. I'm an ex-physics and math major, and I never fully acquired a good understanding of information vs. entropy; I'm glad to see you trying to explain this in terms comprehensible to educated laypersons.

I do have one quibble, though. You write: "But our uncertainty about the joint system is not N_1 x N_2. Our uncertainty adds, it does not multiply." Maybe it's just me being dense, but this doesn't strike me as a trivially obvious statement (though it does seem plausible if I think about it a bit). Clearly, if one does accept it, then taking the logarithm of the number of states follows naturally (at least for anyone with basic mathematical knowledge). But it would have been nice to have a little more explanation of why uncertainty (in the sense you've defined it) is additive in the first place.
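The additivity discussed throughout this thread holds for any logarithm base, not just base 2. A small Python sketch checking H(N1 x N2) = H(N1) + H(N2) over random state counts (the function H and the sampled ranges are assumptions made for illustration):

```python
import math
import random

def H(n: int, base: float = 2.0) -> float:
    # Uncertainty of a system with n equally likely states,
    # in units determined by the chosen log base
    return math.log(n, base)

# Additivity over independent subsystems: H(n1 * n2) == H(n1) + H(n2),
# up to floating-point rounding, regardless of base
random.seed(0)
for base in (2.0, math.e, 10.0):
    for _ in range(100):
        n1 = random.randint(1, 10**6)
        n2 = random.randint(1, 10**6)
        assert math.isclose(H(n1 * n2, base), H(n1, base) + H(n2, base))
print("additivity holds for all sampled cases")
```

Note the comparison uses math.isclose rather than exact equality, since log(n1 * n2) and log(n1) + log(n2) can differ in the last floating-point bits for state counts that are not powers of the base.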