"The question you are asking appears to be subjective, and is likely to be closed." Challenge . . . ACCEPTED. Okay, here it is.
A friend prone to uplifting aphorisms posted on Facebook: "You Are An Infinite Possibility." I thought about that, and it seems to me that, given the Bekenstein bound, one can calculate the maximum number of states the human brain can in principle occupy (given an average volume). Lo and behold, Wikipedia revealed that someone has actually done the calculation and gives the number as less than approximately $10^{7.8\times 10^{41}}$ states.
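For the curious, here is a minimal sketch of where that figure comes from: the Bekenstein bound $I \le 2\pi R E/(\hbar c \ln 2)$, fed with an assumed brain mass of about 1.5 kg and volume of about 1260 cm³ (roughly the inputs used in the Wikipedia calculation).

```python
import math

# Bekenstein bound: I <= 2*pi*R*E / (hbar * c * ln 2) bits,
# for a system of radius R and total energy E.
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

# Assumed inputs (roughly the figures behind the Wikipedia number):
mass = 1.5              # brain mass, kg
volume = 1260e-6        # brain volume, m^3 (1260 cm^3)

radius = (3 * volume / (4 * math.pi)) ** (1 / 3)  # radius of an equivalent sphere
energy = mass * c**2                              # rest-mass energy, E = m c^2

bits = 2 * math.pi * radius * energy / (hbar * c * math.log(2))
# Number of states = 2^bits; report its base-10 exponent, bits * log10(2).
exponent = bits * math.log10(2)
print(f"I <= {bits:.2e} bits, i.e. at most ~10^({exponent:.2e}) states")
```

Running this gives roughly $2.6\times 10^{42}$ bits, whose exponent works out to the quoted $10^{7.8\times 10^{41}}$ states.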
On the other hand, if you model the brain as a quantum system and calculate its evolution over time with the Schrödinger equation, then, since the wavefunction is continuous, doesn't it in principle describe an infinite number of possible states?
Answer
This is a frequently encountered 'booby trap' in information theory, but it turns out that having a continuous degree of freedom does not entail free access to an infinite amount of information.
First of all, while the wavefunction is continuous, you can discretize it quite easily. We know every nucleus and electron in your brain is confined within a box, say, 20 cm on a side, so we can describe it in that basis. You still have an infinite number of eigenstates, but it is now a discrete set. This shift from an uncountable to a countable infinity is possible because physically accessible wavefunctions must be continuous and smooth, and there are not actually that many of those.
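To make the discretization concrete, take a toy version: a single particle of mass $m$ in a one-dimensional box of width $L$ (standing in for the 20 cm confinement above) has eigenstates labelled by a discrete integer,

$$\psi_n(x) = \sqrt{\frac{2}{L}}\,\sin\!\left(\frac{n\pi x}{L}\right), \qquad E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad n = 1, 2, 3, \dots$$

Infinitely many states, but countably many, and any smooth confined wavefunction is a superposition of them.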
So far, we still have an infinite number of eigenstates in our description. However, we know that the energy content of the human brain is bounded, which means that beyond a certain point all the energy eigenstates must have negligible contribution. Indeed, if a sizable fraction of the electrons in our brain had energies above, say, 1 GeV, we would instantly come apart in a blaze of gamma rays and positrons and whatnot. Informationally, this means that you can provide an approximation to the wavefunction that is good for all practical purposes using only a finite number of states. An experiment that could distinguish states beyond that level of approximation would need so much energy it would incinerate your brain.
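To get a feel for the numbers, here is a deliberately crude sketch: a single electron in the 20 cm box with a 1 GeV energy cap, ignoring interactions and relativistic corrections. The box modes below the cap fill roughly one octant of a sphere in $(n_x, n_y, n_z)$ space, so their count is finite:

```python
import math

hbar = 1.054571817e-34         # reduced Planck constant, J*s
m_e = 9.1093837015e-31         # electron mass, kg
L = 0.20                       # box side, m (the 20 cm box above)
E_cut = 1e9 * 1.602176634e-19  # 1 GeV cutoff, in joules

# 1D box energy scale: E_n = n^2 * pi^2 * hbar^2 / (2 m L^2)
E_1 = math.pi**2 * hbar**2 / (2 * m_e * L**2)

# Largest quantum number with E_n <= E_cut along one axis:
n_max = math.sqrt(E_cut / E_1)

# 3D box states below the cutoff fill one octant of a sphere of radius
# n_max in (nx, ny, nz) space, so their count is ~ (1/8)(4/3) pi n_max^3.
n_states = (math.pi / 6) * n_max**3

print(f"E_1 ~ {E_1:.2e} J, n_max ~ {n_max:.2e}")
print(f"finite state count below 1 GeV: ~{n_states:.2e}")
```

This comes out around $10^{38}$ states: astronomically large, but finite. The Bekenstein count quoted in the question plays the same game for the whole system at once.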
This 'paradox' is also present in classical information theory, and it comes about when you ask about the information capabilities of analog computing. Here the state of a system is encoded in, say, a voltage, and naively you have infinite information there because you can in principle measure as many digits as you want. However, it turns out that the scalings tend to be unfavourable, and noise kills you quickly; the upshot is that in analog computing you need to be very careful about what precision and tolerance you demand of your noise and measuring apparatus when counting information capabilities. As it happens, high precision tends to be harder to achieve than simply having more coupled systems, with one bit per system.
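A minimal toy comparison, with illustrative numbers only: extracting $b$ bits from a single analog voltage means resolving its range into $2^b$ levels, so the tolerable noise shrinks exponentially in $b$, while $b$ separate one-bit systems each only ever need to tell two levels apart.

```python
# Toy comparison (illustrative numbers only): one analog signal vs.
# many one-bit signals, both working with a 1 V range.
V_range = 1.0  # volts

for bits in (4, 8, 16, 32):
    # Single analog value: 2^bits distinguishable levels, so the noise
    # must stay below one level spacing. Exponentially harder as bits grow.
    analog_noise_budget = V_range / 2**bits
    # 'bits' separate one-bit systems: each only distinguishes two levels,
    # so the per-system noise budget is a constant half range.
    digital_noise_budget = V_range / 2
    print(f"{bits:2d} bits: analog needs noise < {analog_noise_budget:.2e} V, "
          f"{bits} one-bit systems each tolerate < {digital_noise_budget:.1f} V")
```

That exponential shrinkage on the analog side is why counting 'infinitely many digits' in a voltage is an accounting error once noise enters the picture.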