Wednesday, 5 July 2017

soft question - Physical structures that trap information


I labeled this question "soft" because it might not make any sense.


Anyway, what I'm wondering is if there's a notion in physics that deals with the ability of matter to trap information.


For instance, if I get in the way of a red light and see it, then my eye and my brain have taken the information about the wavelength of that light and physically persisted it.


This seems like it might be a fundamental characteristic of structures that lead to life as we know it.


Is there any current branch of physics that gets into a topic like this or am I getting into philosophy here?



Answer




There are a few ideas in physics that I can think of that may interest you.


In the field of black hole thermodynamics there is the Bekenstein bound (see the Wikipedia page of this name), which is the maximum amount of information that can be encoded in a region of space of radius $R$ containing mass-energy $E$; it is:


$$I\leq \frac{2\,\pi\,R\,E}{\hbar\,c\,\log 2}$$


where $I$ is the number of bits contained in the quantum states of that region of space. Bekenstein derived this bound through a thought experiment in which he imagined lowering objects into black holes, and then deduced the bound by assuming that the second law of thermodynamics holds. It works out to about $10^{42}$ bits to specify the full quantum state of an average-sized human brain. Compare this with estimates of the Earth's total computer storage capacity, variously reckoned to be of the order of $10^{23}$ bits (see the Wikipedia "Zettabyte" page, for instance) as of writing (2013).
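As a rough numerical check on that $10^{42}$ figure, here is a back-of-the-envelope sketch in Python; the brain mass (about 1.4 kg) and enclosing radius (about 0.1 m) are my own illustrative round numbers, not part of the bound itself:

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

def bekenstein_bound_bits(radius_m, mass_kg):
    """Maximum information (in bits) encodable in a sphere of the given
    radius containing the given mass-energy, per the Bekenstein bound
    I <= 2*pi*R*E / (hbar * c * ln 2)."""
    energy = mass_kg * c**2  # mass-energy E = m c^2
    return 2 * math.pi * radius_m * energy / (hbar * c * math.log(2))

# Illustrative brain-like figures: ~1.4 kg inside a ~0.1 m radius sphere
print(f"{bekenstein_bound_bits(0.1, 1.4):.2e} bits")  # ~4e42 bits
```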


Another thermodynamic idea: there is a way to account for the Maxwell Daemon and the Szilard Engine that makes these two thought experiments comply with the second law of thermodynamics in the long term, and that is through Landauer's Principle: the idea that merging two computational paths, or erasing one bit of information, always costs useful work, in the amount $k_B\,T\,\log 2$, where $k_B$ is Boltzmann's constant and $T$ is the temperature of the system doing the computation.
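To get a feel for the size of this cost, here is a one-line computation of the Landauer limit; the choice of 300 K (roughly room temperature) is mine:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_cost_joules(temperature_k):
    """Minimum work dissipated to erase one bit at temperature T."""
    return k_B * temperature_k * math.log(2)

print(f"{landauer_cost_joules(300):.3e} J per bit")  # ~2.87e-21 J
```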


This argument was finalised by Charles Bennett and is directly related to the limited ability of matter, or rather "physical systems", to "trap" or "soak up" information, as you say. An excellent reference here is Charles Bennett, "The Thermodynamics of Computation: A Review", Int. J. Theor. Phys. 21(12), pp. 905-940, 1982. Here's how it works.


Bennett invented perfectly reversible mechanical gates ("billiard ball computers") whose state can be polled without the expenditure of energy. He then used such gates to study the Szilard Engine in thought experiments, showing that Landauer's Limit arises not from the cost of finding out a system's state (as Szilard had originally assumed) but from the need to continually "forget" former states of the engine.
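The conservative logic of the billiard-ball computer is usually formalised with the Fredkin (controlled-swap) gate. Here is a minimal sketch, in Python rather than billiard balls, of what "reversible" means here; the closing contrast with AND is my own illustration:

```python
def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: if the control bit c is 1, swap
    a and b; otherwise pass them through unchanged. It is reversible:
    applying it twice recovers the inputs, so no information is ever
    destroyed and (by Landauer) no minimum work need be dissipated."""
    return (c, b, a) if c == 1 else (c, a, b)

# Reversibility check: the gate is its own inverse on every input.
for bits in [(c, a, b) for c in (0, 1) for a in (0, 1) for b in (0, 1)]:
    assert fredkin(*fredkin(*bits)) == bits

# By contrast, an AND gate maps two bits to one: it merges computational
# paths ((0,0), (0,1) and (1,0) all map to 0) and so must erase a bit.
```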


Probing this idea more carefully, as Bennett's paper also does: one can indeed conceive of simple, non-biological finite state machines that realise the Maxwell Daemon (this has been done in the laboratory; see the reference at the end). As the Daemon converts heat to work, it must record a sequence of bits describing which side of the Daemon's door (or of the engine's piston, for the equivalent discussion of the Szilard Engine) the molecules were on. For a machine with finite memory, that memory must eventually be erased so that the machine can keep working.

However, "information" ultimately is not abstract: it needs to be "written in some kind of ink", you might say, and that ink is the states of physical systems. The fundamental laws of physics are reversible, so one can in principle compute any former state of a system from full knowledge of any future state; nothing gets lost. So if the finite state machine's memory is erased, the information that was encoded in that memory must show up, recorded somehow, as changes in the states of the physical systems making up and surrounding the memory.

Those physical states now behave like a memory themselves: eventually they can encode no more information, and the increased thermodynamic entropy of that physical system must be thrown out of the system, with the work expenditure required by the second law, before the Daemon can keep working. The need for this work is begotten of the need to erase information, and is the ultimate justification for Landauer's principle.
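The following toy bookkeeping sketch (my own illustration, with an idealised engine and an arbitrarily chosen memory size) makes the argument concrete: extracted work grows with each cycle only until the memory fills, after which the Landauer cost of forgetting cancels every further gain:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def szilard_net_work(cycles, temperature_k, memory_bits):
    """Toy ledger for an idealised Szilard engine with finite memory.
    Each cycle extracts k_B*T*ln(2) of work from heat but records one
    bit; once the memory is full, every further cycle requires erasing
    a bit at the Landauer cost, cancelling the gain."""
    bit_energy = k_B * temperature_k * math.log(2)
    extracted = cycles * bit_energy            # work gained from heat
    erasures = max(0, cycles - memory_bits)    # bits "forgotten" to go on
    return extracted - erasures * bit_energy   # net work over the run

# With a 100-bit memory the net work saturates at 100 bits' worth,
# however long the engine runs: the second law holds in the long term.
for n in (50, 100, 1000):
    print(n, "cycles:", f"{szilard_net_work(n, 300.0, 100):.3e} J")
```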


Lastly, classical thermodynamics itself relates the information-soaking capacity of materials to heat capacities. The experimentally measured entropy (or, often, that inferred from macroscopic measurements) $S_{exp}$ of a system is the Boltzmann entropy, and this can be shown to overbound the actual Shannon information needed to specify the classical state of the system. This latter information is related to the Gibbs entropy, as discussed in E. T. Jaynes, "Gibbs vs Boltzmann Entropies", Am. J. Phys. 33(5), pp. 391-398, 1965; the Gibbs and Boltzmann entropies are equal when the state constituents are statistically uncorrelated. So the Boltzmann entropy $S_{exp}$ can be thought of as an information capacity; to map it from a quantity in joules per kelvin to bits we compute $\frac{S_{exp}}{k_B\,\log 2}$.

The addition of everyday quantities of heat to a system corresponds to very big information additions: adding one kilojoule of heat to a system at $300\,\mathrm{K}$ (roughly room temperature) raises its entropy by $\Delta S = Q/T \approx 3.3\,\mathrm{J\,K^{-1}}$, which corresponds to "complexifying" its state by $3.5\times10^{23}$ bits, roughly equal to our estimate of the Earth's computer systems' total storage capacity in 2013. Another example: the standard molar entropy of water (one mole, 18 grams) at atmospheric pressure is $70\,\mathrm{J\,K^{-1}}$, or $7.4\times10^{24}$ bits. The classical microstate of all the molecules in this water would need up to seventy times our 2013 estimate of the Earth's total computer storage capacity to specify completely.
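Here is the same arithmetic as a short Python sketch, reproducing the two figures quoted above (the function name is my own):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_to_bits(entropy_j_per_k):
    """Convert thermodynamic entropy (J/K) to an information capacity in bits."""
    return entropy_j_per_k / (k_B * math.log(2))

# 1 kJ of heat added at 300 K: Delta S = Q / T = 1000/300 J/K
print(f"{entropy_to_bits(1000.0 / 300.0):.2e} bits")  # ~3.5e23 bits

# Standard molar entropy of liquid water, ~70 J/K per mole
print(f"{entropy_to_bits(70.0):.2e} bits")            # ~7.3e24 bits
```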





Here is a reference to the actual laboratory realisation of the Maxwell Daemon:



Shoichi Toyabe, Takahiro Sagawa, Masahito Ueda, Eiro Muneyuki and Masaki Sano, "Information heat engine: converting information to energy by feedback control", Nature Physics 6(12), pp. 988-992, 2010. arXiv:1009.5287, Bibcode:2010NatPh...6..988T, doi:10.1038/nphys1821. "We demonstrated that free energy is obtained by a feedback control using the information about the system; information is converted to free energy, as the first realization of Szilard-type Maxwell's demon."


