Thursday, 26 October 2017

entropy - How does Landauer's Principle apply in quantum (and generally reversible) computing


I understand that a reversible computer does not dissipate heat via Landauer's principle whilst running: the memory state at any time is a bijective function of the state at any other time.


However, I have been thinking about what happens when a reversible computer initializes. Consider the state of the physical system that the memory is built from just before power-up and initialization. By dint of the reversibility of the underlying microscopic physical laws, this state stays encoded in the overall system's state even when it is effectively "wiped out" as the computer initializes and replaces it with a state representing the initialized memory (set to, say, all noughts).
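To make the "stays encoded" step explicit (a minimal sketch, assuming only that the memory together with its surroundings evolves unitarily), a reversible wipe of an arbitrary pre-power-up memory state $\left|s\right\rangle$ must act as

$$U\,\bigl(\left|s\right\rangle_{\mathrm{mem}}\otimes\left|e_0\right\rangle_{\mathrm{env}}\bigr)=\left|0\cdots0\right\rangle_{\mathrm{mem}}\otimes\left|e_s\right\rangle_{\mathrm{env}},\qquad \left\langle e_s\middle|e_{s'}\right\rangle=\left\langle s\middle|s'\right\rangle=\delta_{ss'},$$

because a unitary $U$ preserves inner products. Distinct pre-power-up states therefore force the surroundings into distinct, mutually orthogonal states: the "wiped" information has merely been moved into the environment, not destroyed.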


So it seems to me that if $M$ bits is the maximum memory a reversible algorithm will need to call on throughout its working, then, by the reasoning of Landauer's principle, we shall ultimately need to do work $M\,k\,T\,\log 2$ to "throw the excess entropy out of the initialized system".
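For scale, a worked number (the 1 GiB size and room temperature here are purely illustrative assumptions of mine): for a classical memory of $M = 8\times2^{30}\approx8.6\times10^{9}$ bits at $T=300\ \mathrm{K}$,

$$M\,k\,T\,\log 2\approx8.6\times10^{9}\times1.38\times10^{-23}\,\mathrm{J\,K^{-1}}\times300\,\mathrm{K}\times0.693\approx2.5\times10^{-11}\ \mathrm{J},$$

which is utterly negligible; the point of the quantum case below is that the corresponding figure is anything but.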


Question 1: Is my reasoning so far right? If not, please say why.



Now, specializing to quantum computers, this seems to imply some enormous initialization energy figures. Suppose we have a system with $N$ qubits, so the quantum state space has $2^N$ basis states. Suppose further, for the sake of argument, that the physics and engineering of the system are such that the system state throughout the running of the computer only assumes "digitized" superpositions, i.e. sums of the form:


$$\frac{1}{\sqrt{\mathcal N}}\sum_{j=1}^{2^N} x_j \,\left|p_{1,j},p_{2,j},\cdots,p_{N,j}\right\rangle$$


where $x_j,\;p_{k,j}\in\{0,1\}$ and ${\mathcal N}$ is the appropriate normalization. The pre-power-up state that is wiped out at initialization then encodes one of $2^{2^N}$ possible coefficient strings $(x_1,\ldots,x_{2^N})$, i.e. $2^N$ bits' worth of information, so it seems to me that the work demanded by Landauer's principle is $2^N\,k\,T\,\log 2$. This figure reaches a mass-equivalent of roughly 6000 kg of energy (about humanity's yearly energy consumption) at around 140 qubits, assuming we build our computer in deep space to take advantage of, say, a 10 K working temperature.
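As a sanity check on that figure (my own back-of-the-envelope numbers, using only the 10 K working temperature assumed above):

$$2^{140}\,k\,T\,\log 2\approx1.4\times10^{42}\times1.38\times10^{-23}\,\mathrm{J\,K^{-1}}\times10\,\mathrm{K}\times0.693\approx1.3\times10^{20}\ \mathrm{J},$$

whose mass-equivalent is $1.3\times10^{20}\,\mathrm{J}/c^{2}\approx1.5\times10^{3}\ \mathrm{kg}$; two more qubits quadruple this to roughly $6\times10^{3}\ \mathrm{kg}$, which is indeed of the order of humanity's yearly energy consumption ($\sim6\times10^{20}\ \mathrm{J}$).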


Question 2: Given that we could build a quantum computer with 140 qubits in "digital" superpositions as above, do we indeed need such initialization energies?


One can see where arguments like this might go. For example, Paul Davies argues that similar complexity calculations set a lower limit on the size of future quantum computers, because their complexity (information content) will have to respect the Bekenstein bound: P. C. W. Davies, "The implications of a holographic universe for quantum information science and the nature of physical law", Fluctuation and Noise Letters 7, no. 4 (2007); see also http://arxiv.org/abs/quant-ph/0703041.


Davies points out that it is the Kolmogorov complexity that will be relevant, and so takes this as an indication that only certain "small" subspaces of the full quantum state space spanned by a large number of qubits will be accessible to real quantum computers. Likewise, in my example I assumed this kind of limitation in the form of the "digitization" of the superposition weights, but I assumed that all of the qubits could be superposed independently. Maybe there would necessarily be correlations between the superposition coefficients in real quantum computers.


If so, I think we would likewise hit the Landauer constraint by the above reasoning, but at a considerably lower number of qubits.


Last Question: Am I applying Landauer's principle to the quantum computer in the right way? If my arguments fail, why do they fail?



