I have just finished reading Decoding the Universe by Charles Seife, which tells a fascinating tale about the emergence of information theory over the past hundred years. I have read many texts on classical physics, quantum theory, and the mathematics behind modern information technology. However, while many have given good insight into their particular field, none have done so wonderful a job of tying all of these disciplines together as Charles Seife does in this text. The great scientific revolution of our generation is that of information theory and this book presents summaries of the many great theories and articles that have led to the discovery that information theory is a superset of the many classical and quantum physics theories developed over the last few hundred years.
One particular section of the book discusses quantum information, specifically the qubit (derived from quantum bit), which is similar to a classical computer bit except that it has an additional third state indicating both 0 and 1 simultaneously. Without going too deeply into the complete science behind a qubit, suffice it to say that physicists have found techniques to place very small items such as electrons into a state known as superposition.
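The book keeps this in prose, but the standard mathematical picture (the usual textbook convention, not something from the book itself) is that a qubit holds two complex amplitudes whose squared magnitudes give the odds of reading out a 0 or a 1. A minimal Python sketch:

```python
import math
import random

# A qubit's state is a pair of complex amplitudes (alpha, beta) for the
# outcomes 0 and 1; the probabilities |alpha|^2 + |beta|^2 must sum to 1.
def make_qubit(alpha: complex, beta: complex) -> tuple[complex, complex]:
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def measure(qubit: tuple[complex, complex]) -> int:
    """Collapse the superposition: return 0 or 1 with probabilities
    |alpha|^2 and |beta|^2 respectively."""
    alpha, _beta = qubit
    return 0 if random.random() < abs(alpha) ** 2 else 1

# An equal superposition of 0 and 1 -- the "both at once" state in the text.
plus = make_qubit(1, 1)
counts = [measure(plus) for _ in range(10_000)]
print(sum(counts) / len(counts))  # roughly 0.5: each outcome about half the time
```

Measuring the same prepared state over and over yields each answer about half the time, which is the "choosing" behaviour described below.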
The discovery of superposition stems back to a classical physics question of whether light was a wave or a stream of particles. In 1801 Thomas Young devised an experiment that was thought to show that light was a wave: he shone a beam of light through two narrow slits and observed an interference pattern upon a viewing screen beyond. Interference patterns are typically associated with waves, as the constructive (crest) and destructive (trough) nature of intersecting waves draws dark and light bands on Young’s viewing screen. However, waves propagate through a medium; that is how sound is heard on an eardrum. The wave itself is not shaking the eardrum; rather, a set of air particles displaced by the sound source moves in a wave-like formation and wiggles the eardrum. So what was a light wave moving? What material was it displacing? It was proposed that an invisible medium known as the luminiferous ether was what light propagated through. It was an attempt to measure this ether that led to the development of the interferometer, a device that splits a beam of light down two paths and then recombines it before it hits a viewing scope. The paths were of equal length; however, because light was travelling in two different directions along those paths, one of them should be affected by the ether more strongly than the other. Because the Earth moves very quickly through space, there would have to be a natural ether wind, and a light wave jostling along the path aligned with the wind should take a different amount of time than one running perpendicular to it. However, the experiment concluded that the two beams always reached the viewing scope at exactly the same time. That is to say, the light travelled at exactly the same speed irrespective of the angle at which the experiment was conducted.
The experiment was even repeated at the top of a very high mountain to ensure that the original laboratory and its environment were not acting as a wind break.
It was the interferometer experiment that later led to the discovery of superposition. The same concept of splitting something down two pathways to recombine later was used to send electrons down two paths to recombine at a sensor. However, the experiment was attempted with only a single electron. It was expected that the electron would pick one of the two paths, as such a particle is indivisible and cannot be split. However, the first run of the experiment seemed to indicate that the electron took both paths at once. The viewing scope at the end of the paths showed an interference pattern that should only have been possible if two electrons had collided and interfered with each other simultaneously. “Faced with two mutually exclusive choices, the electron chooses both.” Even more amazing was the result of placing a laser along one of the paths and repeating the experiment. It appeared that the electron chose both paths only until it was measured. At that point the electron definitively chose only a single path: either it was observed by the laser on the watched path, or it hit the end sensor without creating an interference pattern, thereby indicating it chose the unwatched path exclusively. In quantum mechanics, the electron is said to be in superposition. In another quantum term, the electron’s two possible paths are said to be entangled (see entanglement), meaning that when one “chose” to exist, the other simultaneously chose not to. It is the two concepts of superposition and entanglement that led to quantum computing.
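The watched-versus-unwatched behaviour can be mimicked with a back-of-the-envelope amplitude calculation. The phase values here are hypothetical, chosen only to make the interference visible; they are not data from the actual experiment:

```python
import cmath
import math

# Each path contributes a complex amplitude to the detector. The half-
# wavelength phase offset on path B is an illustrative, made-up value.
amp_a = cmath.exp(1j * 0.0) / math.sqrt(2)       # amplitude via path A
amp_b = cmath.exp(1j * cmath.pi) / math.sqrt(2)  # path B, out of phase

# Unwatched: the amplitudes add FIRST, then get squared -- the two paths
# can cancel each other out, producing the interference pattern.
p_unwatched = abs(amp_a + amp_b) ** 2

# Watched (laser on one path): each path's probability is squared
# separately and then added -- the interference term is destroyed.
p_watched = abs(amp_a) ** 2 + abs(amp_b) ** 2

print(p_unwatched, p_watched)  # ~0.0 versus 1.0
```

The arithmetic difference between "add then square" and "square then add" is, in a nutshell, why measuring the electron makes the interference pattern vanish.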
It was later found that other particles could be forced into other types of superposition. For example, atoms could be manipulated in such a way that they are both spin up and spin down simultaneously (mutually exclusive states for an atom) and hence be placed into superposition. This brings us back to the idea of a qubit. In the same way that a classical computer bit (0 or 1) could be stored in silicon memory, a qubit could now be stored in one of three states by an atom being either spin down (0), spin up (1), or both simultaneously (0 & 1). It was this principle that was used to build the first quantum computer. In 1994, Peter Shor developed Shor’s Algorithm for factoring a number N in O((log N)³) time and O(log N) space. The concept exploited the idea of superposition, of a particle being in two states at once, and of forcing the particle to “choose” a state (along with the related concept of entanglement). In the most basic terms, the quantum computer sets up qubits using atoms in both spin up and spin down states simultaneously to model its memory, much as classical computer science sets up memory with bits as 0s and 1s. Then it exploits the idea that all of these qubit-carrying particles are entangled to test all possible combinations of their states at once. With some mathematical massaging, the probabilities of these qubits to “choose” a particular state (and therefore force their entangled complements to choose the opposite state) are tickled in a couple of very hairy math passes until the resulting “chosen” states of the atoms reveal the “chosen” answer. This is a vast over-simplification of the process, but the experiments performed using these theories get correct results in vastly fewer operations than would be required by a classical computer. Formally, it has been found that an unstructured search requiring n steps on a classical computer requires only about √n steps on a quantum computer (this square-root speedup, due to Grover’s algorithm, applies to search problems rather than to every algorithm).
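That square-root speedup is Grover’s search algorithm (the book states the result; the name and this little simulation are my addition). A classical simulation of the amplitude bookkeeping over a hypothetical 16-entry search space shows the answer’s probability getting pumped up in roughly √n passes:

```python
import math

n = 16       # size of the search space (illustrative choice)
marked = 11  # index we are "searching" for (arbitrary choice)

# Start in an equal superposition over all n entries.
state = [1 / math.sqrt(n)] * n

# Grover's algorithm needs about (pi/4) * sqrt(n) iterations.
iterations = round((math.pi / 4) * math.sqrt(n))
for _ in range(iterations):
    state[marked] *= -1                    # oracle: flip the marked amplitude
    mean = sum(state) / n
    state = [2 * mean - a for a in state]  # diffusion: invert about the mean

prob_marked = state[marked] ** 2
print(iterations, round(prob_marked, 3))   # 3 0.961
```

After only three passes over a 16-entry space, a measurement would return the marked item about 96% of the time, whereas a classical search would need 8 lookups on average.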
That is not a big speed increase when a four-step classical process takes two steps on a quantum computer, but when a search requiring 256 operations classically takes 16 quantum steps, or 2,304 steps take only 48, the speed increase becomes clear very quickly.
In 2001, in an experiment performed by Isaac Chuang, a 7-qubit quantum computer correctly factored the number 15 into 3 and 5. While not impressive on the surface, the fact that a quantum computer was able to factor a number using Shor’s algorithm has some interesting ramifications. While building quantum computers on larger scales is not yet possible, were a 256-bit quantum computer actually constructed, it could perform the kind of brute-force attack that would take a classical computer an eternity in a matter of minutes. It has been calculated that the most powerful classical computer in existence, performing billions of operations per second, would need to run for the entire lifetime of the universe to brute-force crack modern cryptographic keys. However, a 256-bit quantum computer could perform the same attack in just a few minutes, rendering modern cryptography wide open.
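What makes factoring 15 meaningful is the structure of Shor’s algorithm: the quantum hardware’s only job is to find the period r of a^x mod N, and classical number theory does the rest. A sketch that brute-forces the period (the very step the quantum computer accelerates) recovers 3 and 5:

```python
import math

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n). This brute-force search is the
    classically expensive step that the quantum computer replaces."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical_part(n: int, a: int) -> tuple[int, int]:
    """Classical post-processing: an even period r of a^x mod n yields
    factors of n as gcd(a^(r/2) - 1, n) and gcd(a^(r/2) + 1, n)."""
    r = find_period(a, n)
    assert r % 2 == 0, "odd period: pick a different base a"
    x = pow(a, r // 2, n)
    return math.gcd(x - 1, n), math.gcd(x + 1, n)

print(shor_classical_part(15, 7))  # (3, 5): 7 has period 4 mod 15
```

With N = 15 and base 7 the period is 4, and the two gcd computations hand back the factors. The quantum computer in Chuang’s experiment was doing exactly the find-the-period part.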
Of course, quantum computers cannot yet be built on that scale, nor have the theories of scaling to such sizes ever been field tested. Furthermore, the technology required to build such a machine is unimaginably more difficult to obtain and construct than a modern home PC, so the ciphers used to secure internet transactions are more than safe for the present time. It is, however, quite fascinating to think that such computational power could one day become available. Projects such as cancer research, genome mapping, and even SETI would be revolutionized if they could harness the kind of power that quantum computing is predicted to possess.
If you’re interested in learning more about information theory or quantum computing, I can highly recommend both Decoding the Universe and Charles Seife’s other book Zero, which I read a couple of years ago. Seife has an excellent way of wording concepts that are otherwise completely over my head without abstracting so much information as to neuter the subject in question.