Researchers from an international group led by scientists from the University of Bristol announced “a new approach to quantum computing that could soon be used to perform complex calculations that cannot be done by today’s computers.”
Science Daily (with editorial adaptations from materials provided by the University of Bristol) writes: “Scientists from Bristol’s Centre for Quantum Photonics have developed a silicon chip that could be used to perform complex calculations and simulations using quantum particles in the near future. The researchers believe that their device represents a new route to a quantum computer – a powerful type of computer that uses quantum bits (qubits) rather than the conventional bits used in today’s computers.”
To explain what they have, and plan for the future, we need to review some aspects of the history and physics of quantum computing.
Over the last twenty-five years, improved methods of controlling and understanding quantum coherence have not only enabled experimental tests in fundamental physics, but created new practical applications such as Quantum Key Distribution. Other applications such as noise reduction using “squeezed light,” quantum lithography, and quantum computing may become possible as our ability to control quantum coherence improves. It will therefore be important to become conversant with the basic ideas of these new techniques.
Now we’re in the second quarter century of Quantum Computing. “It is widely believed that a quantum computer will not become a reality for at least another 25 years,” says Professor Jeremy O’Brien, Director of the Centre for Quantum Photonics. “However, we believe, using our new [photon quantum random walk] technique, a quantum computer could, in less than ten years, be performing calculations that are outside the capabilities of conventional computers.”
What is a Quantum Computer?
“Where a calculator on the Eniac is equipped with 18000 vacuum tubes and weighs 30 tons, computers in the future may have only 1000 tubes and weigh only 1 1/2 tons.” Popular Mechanics, March 1949
I once asked John Mauchly, who shared the U.S. patent on the stored-program electronic computer, what personal computer he had at home.
“A TRS-80,” he said.
“A trash-80? When there are much more powerful PCs on the market? Why?” I asked.
“Because, in terms of the number of registers in its processor, it comes closest of anything on the market to the UNIVAC that I invented.” [Mauchly, 1974]
The PC I used to write this article evolved — through many generations of engineering — from the early ideas of Charles Babbage. In 1812, he wrote:
“… I was sitting in the rooms of the Analytical Society, at Cambridge, my head leaning forward on the table in a kind of dreamy mood, with a table of logarithms lying open before me. Another member, coming into the room, and seeing me half asleep, called out, ‘Well, Babbage, what are you dreaming about?’ to which I replied, ‘I am thinking that all these tables’ (pointing to the logarithms) ‘might be calculated by machinery.’”
The PC sitting in front of me is fundamentally no different from its 30-ton ancestors, some of which were equipped with some 18,000 vacuum tubes and 500 miles of wiring. Computers have become roughly 1,000,000 times more compact, and at least that much faster, yet the task remains the same: manipulating and interpreting an encoding of binary bits into a useful computational result. The creation of the first program-controlled computer was an engineering breakthrough by German engineer Konrad Zuse (22 June 1910 – 18 December 1995) in 1941:
I started in 1934, working independently and without knowledge of other developments going on around me. In fact, I hadn’t even heard of Charles Babbage when I embarked on my work. At that time, the computing industry was limited to mechanical calculators using the decimal system. Punched card devices were slightly further developed and able to deal with relatively complex operations for statistical and accounting purposes. However, these machines were almost entirely designed for commercial application. This meant that mathematicians and engineers had to develop computers on their own, working independently from one another. I was no exception. At the beginning of the 30s, while studying civil engineering in Berlin, I decided to develop and build bigger calculating machines, more suitable for engineering purposes.
Before digital computers, there were analog computers (most commonly slide rules). An analog computer exploits a physical process, and doesn’t really compute anything in the sense of Zuse or von Neumann, but it produces approximate results quite usefully. Quantum computers embody a digital schema in a different way from classical computers: they yield only one measured result per run (like an analog computer) but can perform arbitrary digital operations (like a classical computer). Digital computers are explained in terms of the “bit.”
A bit (the term was coined by Claude Shannon in 1948) is a fundamental unit of information, classically represented as a 0 or 1 in your digital computer. Each classical bit is physically realized through a macroscopic physical system, such as the magnetization on a hard disk, a microscopic pit in a plastic DVD, or the charge on a capacitor. A document of N characters stored on the hard drive of a typical computer is accordingly described by a string of 8N zeros and ones. However, while a classical computer obeys well-understood laws of classical physics, a quantum computer is a device that harnesses physical phenomena unique to quantum mechanics (especially quantum interference) to perform information processing in an inherently different way.
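The 8N bookkeeping can be made concrete with a minimal Python sketch (my own illustration; the text and variable names are not from the article):

```python
# Each character stored on disk becomes 8 classical bits,
# so a document of N characters becomes a string of 8N zeros and ones.
text = "Hi"
bits = "".join(f"{ord(c):08b}" for c in text)
print(bits)       # 0100100001101001
print(len(bits))  # 8 * 2 = 16
```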
In a quantum computer the fundamental unit of information (called a quantum bit, or qubit) is not binary in nature. This qubit property comes directly from laws of quantum mechanics that differ from the laws of classical physics. A qubit can exist not only in a state corresponding to the logical state 0 or 1, as in a classical bit, but also in states corresponding to a blend or superposition of these classical states. That is, a qubit can exist as a zero, a one, or simultaneously as BOTH 0 and 1, with a complex numerical coefficient whose squared magnitude gives the probability of each state. This makes current research in quantum computing not a continuation of today’s idea of a computer but, rather, a completely new style of thought that has led to a new engineering discipline. Because quantum computers harness these characteristics of qubits, they have the potential to be incredibly powerful computational devices, introducing an evolutionary leap more profound than the one that saw 30-ton mainframe monsters evolve into iPods weighing less than an ounce in around half a century.
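A qubit’s superposition can be sketched numerically. This minimal Python example (my own toy illustration, not the researchers’ code) represents a qubit as two complex amplitudes whose squared magnitudes give the measurement probabilities:

```python
import cmath
import random

s = 1 / 2 ** 0.5
a = s                                  # amplitude of the |0> component
b = cmath.exp(1j * cmath.pi / 4) * s   # amplitude of |1>, with a relative phase
                                       # -- something a classical bit cannot have

p0, p1 = abs(a) ** 2, abs(b) ** 2      # Born-rule probabilities
assert abs(p0 + p1 - 1) < 1e-12        # a valid qubit state is normalized

# Measurement collapses the superposition to a single classical bit.
outcome = 0 if random.random() < p0 else 1
print(p0, p1, outcome)
```

Here both outcomes are equally likely (p0 = p1 = 0.5), yet the state is not a mere coin flip: the relative phase in `b` is exactly what quantum interference acts on.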
Although it can be treated mathematically, information is not an abstract concept. Information has to be recorded and stored on media that are fundamentally quantum mechanical. That is why we extend our definition of information beyond merely a string of bits, 1s and 0s, and examine the consequences of the quantum nature of its media. The implications of the qubits of quantum information theory are still being investigated and will probably surprise the entire body of experts several times before 2020. Yet to introduce Quantum Computing now, we need just a handful of quantum concepts and principles.
WHAT IS THE MATRIX? The Potential and Power of Quantum Computing
In classical computing, information is encoded in a sequence of bits, which are manipulated via Boolean logic gates arranged in succession to produce an end result (output). Similarly, a quantum computer manipulates qubits by executing a series of quantum gates, each a unitary transformation acting on a single qubit or pair of qubits. By applying these gates in succession, a quantum computer performs a complicated unitary transformation on a set of qubits in some initial state. The qubits can then be measured, with this measurement serving as the final computational result. This parallel between classical and quantum computing suggests that a classical computer should be able to simulate a quantum computer accurately, and therefore do anything a quantum computer can.
So why bother with quantum computers? Because, although a classical computer can theoretically simulate a quantum computer, it is incredibly inefficient — so much so that a classical computer is effectively incapable of performing many tasks that a quantum computer could perform with ease. The simulation of a quantum computer on a classical one is a computationally hard problem, because correlations among quantum bits are qualitatively different from correlations among classical bits, as first explained by John Bell:
The discomfort that I feel is associated with the fact that the observed perfect quantum correlations seem to demand something like the ‘genetic’ hypothesis. For me, it is so reasonable to assume that the photons in those experiments carry with them programs, which have been correlated in advance, telling them how to behave. This is so rational that I think that when Einstein saw that, and the others refused to see it, he was the rational man. The other people, although history has justified them, were burying their heads in the sand. I feel that Einstein’s intellectual superiority over Bohr, in this instance, was enormous; a vast gulf between the man who saw clearly what was needed, and the obscurantist. So for me, it is a pity that Einstein’s idea doesn’t work. The reasonable thing just doesn’t work.
A system of only a few hundred qubits exists in a Hilbert space of dimension ~10^90; to simulate it, a classical computer would have to work with exponentially large matrices, performing a calculation for each individual state, itself represented as a matrix. This would take exponentially longer than even a primitive quantum computer needs; indeed, longer than the history of the universe so far.
Richard Feynman was among the first to recognize the potential in quantum superposition for solving such problems tremendously faster. For example, a system of 500 qubits, which is impossible to simulate classically, represents a quantum superposition of as many as 2^500 states. Each state would be classically equivalent to a single list of 500 1’s and 0’s. Any quantum operation on that system — a particular pulse of radio waves, for instance, whose action might be to execute a Controlled-NOT operation on the 100th and 101st qubits — would simultaneously operate on all 2^500 states. Hence in just one tick of the computer clock, a quantum operation could compute not just on one machine state, as serial computers do, but on 2^500 machine states simultaneously! Eventually, however, observing the system would cause it to collapse into a single quantum state corresponding to a single answer, a single list of 500 1’s and 0’s, as dictated by the measurement axiom of Quantum Mechanics.
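The 2^500 figure can be checked directly. This small Python sketch (my own illustration) shows how the classical memory needed just to store an n-qubit state vector explodes with n:

```python
# A classical simulation must store 2**n complex amplitudes for n qubits.
BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

for n in (10, 50, 500):
    amplitudes = 2 ** n   # size of the state vector
    print(n, "qubits:", amplitudes * BYTES_PER_AMPLITUDE, "bytes")

# 2**500 is a 151-digit number -- the ~10^150 scale the text cites.
print(len(str(2 ** 500)))  # 151
```

At 50 qubits the state vector already needs petabytes; at 500 qubits the amplitude count exceeds any count of particles in the observable universe.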
This is an exciting result because this answer — derived from the massive quantum parallelism achieved through superposition — is the equivalent of performing the same operation on a classical supercomputer with ~10^150 separate processors (which is impossible in our physical universe).
In the 1990s, scientists and engineers searched for something interesting for a quantum computer to do. In 1994 Peter Shor, a research computer scientist at AT&T’s Bell Laboratories in New Jersey, provided such an application by devising the first quantum computer algorithm. Shor’s algorithm harnesses the power of quantum superposition to rapidly factor very large numbers (on the order of ~10^200 and greater) in a matter of seconds. Applying quantum computers to implement this algorithm challenges the entire field of encryption, where one standard encryption code, known as RSA, relies heavily on the difficulty of factoring very large composite numbers into their primes. Intelligence agencies, corporations, governments and those interested in electronic and financial privacy are naturally interested in this type of implementation. Indeed, quantum cryptography is already a viable industry providing an alternative to RSA and other systems that operate over classical communication infrastructure, with “uncrackable” communication lines and devices being sold to government agencies and major corporations by a number of manufacturers.
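The asymmetry RSA relies on is easy to see at toy scale. In this Python sketch (my own, with small illustrative primes, nothing like the hundreds of digits used in real RSA keys), multiplying two primes is instant, while recovering them from the product already requires a search:

```python
# Multiplying primes is easy; factoring the product is the hard direction.
p, q = 99991, 99989          # two small primes, for illustration only
n = p * q                    # the "public" composite: trivial to compute

# Naive factoring: trial division. Feasible here, hopeless at RSA sizes,
# where the search space grows exponentially with the number of digits.
f = next(d for d in range(2, int(n ** 0.5) + 1) if n % d == 0)
print(f, n // f)             # recovers 99989 and 99991
```

Shor’s algorithm attacks exactly this hard direction: on a quantum computer the factoring step no longer scales exponentially with the size of the number.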
Encryption is merely one application of quantum computing. Shor devised a set of mathematical operations that can only be performed on a quantum computer, and used many of these in his factorization algorithm. Feynman had earlier asserted that a quantum computer could function as a kind of simulator for quantum physics, perhaps enabling important discoveries in the field. Currently the immense power of quantum computers is a combination of theoretical speculation and preliminary laboratory development of prototypes; the advent of the first fully functional quantum computer will undoubtedly bring many new and exciting applications.
The Optical Random Walk Chip
The just-announced Bristol optical chip depends on two identical photons (particles of light) moving through a network of silicon-chip circuits in a quantum walk. Single-photon quantum walk experiments have been done before, and can be modeled exactly by classical wave physics. This is the first time a two-photon quantum walk has been performed, and the implications are futuristic.
O’Brien writes: “Using a two-photon system, we can perform calculations that are exponentially more complex than before. This is very much the beginning of a new field in quantum information science and will pave the way to quantum computers that will help us understand the most complex scientific problems.”
One strange side effect is that game players with quantum computers can find new Nash equilibria – impossible with only human brains and classical computers. I’ve discussed this with John Forbes Nash, Jr., whom Russell Crowe portrayed in A Beautiful Mind (2001). This quantum game weirdness may affect future competition between advanced nations and corporations that use quantum computers to plot their geopolitical and financial strategies.
What is a quantum walk?
As described by Casey Johnston, “a ‘quantum walk’ involves the movement of a particle as a superposition of all possible states.” Until recently, quantum walks were a theoretical construct. However, according to Science, physicists in Germany are now able to make cesium atoms arranged in an optical lattice perform a physical quantum walk. Johnston continues:
Quantum walks were first proposed by physicist Richard Feynman and are, in terms of probability, the opposite of a random walk. A random walk might be modeled by a person flipping a coin, and for each flip, he steps left for heads and right for tails. In this case, his most probable location is the center, with the probability distribution tapering off in either direction. A quantum walk involves the use of internal states and superpositions and results in the hypothetical person ‘exploring’ every possible position simultaneously.
When a quantum walker flips a coin, it directs him to move one way, but he maintains an ‘internal state’ that moves the other way, making him a superposition of both directions of movement. During a quantum walk, as the quantum object takes more steps, it becomes ‘delocalized’ over all available positions, as if its presence is blurred.
A second feature of quantum walking is matter-wave interference, as when the person flips heads and next flips tails. The second step makes the new superposition overlap the old one, and the new superposition can either amplify the old position or remove it. After all this occurs and the desired number of steps have been taken, an attempted observation will collapse the superposition and “resolve” the object to a single position.
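The contrast Johnston describes can be simulated. The following Python sketch (my own toy model of a discrete-time “coined” quantum walk on a line, not the Bristol experiment) tracks complex amplitudes over (position, coin) pairs; matter-wave interference makes the walker’s spread grow linearly with the number of steps, where a classical random walk spreads only as the square root:

```python
import math

def quantum_walk(steps):
    # amp[(position, coin)] -> complex amplitude; coin 0 steps left, 1 steps right
    s = 1 / math.sqrt(2)
    amp = {(0, 0): s, (0, 1): 1j * s}   # symmetric initial coin state
    for _ in range(steps):
        new = {}
        for (x, c), a in amp.items():
            # Hadamard coin: |0> -> (|0>+|1>)/sqrt(2), |1> -> (|0>-|1>)/sqrt(2),
            # then shift by the new coin value. Amplitudes at the same
            # (position, coin) add up -- this is the interference step.
            for c2, h in ((0, s), (1, s if c == 0 else -s)):
                x2 = x - 1 if c2 == 0 else x + 1
                new[(x2, c2)] = new.get((x2, c2), 0) + h * a
        amp = new
    # "Observe": trace out the coin to get position probabilities
    prob = {}
    for (x, _), a in amp.items():
        prob[x] = prob.get(x, 0) + abs(a) ** 2
    return prob

p = quantum_walk(50)
rms = sum(x * x * q for x, q in p.items()) ** 0.5
print(round(rms, 1))  # spread grows ~linearly in steps; a classical walk
                      # after 50 steps would spread only sqrt(50) ~ 7.1
```

The probability distribution also looks nothing like the classical bell curve: it is nearly flat in the middle with sharp “ballistic” peaks near the edges, a direct signature of the interference described above.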
What does the new chip do, and where will its research groups go next?
O’Brien writes: “Now that we can directly realize and observe two-photon quantum walks, the move to a three-photon, or multi-photon, device is relatively straightforward, but the results will be just as exciting… Each time we add a photon, the complexity of the problem we are able to solve increases exponentially, so if a one-photon quantum walk has 10 outcomes, a two-photon system can give 100 outcomes and a three-photon system 1000 solutions and so on.” This is due to quantum interference between two photons.
Richard Feynman was the great-grandfather of quantum computing. Now, at last, people are about to do exactly what he proposed — they will use a quantum device to perform quantum mechanical simulations.
Jeremy Bernstein, Quantum Profiles, Princeton University Press, 1991, p. 84; quotes John Stewart Bell (1928–1990), author of “Bell’s Theorem” (or “Bell’s Inequality”).
R. P. Feynman, “Simulating physics with computers,” Int. J. Theor. Phys. 21, 467 (1982); reprinted in A.J.G. Hey (Ed.), Feynman and Computation.
R. P. Feynman, The Feynman Lectures on Computation, A.J.G. Hey and R.W. Allen (Eds.).
[Feynman and Post, 1983] Jonathan V. Post and Richard Feynman, “Footnote to Feynman”, Engineering & Science, Caltech, Pasadena, CA, Vol. XLVI, No. 5, p. 28, ISSN 0013-7812, May 1983; reprinted in Songs from Unsung Worlds, ed. Bonnie Bilyeu Gordon, intro. by Alan Lightman (award-winning author of Einstein’s Dreams), Birkhauser Boston/AAAS, hardcover ISBN 0-8176-3296-4, paperback ISBN 3-7643-3296-4, 1985.
[Feynman and Post, 1990] “Starscapes” for chamber choir, three woodwinds, piano and magnetic tape; composer: Van Decker; texts by Jonathan V. Post & Richard Feynman, “Footnote to Feynman”, University Music Center, California State University, Long Beach, CA, 18 May 1990.
[Feynman and Post, 2005] Letters between Feynman and Post were approved for The Letters of Richard P. Feynman, edited by Michelle Feynman, but omitted due to the Permissions Editor failing to pass the file of documents and contracts to Michelle Feynman.
Casey Johnston, “Cesium atoms are able to take a ‘quantum walk’,” July 19, 2009.
Alberto Peruzzo, Mirko Lobino, Jonathan C. F. Matthews, Nobuyuki Matsuda, Alberto Politi, Konstantinos Poulios, Xiao-Qi Zhou, Yoav Lahini, Nur Ismail, Kerstin Wörhoff, Yaron Bromberg, Yaron Silberberg, Mark G. Thompson, and Jeremy L. O’Brien. Quantum Walks of Correlated Photons. Science, 2010; 329 (5998): 1500-1503 DOI: 10.1126/science.1193515
Jonathan Vos Post, Christine Carmichael, Ph.D, “Preparing the Teachers: Quantum Computing”, Proceedings of the 2005 American Society for Engineering Education Pacific Southwest Regional Conference.
Philip V. Fellman and Jonathan Vos Post, “Quantum Nash Equilibria and Quantum Computing,” 6th International Conference on Complex Systems; InterJournal Complex Systems, 1846, 2006.
Claude E. Shannon, personal communication, while I carried his magnetic reed switch maze/mouse computer to his wife’s car, 1976. We discussed, at this and some follow-up meetings, his non-academic pursuits, including juggling, unicycling, the chess-playing machine he’d invented, his rocket-powered pogo stick, his rocket-powered frisbee, his flame-throwing trumpet, and his Information Theory approach to his very successful stock market investments. He met his wife Betty Shannon when she worked as a numerical analyst at Bell Labs.
Claude E. Shannon, A mathematical theory of communication. Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October, 1948.
Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication, The University of Illinois Press, Urbana, Illinois, 1949.
P. W. Shor, “Algorithms for quantum computation: discrete logarithms and factoring,” in S. Goldwasser (Ed.), Proceedings of the 35th Annual Symposium on Foundations of Computer Science, p. 124, IEEE Computer Society Press, Los Alamitos, CA, 1994.
P. W. Shor, Phys. Rev. A 52, R2493 (1995).
Jacob West, “The Quantum Computer: An Introduction”, 28 April 2000
J. A. Wheeler, “Information, physics, quantum: the search for links,” in Proceedings of the 3rd Int. Symp. on Foundations of Quantum Mechanics, Tokyo, p. 354 (1989); reprinted in A.J.G. Hey (Ed.), Feynman and Computation.