Tiffany Jing Li
Few words in academia carry as much weight as “real world.” Students seek to experience the real world and land a job in it. Professors seek to describe the real world with theories, lab experiments and computer simulations.
This can be tricky when the “real world” changes capriciously, cannot be measured, and defies logic and intuition.
Such is the case with quantum computing, a new technology that may one day relegate today’s digital computers to museums by exploiting the mercurial behavior of sub-atomic particles.
Bold promises have been made for quantum computing. The 1997 Nobel physics laureate William D. Phillips has called quantum information “a radical departure in information technology [that is] more fundamentally different from current technology than the digital computer is from the abacus.”
Others predict that future quantum computers will need just seconds to solve problems that now tie up supercomputers for years. These include puzzles in theoretical physics, the breaking of complex encryption codes, the factoring of huge numbers, and the optimization of problems with innumerable variables.
But none of this will come to pass, says Tiffany Jing Li, until engineers develop quantum error-correction (QEC) codes that allow quantum devices to compute reliably in a world so fragile that information is destroyed or altered as soon as it is measured or read.
Li, an associate professor of electrical and computer engineering, is an expert in wireless networks, information theory, and error-correction coding for wireless data transmission and network data storage.
Four years ago, she and her students commenced a study of quantum computing, an area she describes as a blend of computer science, information theory and physics.
Since then, Li and her group have invented several new classes of QEC codes and have recorded several other breakthroughs in the field. Last year, Li received a three-year, single-investigator award from the National Science Foundation to study QEC coding.
The domain of the qubit
Classical computers encode information in transistors as bits (binary digits). Quantum computers, by contrast, encode information in electrons, photons, nuclei and other quantum systems as quantum bits, or qubits.
A bit can exist in only two different states: a 1 or a 0, on or off. A qubit can be a 1, a 0, or a "superposition" of both, enabling it, in effect, to be on and off simultaneously. Additional states are made possible by links, or "entanglements," between qubits. This enables a quantum computer, in theory, to solve certain problems exponentially faster than a classical computer.
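The arithmetic behind superposition and entanglement can be sketched numerically. The following snippet is a toy illustration of the ideas above, not anything from Li's work: a qubit is modeled as a vector of two complex amplitudes, and the probability of each measurement outcome is the squared magnitude of the corresponding amplitude.

```python
import numpy as np

# A single qubit: two complex amplitudes over the basis states |0> and |1>.
# An equal superposition measures as 0 or 1 with probability 0.5 each.
qubit = np.array([1, 1]) / np.sqrt(2)
probs = np.abs(qubit) ** 2
assert np.allclose(probs, [0.5, 0.5])

# Two entangled qubits (a Bell state): four amplitudes over |00>, |01>,
# |10>, |11>.  Only |00> and |11> carry weight, so the two measurement
# outcomes are perfectly correlated even though each qubit alone looks
# like a fair coin flip.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
assert np.allclose(np.abs(bell) ** 2, [0.5, 0.0, 0.0, 0.5])
```

Note that n entangled qubits require 2^n amplitudes to describe, which is why classical simulation of quantum states quickly becomes intractable.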
But the "quantum world" of atoms and sub-atomic particles is extremely delicate. The slightest noise (unwanted electrical or electromagnetic energy) not only degrades the quality of signals and data but also destroys the superpositions and entanglements that give quantum computing its potential.
“A quantum state is extremely fragile,” says Li. “It is constantly perturbed by the environment. The root cause of this is superposition, which exponentially expands the set of available states to all conceivable superposition states.”
This fragility, coupled with other principles of quantum mechanics, frustrates efforts to achieve fault-tolerant quantum computing using the error-correction codes that work for classical computing, says Li.
In conventional computers, says Li, errors that occur during data transmission can be corrected by encoding the data to introduce redundancy, or replication. When transmission is completed, the original data is retrieved through decoding.
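The redundancy idea Li describes can be illustrated with the simplest classical code, a triple-repetition code. This is a generic textbook example, not a code from Li's research: each bit is transmitted three times, and the decoder recovers the data by majority vote, correcting any single flipped copy.

```python
def encode(bits):
    # Replicate each data bit three times.
    return [b for b in bits for _ in range(3)]

def decode(received):
    # Majority vote over each group of three copies.
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

codeword = encode([1, 0, 1])     # [1,1,1, 0,0,0, 1,1,1]
codeword[4] = 1                  # noise flips one copy of the second bit
assert decode(codeword) == [1, 0, 1]   # majority vote recovers the data
```

It is exactly this copy-and-vote strategy that the quantum rules described below rule out: a qubit's state cannot be copied, and voting would require measuring it.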
The "no-cloning theorem" of quantum mechanics, however, forbids the replication of a quantum state: an arbitrary, unknown state cannot be copied.
Another principle holds that observation and measurement not only disturb the fragile quantum state, making restoration impossible, but also fail to reveal the original state. Regardless of the original state, says Li, the measured outcome is always one of the two basis states, 0 or 1, occurring with probability proportional to the squared projection of the original state onto that basis state.
At first glance, this dichotomy—between the unobservable state of a qubit and the limited observations that can be made of it—appears to preclude the possibility of QEC coding. If a decoder cannot accurately observe a possibly corrupted qubit value (0, 1 or a superposition), it cannot devise an appropriate decoding procedure, as is done in classical computing.
In the mid-1990s, three researchers—A.R. Calderbank, Peter Shor and Andrew Steane—overcame these obstacles by showing that the information in one qubit could be spread onto an entanglement of multiple qubits. Their construction, known as the CSS stabilizer code after the first letters of the researchers' last names, is the basis of most QEC codes that have been developed to date.
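The trick of spreading one qubit's information across an entanglement can be sketched with the 3-qubit bit-flip code, a simple ancestor of the stabilizer constructions discussed here. The simulation below is purely illustrative (the helper names are invented for this sketch, and the parity readout is done classically, where a real device would use ancilla qubits): the logical state a|0>+b|1> is encoded as a|000>+b|111>, and the parities of neighboring qubits pinpoint a single bit-flip error without ever measuring, and hence destroying, the encoded superposition.

```python
import numpy as np

def encode(alpha, beta):
    """Spread one logical qubit a|0>+b|1> onto a|000>+b|111>."""
    state = np.zeros(8, dtype=complex)
    state[0b000] = alpha
    state[0b111] = beta
    return state

def bit_flip(state, q):
    """Apply an X (bit-flip) error on qubit q (0 = leftmost)."""
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << (2 - q))] = amp
    return out

def syndrome(state):
    """Read the parities q0^q1 and q1^q2.  Every basis state carrying
    amplitude shares the same parities, so the error location is revealed
    without learning (or disturbing) the amplitudes alpha and beta."""
    i = next(k for k, a in enumerate(state) if abs(a) > 1e-12)
    q0, q1, q2 = (i >> 2) & 1, (i >> 1) & 1, i & 1
    return (q0 ^ q1, q1 ^ q2)

alpha, beta = 0.6, 0.8
corrupted = bit_flip(encode(alpha, beta), 1)   # error on the middle qubit
s = syndrome(corrupted)                         # (1, 1): both parities broken
which = {(1, 0): 0, (1, 1): 1, (0, 1): 2}[s]    # syndrome -> error location
fixed = bit_flip(corrupted, which)              # flip it back
assert np.allclose(fixed, encode(alpha, beta))
```

This code corrects only bit flips; full stabilizer codes such as CSS constructions must also handle phase errors, which have no classical analogue.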
“The vast majority of existing quantum codes are restricted to this special formalism [CSS stabilizer codes],” says Li. “Unrestricted forms offer richer coding choices and better error correction performances, but formal methods are lacking to construct them.”
Two years ago, Li and her group invented the first class of unrestricted (non-CSS) stabilizer LDPC (low-density parity-check) codes based on classical binary codes. In contrast to previously reported design methods, many of which resulted in only a single code with a fixed rate and length, the new design developed by Li’s group revealed a rich family of codes with a wide range of rates and lengths.
More recently, Li and her group have invented several new systematic ways of constructing unrestricted as well as restricted stabilizer codes. They have also designed the first feasible quantum decoding algorithm for unrestricted stabilizer convolutional error-correcting codes, a useful class of quantum codes with flexible code lengths, and they have successfully simulated and demonstrated the codes’ error-correction performance.
“Previous researchers,” says Li, “had been able to design encoders, but not decoders, for convolutional codes. We came up with the first algorithm for designing the decoders, and we used the classical computer to simulate the first performance curve of a quantum convolutional code.”
Li and her group have described the results of their work in articles published in several IEEE journals and conferences, including the IEEE Transactions on Information Theory, the IEEE International Symposium on Information Theory, and the IEEE International Conference on Communications.
Li’s student Peiyu Tan, a Ph.D. candidate in electrical engineering at Lehigh, won first prize in 2007 in the engineering college’s graduate student poster competition for a poster describing her work with QEC coding.
Posted on Sunday, July 19, 2009