Experimental method of computing that exploits quantum-mechanical phenomena, drawing on quantum theory and in particular on superposition and interference. Whereas a classical bit stores either 0 or 1, a quantum bit (qubit) can hold a superposition of 0 and 1 simultaneously. A quantum computer could therefore pursue multiple lines of computation at once, with the final output determined by the interference pattern generated by the various calculations. See also DNA computing, quantum mechanics.
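The interference mechanism described above can be illustrated with a minimal sketch in plain Python (an illustration of my own, not part of the original entry): a qubit is modeled as a pair of amplitudes, a Hadamard gate puts it into superposition, and applying the gate a second time makes the two computational paths interfere so that the amplitude for 1 cancels out.

```python
import math

# A qubit's state is a pair of complex-valued amplitudes (a, b) for the
# basis states 0 and 1, normalized so that |a|^2 + |b|^2 = 1.
def hadamard(state):
    """Apply the Hadamard gate, which mixes the two amplitudes."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

# Start in the definite state 0.
state = (1.0, 0.0)

# One Hadamard: equal superposition of 0 and 1 (amplitudes ~0.707 each).
superposed = hadamard(state)

# A second Hadamard: the two paths interfere; the amplitude for 1
# cancels destructively, restoring the definite state 0.
final = hadamard(superposed)

# Measurement probabilities are squared magnitudes of the amplitudes.
probs = (abs(final[0]) ** 2, abs(final[1]) ** 2)
```

Here the probability of reading 0 returns to 1 and the probability of 1 vanishes: the "multiple lines of inquiry" opened by the superposition recombine, and only the path reinforced by constructive interference survives.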
This entry comes from Encyclopædia Britannica Concise.