Quantum computers could soon be running alongside traditional digital computers to help solve intense computational problems.  

The world’s first commercially viable quantum computer has been demonstrated today in Silicon Valley by D-Wave Systems, with the potential to create value in areas where problems or requirements exceed the capability of digital computing.
But D-Wave is quick to point out its new device is intended as a complement to conventional computers, to augment existing machines and their market, not as a replacement for them.
The demonstration, held at the Computer History Museum, showed how the new machine can run commercial applications and is better suited to the types of problems that have stymied conventional computers.
“D-Wave’s breakthrough in quantum technology represents a substantial step forward in solving commercial and scientific problems which, until now, were considered intractable. Digital technology stands to reap the benefits of enhanced performance and broader application,” says Herb Martin, chief executive officer at D-Wave.
Quantum-computer technology can solve what are known as “NP-complete” problems: problems in which the sheer volume of complex data and variables prevents digital computers from producing results in a reasonable amount of time. Such problems arise in life sciences, biometrics, logistics, parametric database search and quantitative finance, among many other commercial and scientific areas.
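To make that concrete, the sketch below brute-forces subset sum, one standard NP-complete problem (chosen here purely for illustration; the article does not single out any particular problem). Because the search space holds 2^n candidate subsets, every additional variable doubles the work.

```python
# Illustrative only: brute-force search for subset sum, a standard
# NP-complete problem. The search space has 2**n subsets, so each
# added value doubles the work.
from itertools import combinations

def subset_sum_brute_force(values, target):
    """Return a subset of `values` summing to `target`, or None."""
    for r in range(len(values) + 1):
        for combo in combinations(values, r):
            if sum(combo) == target:
                return combo
    return None

print(subset_sum_brute_force([3, 9, 8, 4, 5, 7], 15))  # finds, e.g., (8, 7)
```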
"Quantum technology delivers precise answers to problems that can only be answered today in general terms. This creates a new and much broader dimension of computer applications,” says Martin.
“Digital computing delivers value in a wide range of applications to business, government and scientific users. In many cases the applications are computationally simple and in others accuracy is forfeited for getting adequate solutions in a reasonable amount of time. Both of these cases will maintain the status quo and continue their use of classical digital systems,” he says.
In fact, D-Wave envisions both types of computing residing in the same machine, allowing organisations to maximise their investment in technology, while having access to both computing environments.
The idea of a computational device based on quantum mechanics was first explored in the 1970s and early 1980s by physicists and computer scientists such as Charles Bennett of IBM’s Thomas J Watson Research Center, Paul Benioff of Argonne National Laboratory, David Deutsch of the University of Oxford, and Richard Feynman of the California Institute of Technology.
D-Wave Systems is the first organisation to package the technology in a commercially viable machine, partly by using the processes and infrastructure of the semiconductor industry. The machine is driven by components such as a new type of analogue processor, one that exploits quantum mechanics rather than the conventional physics of digital processing.
D-Wave’s approach allows “scalable” processor architectures to be built using available processes and technologies. In addition, its processors are computationally equivalent to more standard quantum-computing designs: any application developed for one type of quantum computer can be recast as an application for the other.
D-Wave intends to deliver products to end users through channel marketing and partnerships with major-brand corporations that have existing customer relationships and vertical-industry expertise, according to Martin.
He added that D-Wave is also pursuing partnerships to develop and deliver the software applications needed to attract customers facing the kinds of NP-complete problems for which quantum computing is ideally suited.

Background to quantum computing
Quantum computers (QCs) use quantum mechanics (QM), the rules that underlie the behavior of all matter and energy, to accelerate computation. It has been known for some time that once some simple features of QM are harnessed, machines will be built capable of outperforming any conceivable conventional supercomputer.
QCs are not just faster than conventional computers. They change what computer scientists call the computational scaling of many problems.
In 1936, mathematician Alan Turing published a famous paper that addressed the problem of computability. His thesis was that all computers were equivalent, and could all be simulated by each other. By extension, a problem was either computable or not, regardless of what computer it was run on. This paper led to the concept of the Universal Turing Machine, an idealized model of a computer to which all computers are equivalent.
We now know that Turing was only partially correct. Not all computers are equivalent. His work was based on an assumption — that computation and information were abstract entities, divorced from the rules of physics governing the behavior of the computer itself.
One of the most important developments in modern science is the realization that information (and computation) can never exist in the abstract. Information is always tied to the physical stuff upon which it is written. What is possible to compute is completely determined by the rules of physics.
Turing's work, and conventional computer science, are only valid if a computer obeys the rules of Newtonian physics — the set of rules that apply to large and hot things, like baseballs and humans. If elements of a computer behave according to different rules, such as the rules of QM, this assumption fails and many very interesting possibilities emerge.
As an example, consider the modeling of a nanosized structure, such as a drug molecule, using conventional (ie, non-quantum) computers. Solving the Schrödinger Equation (SE), the fundamental description of matter at the QM level, more than doubles in difficulty for every electron added to the molecule. This is called exponential scaling, and it prohibits solution of the SE for systems of more than about 30 electrons. A single caffeine molecule has more than 100 electrons, making it roughly 100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 (about 10^50) times harder to solve than a 30-electron system, which itself makes even high-end supercomputers choke.
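As a rough check of that figure, the sketch below assumes difficulty merely doubles per added electron, a conservative floor given that the article says it "more than doubles"; the article's far larger figure reflects a growth rate well above plain doubling.

```python
# Conservative sketch: if difficulty at least doubles per added electron,
# the cost ratio between a 100-electron molecule (caffeine) and a
# 30-electron system is at least 2**(100 - 30).
tractable_electrons = 30    # article: brute-force SE solution tops out near here
caffeine_electrons = 100    # article: "more than 100 electrons"

lower_bound = 2 ** (caffeine_electrons - tractable_electrons)
print(f"at least 2^70 ≈ {lower_bound:.2e} times harder")  # ~1.18e+21
```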
This restriction makes first-principles modeling of molecular structures impossible, and has historically defined the boundary between physics (where the SE can be solved by brute force) and chemistry (where it cannot, and empirical modeling and human creativity must take over).
Quantum computers can solve the SE with linear scaling, exponentially faster and with exponentially less hardware than conventional computers: for a QC, the difficulty of solving the SE increases only by a small, fixed amount for every electron added to a system. Even very primitive QCs will be able to outperform supercomputers in simulating nature.
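The difference between the two regimes can be sketched numerically. The unit costs below are arbitrary assumptions chosen for illustration; neither curve models a real machine.

```python
# Illustration of exponential (conventional) versus linear (quantum) scaling
# in the number of electrons, using arbitrary unit costs.
def conventional_cost(electrons, factor=2.0):
    # cost multiplies by `factor` for every added electron (exponential)
    return factor ** electrons

def quantum_cost(electrons, step=1.0):
    # cost grows by a small, fixed amount per added electron (linear)
    return step * electrons

for n in (10, 30, 50, 100):
    print(f"{n:>3} electrons: conventional ~{conventional_cost(n):.1e}, "
          f"quantum ~{quantum_cost(n):.1e}")
```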
Even more significantly, as QC technology matures, systems containing hundreds, thousands, even millions of electrons will become amenable to direct, brute-force solution of the SE. This means that the fundamental equations of nature will be solvable for all nanoscale systems, with no approximations and no fudge factors. The results of these virtual-reality simulations will be indistinguishable from what is seen in the real world, assuming that QM is an accurate picture of nature.
This type of simulation, by direct solution of the fundamental laws of nature, will become the backbone of engineering design in the nanotech regime where quantum mechanics reigns.