An introduction to quantum computing
Cat's Dilemma

The peculiar world of quantum mechanics points the way to a whole new kind of computer. If you're wondering how quantum computers work, we'll give you an inside view.
Quantum computing sits so close to the cutting edge that it can feel like science fiction, yet the concepts have been around for many years. In 1980, physicist Paul Benioff described a Turing machine built to inhabit the mysterious realm of quantum mechanics. A few years later, Nobel laureate (and American physics icon) Richard Feynman, along with Russian mathematician Yuri Manin, first suggested that a quantum computer could do things that you can't do with an ordinary computer. Mathematicians and computer scientists were intrigued, and they have been busily working away ever since on theoretical quantum algorithms for which no suitable hardware yet exists. But the hardware vendors have been busy too, taking on the challenge of designing the quantum logic gates needed to hold and manipulate entangled qubits – often at temperatures close to absolute zero. The first experimental 2-qubit quantum computer appeared in 1998, and over the past few years, billions of venture capital dollars have poured into quantum computing startups, while big companies like Google and IBM have built their own teams of quantum physicists to stake a claim on the emerging market.
Working quantum computers exist today, although they are small-scale and experimental, and they still aren't efficient enough to outperform conventional computers. But new breakthroughs occur every year, and many experts believe the age of the quantum computer is not far away. In fact, several vendors already offer access to quantum computers through cloud services. This article introduces you to some of the principles of quantum computing – and explains why this new technology could have such a powerful effect on our future.
The Basics
The basis of quantum computing is the quantum bit (or qubit), which is both a physical component and a logical unit of information. As you already know, a classical bit can only assume the values 0 or 1. If you access and read a qubit, you also get a value of 0 or 1. Before the readout, however, a fundamentally different situation exists. Because of a quantum mechanical property known as superposition, a qubit can occupy a blend of both states at once: its state is described by two complex amplitudes, written α|0⟩ + β|1⟩, and reading the qubit yields 0 with probability |α|² and 1 with probability |β|². Algorithms that exploit this property can achieve computational efficiencies that are not possible with conventional computers.
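To make the amplitude picture concrete, here is a minimal sketch that simulates a single qubit in plain Python with NumPy (no quantum SDK or hardware is assumed). It prepares the qubit in state |0⟩, applies a Hadamard gate to create an equal superposition, and then "reads" the qubit many times, sampling 0 or 1 according to the squared amplitudes:

import numpy as np

# Start in the classical state |0>: amplitude 1 for outcome 0, amplitude 0 for outcome 1.
state = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ state  # state is now (1/sqrt(2), 1/sqrt(2))

# Measurement: sample 0 or 1 according to the squared amplitudes.
probabilities = np.abs(state) ** 2
rng = np.random.default_rng()
counts = {0: 0, 1: 0}
for _ in range(1000):
    counts[int(rng.choice(2, p=probabilities))] += 1

print("amplitudes:", state)
print("measurement counts over 1000 runs:", counts)  # roughly 50/50

Running the sketch prints measurement counts close to 50/50 – the qubit held both possibilities until the moment it was read. Keep in mind that this simulation only illustrates the bookkeeping; a real quantum computer performs these operations physically rather than by multiplying matrices.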
[...]