SciTech

# In search of quantum computing’s Holy Grail

By JACQUILINE ROMERO, Ph.D.

To see a World in a Grain of Sand… Hold Infinity in the palm of your hand and Eternity in an hour.

To the Romantic poet William Blake, who penned the lines above in 1803, the computer on which you’re reading this essay would have seemed nothing less than a miracle.

And the metaphor is quite apt too: essentially made of specially prepared sand, computer chips perform complex calculations at inhuman speeds, enabling us to grasp, albeit tentatively, the infinite and the eternal.

But it wasn’t always this way.

Around 30,000 laptops and 1 million smartphones could fit inside the ENIAC (Electronic Numerical Integrator and Computer), arguably the world’s first general-purpose computer. The ENIAC was in use from the 1940s to the 1950s, not that long ago on the scale of human history. Those who built the ENIAC (incidentally, neither for email nor Facebook) could not have imagined the deluge of convenient computing devices that we now have. In the time it took the ENIAC to add two numbers, your computer could evaluate tens of thousands of chess moves (not to mention your computer is about 9,000 times lighter!).

This piece is not about the history of computing; it is about (what could be) the future of computing. I warn you, it will seem crazy in some places, at which point I invite you to think of the ENIAC and be inspired to read on some more!

**The value of a bit**

Information in our computers is stored in bits. A bit can take either of two values, for example, 0 or 1. Anything that can have two distinct states can be used to physically represent a bit, for example, an on-off switch. The ENIAC was big because it used a lot of vacuum tubes for its switches. In contrast, modern computers use transistors: very small semiconductor devices that act as fast on-off switches. The more operations you can do on more bits, the more powerful your computer is, which is one reason our now-common 64-bit processors are much faster than the 16-bit processors of the early 1980s.

**Strange switches**

Of course, a bit can only have one value at a time, because as we know, a switch can only be either on or off.

But, what would it mean if bits could have more than one value, at the same time? What if you can have a switch that can be on and off at the same time?

That would probably be a useless question if no such switch existed!

It sounds impossible, but we can indeed find such strange switches. This strangeness is the stuff of quantum physics that has unsettled many great minds, Einstein included. The philosophical implication of having a switch that can be on and off at the same time (or more infamously, of having a cat that is both dead and alive) is huge! There will probably be no end to the debate and speculation that comes with that, but let us focus on what this means for computing.

**Qubits and quantum computing**

A string of n bits can have any one of 2^n values. For example, 2 bits can take any one of 2^2 = 4 values: 00, 01, 10, or 11, representing the binary numbers 0, 1, 2, and 3.
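The counting above can be checked directly. Here is a minimal sketch; the function name `all_bit_strings` is my own, not from the article:

```python
from itertools import product

# Enumerate every value an n-bit string can take: there are 2**n of them.
def all_bit_strings(n):
    return ["".join(bits) for bits in product("01", repeat=n)]

print(all_bit_strings(2))       # ['00', '01', '10', '11'], i.e. 2**2 = 4 values
print(len(all_bit_strings(3)))  # 8
```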

The strange quantum counterparts of bits are called qubits. In contrast to bits, qubits do not have values in the usual sense. A qubit is a “superposition” of all the possible values. That means if you try to find out what the value of the qubit is, you will get 0 with some probability, or 1 with some probability (the probabilities add up to 1, as usual). Similarly, two qubits are in a superposition of 00 and 01 and 10 and 11, and if you try to find out the value, you will get any of these 4 possibilities with some probability. In general, n qubits are in a superposition of the 2^n possible values. The value is not determined until you try to find out, or more generally, until after the computation.
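The measurement statistics described above can be mimicked with ordinary code. This is only an illustrative toy of my own devising, not a real quantum simulator: a qubit is represented by two amplitudes whose squared magnitudes give the probabilities of reading 0 or 1.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# Toy model: a qubit state is a pair of amplitudes (a0, a1) with
# |a0|^2 + |a1|^2 = 1. "Measuring" returns 0 with probability |a0|^2,
# otherwise 1.
def measure(a0, a1):
    return 0 if random.random() < abs(a0) ** 2 else 1

amp = 2 ** -0.5  # equal superposition: probability 1/2 for each outcome
counts = [0, 0]
for _ in range(10000):
    counts[measure(amp, amp)] += 1
print(counts)  # close to an even 50/50 split
```

Each individual measurement gives a definite 0 or 1; only the statistics over many runs reveal the superposition, which matches the article's point that the value is not determined until you look.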

Given the fundamental difference between bits and qubits, computation with qubits (often referred to as quantum computation) is also very different. Quantum computation entails nothing less than a paradigm shift, from the design of algorithms to their physical implementation, and it is a very active area of research. Quantum computation is challenging in practice, not least because qubits are notoriously fragile. Any unwanted process (e.g. collision with air molecules or absorption of heat) can destroy the superpositions and cause the computation to fail. To avoid this, qubits are often implemented at atomic scales and low temperatures to ensure they are properly isolated.

**Quantum computers in war and business**

If quantum computation is hard, why bother?

Quantum computers certainly need not be employed for all computational tasks (for adding two numbers, you’ll probably be better off using your trusty calculator). However, it turns out that for some special tasks, a quantum computer is immensely more efficient than any ordinary computer.

One example of such a task is factoring. Finding the prime factors of 15 (3 and 5) is easy, but finding the prime factors of 1,503,067 is not (they are 1223 and 1229). This sounds boring and irrelevant, until you realise that many of the encryption systems we have now (such as those used for your ATM card transactions) are secure ONLY because factoring the large integers used in the encryption protocol is practically hard (yes, that is true!). Quantum computers can factor 300-digit integers much more efficiently; hence, as one physicist puts it, “it is a matter of considerable interest in the worlds of war and business”.
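To get a feel for why factoring is hard for ordinary computers, here is a naive trial-division sketch (the function name `factor` is mine). It handles the article's examples instantly, but the work grows roughly with the square root of the number, which is hopeless for 300-digit integers:

```python
# Naive trial division: try every candidate divisor up to sqrt(n).
def factor(n):
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

print(factor(15))        # [3, 5]
print(factor(1503067))   # [1223, 1229]
```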

Another task where quantum computers can do better is searching an unsorted database. Think of finding a name in a list that is not sorted alphabetically. For an ordinary computer, the task gets harder as the list gets longer: in the worst case, if the list is N entries long, you need to look N times. With a quantum computer, you can manage with far fewer, about √N lookups. Google should be interested!
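The classical worst case is easy to demonstrate; the code below (my own sketch, with made-up names) counts the lookups a straightforward scan needs. Grover's quantum search algorithm would need only on the order of √N queries for the same task.

```python
# Classical search through an unsorted list: count how many entries
# we must examine before finding the target.
def linear_search(items, target):
    lookups = 0
    for item in items:
        lookups += 1
        if item == target:
            return lookups
    return lookups  # target absent: every entry was examined

names = [f"name{i}" for i in range(1000)]
print(linear_search(names, "name999"))  # worst case: 1000 lookups
# A quantum computer running Grover's algorithm would need roughly
# sqrt(1000), i.e. about 32, queries instead.
```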

**Current state of affairs**

Actually, Google is interested. I will take that as a vote of confidence for the future of quantum computing. Google recently bought a D-Wave “quantum” computer. Whether it is truly quantum is still in dispute (hence the quotation marks), but a D-Wave computer is definitely different from your ordinary computer. It is a black box, ten feet high, occupied mostly by refrigeration. The superconductors which it uses for qubits need to be kept very cold. It is not a general-purpose computer, but it is still potentially very useful because it solves a range of optimisation problems (e.g. what is the shortest delivery route for a package?). D-Wave has caused a stir on both academic and non-academic fronts, and the verdict on it will probably be out in a few more years.

But Google isn't the only player on the field: IBM, long a stalwart of computing technology, has also made its own progress in search of the quantum Holy Grail. Just this past April, the company successfully demonstrated a rudimentary way to manipulate qubits—a crude quantum computing chip.

Less controversially, quantum computers do exist in the laboratory. The number 15 has been factored a few times by a quantum computer as proof of principle (by Austrian and Chinese groups, for example). Quantum computers can be implemented in many physical systems, using atoms, ions, light, or superconductors, to name a few, but none of these systems is commercially available, yet.

**Why should we care?**

Not everyone may know how a transistor works, but nevertheless, the transistor ushered in a revolution in computing for us all. We do not yet know how the qubit of the future will work. We might have to wait a long time as scientists and engineers figure it out, but I am optimistic that another revolution based on strange quantum physics will come. It is perfectly fine to not bother with the details, but we should at least try to be intelligent buyers of this new technology when it comes. So congratulations if you’ve made it this far, you’ve taken your first step!

**— TJD, GMA News**

For the interested: The quantum computing resources by David Mermin (Cornell University) are a gem. He has written several introductory pieces, discussing both Shor’s factoring algorithm and Grover’s search algorithm. They can be found in chapters 3 and 4 here.

*Jacquiline Romero obtained her undergraduate and masters physics degrees from the University of the Philippines. She is currently a physicist at the University of Glasgow, where she obtained her Ph.D. and does quantum entanglement experiments using light.*
