**Difficulty**

*by Christian Dickel*

The general-purpose programmable computer has been an enabling technology that has exceeded its original expectations in countless ways. From the humble beginnings of the original transistor, we now have devices containing several billion transistors, all working perfectly in unison, in the smartphones we keep in our pockets. Our great hopes for the quantum computer are partially based on the belief that this could happen once again with the quantum computing paradigm.

The main challenge for realizing the quantum computer is certainly finding a suitable ‘quantum hardware’, which is why it is still mainly a physics effort. However, it will also require a significant amount of computer programming and design. This makes our field interdisciplinary, and soon computer scientists and engineers will likely play important roles in the further development of the quantum computer.

In this article, I aim to explain what is involved in programming for the quantum computer and give an outline of what I see as a programmable quantum computer. To keep this blog article generally accessible, I will first outline my argument in general terms and then go into more detail. Schematically, a quantum computer and its software periphery could look like this.

Building quantum computer prototypes with only a few qubits is not dissimilar to other physics experiments. In fact, many early qubit experiments were just fundamental research into quantum theory and not aimed at information processing. Now that we wish to scale to more than a few qubits, two things are changing:

1. In early qubit experiments, people built different qubit systems and tried to model their behavior. The noise plaguing different qubit types and the exact physics underlying their operation were not yet fully understood. A lot of great science in those years was devoted to understanding the imperfections of different qubits. Currently, several types of qubits are understood well enough that we are attempting to scale them up to tens of qubits. This requires the qubits to operate more and more like an ‘idealised’ qubit that does not depend upon the implementation, which in turn allows them to be integrated more easily into a specific ‘computational model’ (for the purposes of this article, the computational model used is going to be the circuit model of quantum computing, but there are other models such as the adiabatic quantum computer (D-Wave) or the one-way quantum computer). Matching the physical system to the model for increasing system sizes leads to growing complexity in the tune-ups and calibrations required to make each qubit behave as expected. The physical systems where the quantum degrees of freedom reside all have parameters that need to be set so that they behave like a ‘qubit register’ of the computational model. Some of these parameters have to be tuned once, when setting up the quantum information processor, but some have to be re-tuned regularly. These calibration processes must be automated if we are to cross into the many-qubit regime.

2. The growing complexity of the quantum algorithms run on bigger quantum computer prototypes requires experimental physicists like me, who spend our lives manually setting up quantum computations, to adapt. The first classical computers were programmed using reams of punch cards, but today we have compilers, higher-level programming languages and operating systems. As a start, we need to set up our experiments such that the input can be a representation of a quantum algorithm. Theorists have come up with several such representations; a nice example is QASM (short for quantum assembly). We then need software that converts the quantum algorithm into instructions for the experimental hardware. The compilation of a given quantum algorithm into gates is not unique: to squeeze the maximum coherence out of our qubits, we want to ‘decompose’ the algorithm into the sequence of gates with the highest fidelity. Let’s call the program that turns the abstract quantum algorithm into hardware instructions a quantum compiler. In order to perform these optimizations, we will need flexible and powerful quantum compilers. Another development will play a role here: in the early days, the experimental setup used for controlling several qubits was similar to other fundamental physics experiments in the solid state or in optics. Now, as we aim for ten or more qubits, the amount of hardware required for qubit control and readout must grow as well. This is natural, but this hardware is expensive, and it will be necessary to harness economies of scale to keep the quantum computer financially viable. Qubit hardware needs to be simplified and standardised, which will lead to additional constraints on the available gate sets of the quantum computer prototypes. These constraints need to be taken into account in the compiler.
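To make the compiler idea concrete, here is a toy sketch of what the lowest layer of such a quantum compiler could do: parse a QASM-like text representation of an algorithm and map each gate to a named hardware pulse. The gate-to-pulse table and the pulse names are invented for illustration; real representations such as QASM define a much richer grammar.

```python
# Toy "quantum compiler" front end: QASM-like text -> hardware instructions.
# The pulse names below are hypothetical, not a real hardware API.

PROGRAM = """
h q[0]
cz q[0] q[1]
h q[1]
"""

# Hypothetical mapping from gate name to the pulse that implements it
PULSE_TABLE = {"h": "drive_pi_half_y", "cz": "flux_cz"}

def compile_to_pulses(text):
    instructions = []
    for line in text.strip().splitlines():
        gate, *qubits = line.split()
        # "q[0]" -> 0
        targets = [int(q.strip("q[]")) for q in qubits]
        instructions.append((PULSE_TABLE[gate], targets))
    return instructions

print(compile_to_pulses(PROGRAM))
# → [('drive_pi_half_y', [0]), ('flux_cz', [0, 1]), ('drive_pi_half_y', [1])]
```

A real compiler would of course also decompose unavailable gates and optimize the sequence; this sketch only shows the translation step.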

What hasn’t changed is this: in all these experiments, several pieces of hardware have to work together. They are all controlled from the computer of the experimenter. This computer sends out commands and receives the measured outcomes. Therefore it needs software that can send the necessary commands and log the experimental outcomes. There is a recent effort in the qubit community to standardize software on this level across different labs with different qubit hardware. QCoDeS is a Python-based data acquisition framework supported by Microsoft which is being developed by labs in Copenhagen, Delft and Sydney. If you are working in a lab that needs new software solutions, you can contact the QCoDeS team on GitHub. A common control software would make building a powerful compiler front end for the quantum computer a much more collaborative effort and allow developments to be shared across different quantum hardwares.
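As a rough illustration of what this software layer does at its core, here is a minimal command-and-logging sketch in plain Python. This is not QCoDeS code; the instrument and its command are stand-ins, and a real driver would talk to hardware over VISA or Ethernet instead of a dictionary lookup.

```python
import datetime
import io
import json

class FakeInstrument:
    """Stand-in for a lab instrument; a real driver would query hardware."""
    def ask(self, command):
        return {"*IDN?": "ToyLab,SignalGen,0001"}.get(command, "OK")

class LoggingController:
    """Sends commands to an instrument and logs every command and outcome."""
    def __init__(self, instrument, logfile):
        self.instrument = instrument
        self.logfile = logfile

    def send(self, command):
        outcome = self.instrument.ask(command)
        record = {"time": datetime.datetime.now().isoformat(),
                  "command": command, "outcome": outcome}
        self.logfile.write(json.dumps(record) + "\n")
        return outcome

log = io.StringIO()  # in-memory log for the example
ctrl = LoggingController(FakeInstrument(), log)
print(ctrl.send("*IDN?"))  # → ToyLab,SignalGen,0001
```

Frameworks like QCoDeS provide this kind of abstraction (instrument drivers, parameters, data sets) in a lab-tested, shareable form.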

I would call a complete platform that accepts a generic representation of a quantum algorithm as an input and can then run this algorithm on the physical quantum processor a **programmable quantum computer**. The calibration of the hardware and the compilation of the algorithm into physical operations have to happen automatically. Currently, the first such programmable quantum computers are being realized and sometimes even made available. Eventually, the part of my job that consists of tuning up and writing hardware instructions for a few qubits by hand will be done automatically by a computer. In the not so distant future, automated quantum platforms in the cloud will allow theorists to try out their ideas themselves and hopefully inspire the development of more quantum algorithms.

## Tune-up and calibrations of qubits

What are the necessary calibrations that I was talking about? They are a precursor to any experiment. We use some hardware in our experiments, and it is supposed to behave like qubits – mathematical idealizations. How do we achieve this? Let’s take electron spins as an example. An electron spin has two states and is therefore like our ideal qubit. The problem is that electron spins don’t exist alone: they are attached to electrons, particles that can move around, and qubits are not supposed to do that. So people trap the electrons in place – they create so-called quantum dots. This creates non-idealities that need to be calibrated. Once the electron is trapped and only the spin degree of freedom remains, one has realized a qubit, which can be visualized using the Bloch sphere:

The states of a single qubit can be visualized as different points on the surface of that sphere (here the blue vector represents the state it points to). The allowed operations are rotations which can have an angle and an axis. The operations on those electron-spin qubits can be done in many different ways, for example by applying a changing magnetic field. If one does this the right way, a spin up – let’s call it one in the computational model – can be turned into a spin down – let’s call it zero. So one could try to find the parameters for a rotation of 180 degrees around the x-axis.
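Such a rotation can be written down as a small matrix acting on the two amplitudes of the qubit state. A minimal sketch in plain Python, showing that a 180-degree rotation about the x-axis takes one pole of the Bloch sphere to the other (up to an unobservable global phase):

```python
import math

def rx(theta):
    """2x2 matrix for a rotation by angle theta about the Bloch-sphere x-axis."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -1j * s], [-1j * s, c]]

def apply(gate, state):
    """Apply a 2x2 gate to a two-amplitude qubit state."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

spin_up = [1, 0]                     # one pole of the Bloch sphere
out = apply(rx(math.pi), spin_up)    # 180-degree rotation about x
# All population ends up in the other basis state (with a global phase of -i)
print(abs(out[0])**2, abs(out[1])**2)
```

In the lab, calibrating this gate means finding the control parameters (pulse amplitude, duration, frequency) for which the physical operation actually matches this matrix.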

Only when one has trapped the electrons just right and has calibrated the operations can one start to use the electron spin to calculate things. But if there is more than one electron spin, that doesn’t mean there are just two of these Bloch spheres; sadly, quantum mechanics is not that easy. The Bloch sphere representation of qubit states only works for a single qubit. There can also be cross-talk, meaning operations on one qubit might have residual effects on the others. Thus, the bigger the quantum computer, the more complicated this calibration process can get.

All of the steps needed to realize the computational model for a few qubits were developed by several generations of PhD students. To scale up to a many-qubit quantum processor, all of this work needs to come together and happen automatically, because with more than ten qubits the attention that the experimentalist can pay to individual calibrations is very limited. We need to come up with automated calibration routines that prepare our system to be a high-fidelity quantum processor.
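To give a flavor of what such a routine looks like, here is a toy version of one common calibration: sweeping a drive amplitude and picking the one that maximizes the qubit flip probability (a so-called Rabi calibration). The measurement function below is a simulation with an invented “true” amplitude and shot noise; a real routine would query the hardware instead.

```python
import math
import random

AMP_PI = 0.37  # hidden "true" pi-pulse amplitude the routine should find

def measure_flip_probability(amp, shots=500, rng=random.Random(0)):
    """Simulated experiment: flip probability vs. drive amplitude, with
    binomial shot noise. Fixed seed for reproducibility."""
    p = math.sin(math.pi / 2 * amp / AMP_PI) ** 2
    return sum(rng.random() < p for _ in range(shots)) / shots

def calibrate_pi_pulse(amps):
    """Return the amplitude in the sweep with the highest measured flip rate."""
    return max(amps, key=measure_flip_probability)

amps = [i * 0.01 for i in range(1, 81)]  # sweep 0.01 .. 0.80
best = calibrate_pi_pulse(amps)
print(best)  # close to the hidden value 0.37
```

A real calibration suite chains many such routines (frequency, amplitude, readout thresholds, cross-talk corrections) and re-runs them whenever the system drifts.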

## Compiling a quantum algorithm to hardware instructions

Now let’s take an example to make the process of turning an algorithm into a set of hardware instructions clearer. A quantum algorithm looks like this:

This is a circuit representation of the quantum Fourier transform on four qubits. It looks like a diagram, but it is really a mathematical object representing a big matrix. Each of the horizontal lines stands for a qubit, and the squares and vertical lines denote operations. The single-qubit gates are denoted by the H-boxes, which stand for Hadamard gates. A Hadamard gate is a rotation that swaps the x-axis and z-axis of a given qubit. There are also two-qubit gates, denoted by the vertical lines with filled circles marking the two qubits they operate on. For those two qubits, the $latex \left \vert 11 \right \rangle$ state picks up a phase $latex e^{i \phi}$ given by the value next to the line. The order of the logical operations is from left to right.
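That the diagram is “really a matrix” is easy to check for the individual gates. For instance, the claim that the Hadamard gate swaps the x- and z-axes can be verified directly: conjugating the Pauli X matrix by H yields the Pauli Z matrix. A short check in plain Python:

```python
import math

def matmul(a, b):
    """Multiply two 2x2 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]        # Hadamard gate
X = [[0, 1], [1, 0]]         # Pauli X (rotation axis: x)
Z = [[1, 0], [0, -1]]        # Pauli Z (rotation axis: z)

# H X H should equal Z: the Hadamard exchanges the x- and z-axes
HXH = matmul(matmul(H, X), H)
print(HXH)  # equals Z up to floating-point error
```

The controlled-phase gates in the diagram are 4x4 diagonal matrices, with the phase $latex e^{i \phi}$ only on the $latex \left \vert 11 \right \rangle$ entry.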

The diagram above describes a discrete quantum Fourier transform. The input state of the qubits on the left is mapped to its discrete Fourier transform on the right. Why would you want to use this function? In the case of the Fourier transform, many people familiar with mathematics can see why it might be useful. Shor’s algorithm, for example, uses the discrete Fourier transform to find the period of a bitstring, which in the end can be used to find the prime factors of a number. Note that the final state is in many cases a superposition, meaning that measuring it can give a probabilistic result.
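The period-finding idea is easy to see classically: the discrete Fourier transform of a periodic bitstring is sharply peaked at multiples of (length / period). A small sketch, computing the transform directly rather than with quantum gates:

```python
import cmath

def dft(x):
    """Discrete Fourier transform with 1/sqrt(n) normalization."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n) for j in range(n))
            / math.sqrt(n)
            for k in range(n)]

import math

# A bitstring of length 16 with period 4: 1,0,0,0,1,0,0,0,...
x = [1.0 if j % 4 == 0 else 0.0 for j in range(16)]

spectrum = [abs(a) for a in dft(x)]
peaks = [k for k, a in enumerate(spectrum) if a > 1e-9]
print(peaks)  # → [0, 4, 8, 12], i.e. multiples of 16/4
```

The quantum version computes the same transform on the amplitudes of a superposition, so a measurement afterwards returns one of these peaks at random; from a few such samples the period, and hence the factors, can be recovered.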

But algorithms for more complicated problems can become quite obscure to the experimentalists who realize them. Quantum chemistry, for example, is a field of its own, and the theory behind quantum algorithms solving chemistry problems is not straightforward to hardware guys like me. However, running these algorithms just requires realizing all those gates on the qubits. Some of them might be directly available; other operations have to be constructed from the available ones. If the available set of operations allows arbitrary algorithms to be run (in some sense an algorithm is just a complicated multi-qubit gate), we call that set of gates a complete set. For an arbitrary qubit register, an example of a complete set of gates is

$latex \left \{ H_i, R(\frac{\pi}{4})_i, CNOT_{ij} : \forall \textrm{ i, j} \in \textrm{ qubit register} \right \}$

Theorists have come up with algorithms to rewrite any multi-qubit algorithm in terms of any universal set of gates, for example the Solovay-Kitaev algorithm. Sooner or later this will be a necessary higher-level helper function of a quantum computer. Microsoft Research’s quantum platform LIQUi|> already provides tools for compiling and optimizing quantum algorithms.
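The essence of such compilation can be demonstrated with a deliberately naive stand-in: brute-force search over short sequences of H and T gates for the sequence closest to a target rotation, where “closest” ignores the unobservable global phase. The Solovay-Kitaev algorithm does this far more cleverly, but the goal is the same.

```python
import cmath
import itertools
import math

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]                            # Hadamard
T = [[1, 0], [0, cmath.exp(1j * math.pi / 4)]]   # pi/4 phase gate

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def distance(u, v):
    """Distance up to a global phase: 0 when u and v act identically."""
    tr = sum(u[i][j].conjugate() * v[i][j] for i in range(2) for j in range(2))
    return math.sqrt(max(0.0, 1 - abs(tr) / 2))

def best_sequence(target, max_len=6):
    """Brute-force search over H/T words for the closest match to target."""
    best = (float("inf"), ())
    for n in range(max_len + 1):
        for seq in itertools.product("HT", repeat=n):
            u = [[1, 0], [0, 1]]
            for g in seq:
                u = matmul(u, H if g == "H" else T)
            best = min(best, (distance(u, target), seq))
    return best

# Target: a z-rotation by pi/4, which equals T up to a global phase
target = [[cmath.exp(-1j * math.pi / 8), 0], [0, cmath.exp(1j * math.pi / 8)]]
err, seq = best_sequence(target)
print(err, seq)  # error is essentially zero
```

Brute force scales exponentially in sequence length, which is exactly why efficient decomposition algorithms like Solovay-Kitaev matter.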

Now there is another important factor in translating the algorithm to hardware instructions: time. In the circuit diagram, the order of operations is fixed but there is no notion of time. But time is a critical parameter for the actual control pulses of the qubits, which implement these gates on many quantum computing platforms. So the experimenter needs to translate the above diagram into a sequence of pulses that implements those gates where the horizontal axis is not merely the order of the operations but truly time in some natural units of the qubits. As soon as we fix a clock cycle those diagrams are essentially scheduling diagrams, not so different from this one:
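A toy scheduler makes the circuit-to-schedule step concrete: gates acting on disjoint qubits may start in parallel, while a gate must wait until all of its qubits are free. The gate durations below are invented round numbers, not real hardware values.

```python
# Turn an ordered gate list into a timed schedule (times in ns, invented).
DURATION = {"h": 20, "cphase": 40}

def schedule(gates):
    ready = {}   # qubit -> earliest time it is free again
    timed = []
    for name, qubits in gates:
        # A gate starts when all of its qubits are free
        start = max((ready.get(q, 0) for q in qubits), default=0)
        for q in qubits:
            ready[q] = start + DURATION[name]
        timed.append((start, name, qubits))
    return timed

circuit = [("h", [0]), ("h", [1]), ("cphase", [0, 1]), ("h", [2])]
print(schedule(circuit))
# → [(0, 'h', [0]), (0, 'h', [1]), (20, 'cphase', [0, 1]), (0, 'h', [2])]
```

The H gates on qubits 0, 1 and 2 all start at t=0; only the two-qubit cphase has to wait until both of its qubits are free at t=20. Real schedulers additionally have to respect hardware constraints such as shared control lines and fixed clock cycles.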

A bonus for reading the blog article this far: I’ll buy the first person who gives the name and composer of the piece of music that those lines are taken from in the comments a beer or equivalent drink of their choice. But back on topic: managing these diagrams will quickly become almost impossible. Compare this two-stave diagram (staves are the five lines that, together with the clef, represent the pitch) with the full score of a symphony and you get a sense of the problem. At that point, a computer could be used to verify the gate decomposition, and ways to benchmark parts of the algorithm and check the timings will have to be developed.

The gates that make up the universal set might have different fidelities. These fidelities are partially determined by the time the gates take compared to the coherence time of the qubits, so one wants to find a decomposition of the algorithm that takes the shortest possible time. However, it can often be advantageous to add additional gates: flipping qubits occasionally can filter out low-frequency noise. Finding the optimal way to run a quantum algorithm on a given system is a highly non-trivial task, but our quantum computer prototypes are still limited by short coherence times, so running algorithms in the optimal way will be essential to demonstrate computational advantages on a quantum computer.

The important take-away message is this: as we move to the stage of executing quantum algorithms, we need to create layers of abstraction in the way that they exist for classical computers. When we write down a classical computer algorithm, we do not write down the operations on the bit level. We program in high-level languages that get compiled into assembly code that is executed on the processor. Similarly, there will likely be several layers of compilation, from an abstract quantum circuit diagram to a scheduling diagram to the final concrete instructions that are sent from the experimentalist’s computer to the hardware.

### About Christian Dickel

Chris came to the Netherlands for the food and the weather but stayed for the quantum computer work. Apart from work he enjoys playing music with friends, ranting and soap-boxing.

Easy! It’s the 30th Goldberg Variation (Quodlibet) by Bach.

Very good. When are you back from this rather exotic timezone to collect?

I’ll be back from the beautiful New Zealand around the 15th of January!

Hi, I’m build quantum computer as freelancer. Achieve some progress did anyone know is there any way to change the binary code sequence in the microprocessor or the binary table within processor is write once read many unable to change at all.

I’m testing some changes, can anybody help out ?

Hi Heng! What kind of quantum computer do you build? I can try to ask our electronics guys in the lab if they know the answer to your problem, but I don’t understand it, yet. Would you want to change the instruction set on a microprocessor? We have used everything from arduinos over raspberry pis (or pies?) to FPGAs (and of course our desktop computers), so maybe we can help.
