Quantum Machine Learning: Introduction to Quantum Computation

In this guide we discuss several paradigms for quantum computing: gate-model quantum computing, adiabatic quantum computing, and quantum annealing.

By Peter Foy

In this guide we're going to discuss several paradigms for quantum computation, in particular:

  1. Gate-Model Quantum Computing
  2. Adiabatic Quantum Computing
  3. Quantum Annealing
  4. Implementation of Quantum Architectures
  5. Variational Circuits & the Quantum Approximate Optimization Algorithm
  6. Summary of Quantum Computation

This guide is based on this course on Quantum Machine Learning from the University of Toronto; if you want to read more about the subject, check out our other guides based on the same course.

IBM’s Q System One | Credit: IBM

1. Gate-Model Quantum Computing

Gate-model quantum computing is also called the universal quantum computing model.

Most of the academic and commercial efforts in quantum computing today focus on this model, including Google, Rigetti, IBM Q, and many others.

As we'll see, the software stack to create an algorithm for a quantum computer is different from what we're used to classically. But first, we need to understand how the gates of quantum computers work.

The diagram below shows how we can go from a problem we want to solve to computing it with a quantum processor or simulator:

Let's say we want to solve the travelling salesman problem, which asks the question:

Given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city and returns to the origin city?

This is an NP-hard problem in combinatorial optimization, and we want to find an appropriate quantum optimization algorithm to solve it.

The algorithm we'll discuss later is called QAOA: the Quantum Approximate Optimization Algorithm.

The quantum algorithm is then expressed as a quantum circuit, which is composed of gates, i.e. unitary operations.

Below this level, the actual compilation happens in the quantum compiler.

Once the compilation is finished you can either execute the program on a quantum processor (QPU) or on a quantum simulator.

There is a theorem called the Solovay-Kitaev theorem, which says that you can approximate any unitary operation by a finite set of gates.

This theorem has been described as:

One of the most important fundamental results in the field of quantum computation.

This is what makes the gate model of a quantum computer universal: it can take any unitary operation and decompose it into elementary gates.

So a quantum computer is universal in the sense that it can take any quantum state and transform it into any other quantum state.
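
As a small sketch of this idea (plain NumPy rather than any particular quantum SDK), we can multiply together a short sequence of elementary gates, such as the Hadamard and T gates, and check that the result is still a unitary, i.e. a valid transformation of quantum states:

```python
import numpy as np

# Elementary single-qubit gates as 2x2 matrices.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)      # Hadamard
T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])              # T gate

# Any sequence of gates multiplies into a single operator U...
U = H @ T @ H @ T @ H

# ...and a product of unitaries is again unitary: U†U = I.
print(np.allclose(U.conj().T @ U, np.eye(2)))     # True

# A unitary maps one valid quantum state to another (it preserves the norm).
psi = U @ np.array([1, 0], dtype=complex)         # apply U to |0>
print(np.linalg.norm(psi))                        # 1.0
```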

Let's look at circuits in more detail.

Quantum Circuits

Quantum circuits are composed of three things:

  • Qubit registers
  • Gates acting on them
  • Measurements on the registers

Qubit registers are indexed from 0, so we say qubit 0, qubit 1, etc.

This is not to be confused with the state of the qubit: qubit 1 can be in the state \(|0\rangle\), for example.
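
Here is a minimal sketch of these three ingredients (a plain NumPy state vector, using the convention that qubit 0 is the left tensor factor): a two-qubit register, a gate applied to qubit 0, and a "measurement" read off as probabilities.

```python
import numpy as np

# A 2-qubit register starts in |00>: a length-4 state vector.
state = np.zeros(4, dtype=complex)
state[0] = 1.0                                    # amplitude of |00>

X = np.array([[0, 1], [1, 0]], dtype=complex)     # Pauli-X (NOT) gate
I = np.eye(2, dtype=complex)

# Apply X to qubit 0 only (qubit 0 is the left factor in this convention).
state = np.kron(X, I) @ state

# Measurement: each basis state is observed with probability |amplitude|^2.
for bits, amp in zip(["00", "01", "10", "11"], state):
    print(f"|{bits}>: {abs(amp) ** 2:.2f}")       # all probability on |10>:
                                                  # qubit 0 is |1>, qubit 1 is still |0>
```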

Quantum Gates

Let's move on to gates. In a classical computer the processor transforms bit strings with logical gates.

Any transformation of bit strings can be achieved with just two types of logic gates (for example, AND and NOT), making universal classical computation possible with this small set.

The same is also true for a quantum computer: any unitary operation can be decomposed into gates.

The difference with quantum computation is that three types of gates are sufficient, for example the Hadamard, T, and CNOT gates (even though there is a continuum of possible single-qubit operations, since a qubit state can be any point on the surface of the Bloch sphere).

A few of the most common unitary gates include:

  • X gate - the Pauli-X or NOT gate has a matrix of \(\begin{bmatrix}0 & 1\\ 1& 0\end{bmatrix}\)
  • Z gate - the Pauli-Z gate has a matrix of \(\begin{bmatrix}1 & 0\\ 0& -1\end{bmatrix}\)
  • H gate - the Hadamard gate has a matrix of \(\frac{1}{\sqrt{2}}\begin{bmatrix}1 & 1\\ 1& -1\end{bmatrix}\)

To see more examples of quantum gates, check out the Quantum logic gate Wikipedia page.
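
To make these matrices concrete, here is a short NumPy sketch that applies each gate to a basis state and checks that all three are unitary:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)                 # Pauli-X / NOT
Z = np.array([[1, 0], [0, -1]], dtype=complex)                # Pauli-Z
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

print(X @ ket0)   # [0, 1]         -> X flips |0> into |1>
print(Z @ ket1)   # [0, -1]        -> Z flips the sign (phase) of |1>
print(H @ ket0)   # [0.707, 0.707] -> an equal superposition of |0> and |1>

# Every quantum gate is unitary: G†G = I.
for G in (X, Z, H):
    assert np.allclose(G.conj().T @ G, np.eye(2))
```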

The quantum computers that we have today can only execute gate sequences with ~10-20 gates, but ideally, we want to scale this up so we can execute longer algorithms.


2. Adiabatic Quantum Computing

We now know that the gate model quantum computer is a universal way of transforming quantum states into other quantum states.

Let's now look at a different model called adiabatic quantum computing (AQC).

Adiabatic quantum computers can also achieve universal quantum computation, and they rely on the adiabatic theorem to do calculations.

Adiabatic quantum computers are regarded as a subclass of quantum annealing (which we'll discuss next), but there are subtle differences between the two.

The Hamiltonian & Schrödinger's Equation

To understand adiabatic quantum computing we need to introduce the Hamiltonian, which is an object describing the energy of a classical or quantum system.

The Hamiltonian also provides a description of a system evolving with time.

Formally, we can express this with the Schrödinger equation:

$$
i\hbar {\frac {d}{dt}}|\psi(t)\rangle = H|\psi(t)\rangle,
$$

where \(\hbar\) is the reduced Planck constant.

As discussed earlier, a quantum state evolves under a unitary operator, and this is what we get if we solve the Schrödinger equation for a time-independent Hamiltonian over some time \(t\): \(U = \exp(-i Ht/\hbar)\).
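
As a quick numerical sanity check (a NumPy/SciPy sketch with a toy Hamiltonian chosen just for this example, and \(\hbar = 1\)), we can exponentiate a Hamiltonian and verify that the resulting evolution operator is unitary, so it preserves total probability:

```python
import numpy as np
from scipy.linalg import expm

# A toy single-qubit Hamiltonian (any Hermitian matrix will do), with hbar = 1.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

t = 0.3
U = expm(-1j * H * t)                              # U = exp(-i H t / hbar)

print(np.allclose(U.conj().T @ U, np.eye(2)))      # True: time evolution is unitary

psi_t = U @ np.array([1, 0], dtype=complex)        # evolve |0> for time t
print(np.linalg.norm(psi_t))                       # 1.0: total probability preserved
```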

The Adiabatic Theorem

An adiabatic process is when gradually changing conditions allow a system to adapt to its new configuration:

Gradually changing conditions allow the system to adapt its configuration, hence the probability density is modified by the process. If the system starts in an eigenstate of the initial Hamiltonian, it will end in the corresponding eigenstate of the final Hamiltonian.

In simpler terms, in an adiabatic process the conditions change slowly enough for the system to adapt to a new configuration.

For example, we can change a Hamiltonian \(H_0\) into another Hamiltonian \(H_1\); the simplest way to interpolate between them is linearly, with \(t\) running from 0 to 1:

$$
H(t) = (1-t) H_0 + t H_1,
$$

Since this Hamiltonian depends on time, the Schrödinger equation becomes significantly more complicated to solve.

But if, as the adiabatic theorem requires, the change occurs slowly enough, the system stays in its ground state: we can start in the easy-to-prepare ground state of a simple Hamiltonian and evolve adiabatically towards the complicated Hamiltonian whose ground state encodes the solution.
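
To get a feel for this, here is a small NumPy sketch (with illustrative 2×2 Hamiltonians chosen just for this example) that follows the linear schedule and tracks the ground-state energy and the gap above it; the size of that gap is what determines how slowly the change has to happen:

```python
import numpy as np

# A simple starting Hamiltonian H0, whose ground state is easy to prepare,
# and a "problem" Hamiltonian H1 whose ground state we actually want.
H0 = -np.array([[0, 1], [1, 0]], dtype=float)   # -X: ground state (|0> + |1>)/sqrt(2)
H1 = -np.array([[1, 0], [0, -1]], dtype=float)  # -Z: ground state |0>

for t in np.linspace(0, 1, 6):
    Ht = (1 - t) * H0 + t * H1                  # H(t) = (1 - t) H0 + t H1
    energies = np.linalg.eigvalsh(Ht)           # eigenvalues in ascending order
    gap = energies[1] - energies[0]             # gap above the ground state
    print(f"t = {t:.1f}   ground energy = {energies[0]:+.3f}   gap = {gap:.3f}")
```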

As Wikipedia describes:

First, a (potentially complicated) Hamiltonian is found whose ground state describes the solution to the problem of interest. Next, a system with a simple Hamiltonian is prepared and initialized to the ground state. Finally, the simple Hamiltonian is adiabatically evolved to the desired complicated Hamiltonian.

Adiabatic quantum computers use this phenomenon to perform universal calculations.

This paper showed that adiabatic quantum computation is polynomially equivalent to the conventional gate model of quantum computing.

Adiabatic quantum computing is somewhat idealized, so let's now move on to a less idealized version: quantum annealing.

3. Quantum Annealing

Quantum annealing is the second most common paradigm for quantum computation.

Unlike the gate model and adiabatic quantum computing, quantum annealing solves a more specific problem, and universality is not a requirement.

One aspect of adiabatic quantum computing that makes it somewhat idealized is that calculating the speed limit for change is not trivial, and can in many cases be more challenging than solving the original problem.

Instead of calculating and respecting this speed limit, quantum annealing simply repeats the transition (the anneal) over and over again.

After it has collected a number of samples, we can use the spin configuration with the lowest energy as the solution, although there is no guarantee that this is the ground state.

As we can see from the diagram below, quantum annealing also has a different software stack than gate-model quantum computers:

The difference is that instead of using a quantum circuit, we encode the problem into the classical Ising model.

Quantum annealers also suffer from limited connectivity, so next we have to find a graph minor embedding, which combines several physical qubits into a single logical qubit.

Finding a minor embedding is itself an NP-hard problem, so most quantum annealers use probabilistic heuristics to find one.

Once we do that, quantum annealing provides an interesting approach to solving the optimization problem.
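
As a toy illustration of what "encoding the problem in the classical Ising model" means, here is a sketch with made-up couplings \(J\) and fields \(h\), where brute-force enumeration stands in for the annealer's samples and we keep the lowest-energy spin configuration:

```python
import itertools

# Ising energy: E(s) = sum_ij J_ij * s_i * s_j + sum_i h_i * s_i, with s_i in {-1, +1}.
# The couplings and fields below are made-up values for illustration only.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): -0.5}
h = {0: 0.1, 1: -0.2, 2: 0.0}

def energy(spins):
    e = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    e += sum(hi * spins[i] for i, hi in h.items())
    return e

# A quantum annealer would return many samples; here we enumerate all 2^3
# configurations and keep the lowest-energy one as the proposed solution.
best = min(itertools.product([-1, +1], repeat=3), key=energy)
print(best, energy(best))
```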

The most prominent company focused on quantum annealing is D-Wave Systems, which holds the record for the number of qubits in a superconducting quantum annealer: 2048.

4. Implementation of Quantum Architectures

Now that we've looked at a few of the common paradigms, let's discuss a few ways that these quantum computers are actually built.

Superconducting Architectures

The most popular approach is superconducting architectures, which are built with silicon-based fabrication technology.

This is very useful because you can create them with the same facilities that you would use for building a digital circuit for classical computers.

The main difference is that you cool the environment down to 10 millikelvin and then use microwave pulses to control the gates and the interaction between qubits.

One of the issues with silicon-based technology is that it is fundamentally two-dimensional.

This means that the moment we get to five qubits, we can no longer have full connectivity between every pair of qubits, since the complete graph on five nodes cannot be laid out in a plane without crossings.

This is where the quantum compiler plays an important role.

The compiler inserts SWAP gates, which exchange the states of two qubits, so that qubits that need to interact can be brought next to each other.
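
As a small check of this in NumPy (again using the convention that qubit 0 is the left tensor factor), a SWAP gate can be decomposed into three CNOTs, which is one reason limited connectivity costs extra gates:

```python
import numpy as np

# SWAP exchanges the basis states |01> and |10>.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]])

# CNOT with qubit 0 as control and qubit 1 as target, and the reverse.
CNOT01 = np.array([[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [0, 0, 0, 1],
                   [0, 0, 1, 0]])
CNOT10 = np.array([[1, 0, 0, 0],
                   [0, 0, 0, 1],
                   [0, 0, 1, 0],
                   [0, 1, 0, 0]])

# SWAP = CNOT(0,1) · CNOT(1,0) · CNOT(0,1)
print(np.allclose(CNOT01 @ CNOT10 @ CNOT01, SWAP))   # True
```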

Trapped Ions

Another way we can implement quantum computation is with trapped ions.

We take individual ions, which are charged atomic particles, and line them up and trap them in an electromagnetic field.

We then use laser pulses to control their interaction.

One of the advantages of this is that they have a long coherence time.

One of the disadvantages is that the speed of control is much slower than superconducting architectures.

Scalability of the trapped ion implementation, however, is still unclear.

We currently have systems with around 70 qubits, but it remains to be seen whether this approach can scale to thousands of qubits.

Photonic Systems

Finally, we have photonic systems which use light.

For example, you can use the polarization of light to encode qubit states: left-polarized light, right-polarized light, and superpositions of the two.

The advantage of photonic systems is that they can operate at room temperature.

One of the disadvantages is that photonic circuits suffer from photon loss, and the information carried by a lost photon cannot be recovered.

It's still unclear which one of the architectures will lead to large-scale quantum computing.

5. Variational Circuits & the Quantum Approximate Optimization Algorithm

We've now discussed gate-model quantum computers, quantum annealers, and how to implement them.

We also discussed how current quantum computers have imperfections that prevent us from running many of the most famous quantum algorithms.

Variational Circuits

In the last few years, however, we have seen a new breed of quantum algorithms emerge: variational circuits.

Variational circuits are designed specifically for our current and near-term quantum computers, which are noisy and imperfect.

As discussed in our article on Quantum Systems, current architectures are always interacting with an environment, and this causes things like decoherence.

What we do with variational circuits is run a short sequence of calculations on the Quantum Processing Unit (QPU) and then extract the results to a CPU.

The quantum circuit is parameterized, which allows us to adjust the parameters on the classical side and send the updated circuit back to the quantum processor for another short calculation.

This becomes an iterative loop between QPU and CPU that improves over time.
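
Here is a minimal sketch of that loop, with a classically simulated single-qubit "QPU" and a simple parameter sweep standing in for the classical optimizer; on real hardware the expectation value would come from repeatedly running and measuring the parameterized circuit:

```python
import numpy as np

def qpu_expectation(theta):
    """Stand-in for the QPU: run a short parameterized circuit Ry(theta)|0>
    and return the measured expectation value <Z>."""
    Ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                   [np.sin(theta / 2),  np.cos(theta / 2)]])
    psi = Ry @ np.array([1, 0], dtype=complex)
    Z = np.array([[1, 0], [0, -1]])
    return float(np.real(psi.conj() @ Z @ psi))

# The CPU side: adjust the parameter, re-run the short quantum calculation,
# and keep the best value found (a crude stand-in for a real optimizer).
thetas = np.linspace(0, 2 * np.pi, 50)
values = [qpu_expectation(t) for t in thetas]
best = thetas[int(np.argmin(values))]
print(f"best theta ~ {best:.2f}, <Z> ~ {min(values):.3f}")   # minimum near theta = pi
```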

The Quantum Approximate Optimization Algorithm

One of the most famous variational circuits is the Quantum Approximate Optimization Algorithm (QAOA), which draws inspiration from quantum annealing.

QAOA tries to approximate the adiabatic pathway, but on a gate-model quantum computer.

Recall that we start in the ground state of some simple Hamiltonian, then follow the adiabatic pathway to end up in the ground state of the system we're interested in.

What we do with QAOA is break the pathway up into discrete steps and parameterize the circuit; the more steps and parameters we use, the more accurately we approximate the transition.

The transition we want to approximate is evolution under a time-dependent Hamiltonian, and this discretized approximation is known as Trotterization.

In the end, you just read out the approximate ground state, the same way you would with a quantum annealer.

So with QAOA we turn finding the ground state of the system we're interested in into a classical optimization problem over the circuit parameters.
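
As a self-contained sketch (NumPy/SciPy with matrix exponentials, not any particular QAOA library), here is a depth-one QAOA for MaxCut on a single edge between two qubits: the cost Hamiltonian rewards the qubits disagreeing, the mixer drives transitions, and a crude grid search plays the role of the classical optimizer:

```python
import numpy as np
from scipy.linalg import expm

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)
Hgate = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

C = 0.5 * (np.eye(4) - np.kron(Z, Z))      # cost: 1 when the two qubits differ
B = np.kron(X, I) + np.kron(I, X)          # mixer Hamiltonian

def expected_cost(gamma, beta):
    plus = np.kron(Hgate, Hgate) @ np.array([1, 0, 0, 0], dtype=complex)   # |++>
    psi = expm(-1j * beta * B) @ expm(-1j * gamma * C) @ plus              # one QAOA layer
    return float(np.real(psi.conj() @ C @ psi))

# Classical outer loop: grid search over the two circuit parameters.
grid = np.linspace(0, np.pi, 40)
gamma, beta = max(((g, b) for g in grid for b in grid),
                  key=lambda p: expected_cost(*p))
print(f"gamma={gamma:.2f}, beta={beta:.2f}, <C>={expected_cost(gamma, beta):.3f}")
# <C> comes out close to 1, the maximum cut value for a single edge.
```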

If you want to learn more about QAOA check out pyQAOA from Rigetti, which is:

a Python module for running the Quantum Approximate Optimization Algorithm on an instance of a quantum abstract machine.

6. Summary of Quantum Computation

In this guide, we discussed several paradigms for quantum computation, including:

  • Gate-Model Quantum Computing
  • Adiabatic Quantum Computing
  • Quantum Annealing

Most academic and commercial efforts focus on the quantum gate model, but quantum annealing also plays an important role in industry today.

We then looked at the physical implementation of quantum computation. It is still unclear which architecture will prove scalable, but a few notable implementations include:

  • Superconducting architectures
  • Trapped ions
  • Photonic systems

Finally, we looked at a family of algorithms called variational circuits, which are designed for the imperfect and noisy quantum computers that we have today.

Variational circuits are classical-quantum hybrid algorithms, as they create an iterative loop between QPUs and CPUs.

One of the most famous variational circuits is the Quantum Approximate Optimization Algorithm, which draws inspiration from quantum annealing.

In the next article in this series on quantum machine learning, we're going to dive into more detail about classical-quantum hybrid algorithms.
