**Difficulty**

Perhaps you have become convinced that sharing quantum entanglement with a distant party is a useful resource. By itself, it might not allow you to communicate the weather to your grandmother but, if pure enough, and assisted by some classical communication, it does allow you to win funny card games or, perhaps more importantly, to transmit quantum information via teleportation. The question is: how do we manage to share quantum entanglement with a distant party in the first place? Here, I want to discuss some of the challenges of establishing long-distance entanglement, together with a very idealized solution.

Let us consider two distant parties, whom we call (surprise) Alice and Bob, connected via a quantum channel. A quantum channel is simply a channel that allows us to transmit quantum information. The typical example of a quantum channel connecting distant parties is an optical fibre cable. Hence, let us assume that Alice and Bob are connected via some long optical fibre cable. Since I am a theorist, let us also imagine that Alice and Bob have noise-free quantum memories available to them and, even more, that they can transfer qubits from their memories to the input of the channel and store incoming qubits into the memory without any error or decoherence.

An initial strategy for establishing entanglement between them could be as follows.

Alice prepares an entangled state locally between two qubits in her memory. Then she takes one of the two qubits, encodes it into a degree of freedom of a photon and sends the photon to Bob via the optical fibre cable. When Bob receives the photon he stores it in his quantum memory. Now Alice and Bob share entanglement. So, what is the catch? The catch is that Bob might never receive anything. The probability that Bob receives the photon decays exponentially with the length of the cable. More precisely, if $L$ is the length of the cable, the probability is:

$$P = 10^{-L/L_{\text{att}}},$$

where $L_{\text{att}}$ is a parameter, called the attenuation length, that depends on the type of optical fibre cable. A typical value for this parameter is 50 km.

Let us go over a couple of examples to understand the implications of this exponential decay. First, observe that every 50 km, the probability of getting a photon from Alice to Bob gets divided by 10. It is 0.1 after 50 km, 0.01 after 100 km, etc. For instance, if we try to connect Delft and Madrid, which are at an approximate distance of 1500 km, the probability of getting one photon to the other side would be $10^{-30}$.
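This exponential decay is easy to play with in a few lines of Python (a quick sketch: the formula and the 50 km attenuation length come from the text above; the function name is my own):

```python
# Probability that a photon survives a fibre of the given length,
# P = 10^(-L / L_att), with attenuation length L_att = 50 km as in the text.
def transmission_probability(length_km, attenuation_km=50.0):
    return 10 ** (-length_km / attenuation_km)

print(transmission_probability(50))    # -> 0.1
print(transmission_probability(100))   # -> 0.01
print(transmission_probability(1500))  # -> 1e-30, the Delft-Madrid link
```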

Of course, we can partially compensate for these photon losses by repeating the process many times. However, a quick calculation reveals that on average Alice and Bob need $1/P = 10^{L/L_{\text{att}}}$ attempts to transmit one photon. Hence, in the case of the Delft-Madrid link, Alice and Bob would need $10^{30}$ attempts on average. Let us transform this idea of repeating many times into a protocol for entanglement distribution that we call protocol A. In this protocol, Bob sends Alice a message indicating whether he received the photon or not. In the case where it did not arrive, Alice resets her memory and tries again. Since these messages cannot travel faster than light, each attempt costs at least one round trip of the fibre, roughly 0.015 seconds for 3000 km at the speed of light in fibre. In the Delft-Madrid example, Alice and Bob would therefore need to wait on the order of $10^{20}$ years on average to get their first entangled pair.
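The waiting time of protocol A can be sketched as follows (a back-of-the-envelope script; the 2×10⁸ m/s speed of light in fibre is my assumed value, everything else follows the text):

```python
# Expected waiting time of protocol A: each attempt costs a full round trip
# (photon to Bob plus his classical reply), and on average 1/P attempts.
C_FIBRE_KM_PER_S = 2e5  # assumed speed of light in fibre, about 2/3 of c

def protocol_a_wait_years(length_km, attenuation_km=50.0):
    p = 10 ** (-length_km / attenuation_km)        # success probability
    round_trip_s = 2 * length_km / C_FIBRE_KM_PER_S
    return round_trip_s / p / (365.25 * 24 * 3600)

print(f"{protocol_a_wait_years(1500):.1e} years")  # on the order of 1e20 years
```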

Protocol A can be improved by multiplexing. That is, Alice does not need to wait for Bob’s message to try again: she can generate a new pair as soon as the first photon is sent to Bob. But even if Alice can prepare entangled pairs (and store the corresponding qubits) at high rates, long-distance communications are still out of reach. Let us call the rate at which Alice prepares entangled pairs the repetition rate. For instance, if the repetition rate is 1 GHz, Alice and Bob would still only generate entangled pairs at a rate of roughly one pair every $10^{13}$ years. Still not very impressive. We call this second protocol protocol B.
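Protocol B's rate is simply the repetition rate scaled down by the transmission probability; a minimal sketch (the function name is mine, the 1 GHz default comes from the example above):

```python
# Multiplexed (protocol B) pair rate: attempts leave at the repetition rate
# and each one survives the fibre with probability 10^(-L / L_att).
def protocol_b_rate(length_km, rep_rate_hz=1e9, attenuation_km=50.0):
    return rep_rate_hz * 10 ** (-length_km / attenuation_km)

rate = protocol_b_rate(1500)                     # ~1e-21 pairs per second
years_per_pair = 1 / rate / (365.25 * 24 * 3600)
print(f"one pair every {years_per_pair:.1e} years")  # tens of trillions of years
```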

The solution to this problem is to place intermediate devices between Alice and Bob. We call them quantum repeaters. These devices basically exploit the teleportation trick (if you don’t remember how teleportation works you can check Jeremy’s post) to induce channels with larger attenuation lengths. Let us take a glance at how they work. Assume that we place a third party at half the distance between Alice and Bob. This party, whom we can call Charlie, also lives in the idealized world I described above, i.e. Charlie has a perfect quantum memory and can transfer qubits from his memory to the channel and vice versa without noise or losses.

So how do we benefit from the presence of Charlie?

Consider the following protocol. Simultaneously, Alice and Charlie on the one hand, and Charlie and Bob on the other, implement protocol B over an optical fibre cable of half the total distance.

The rate at which these protocols produce entangled pairs over half the distance is $10^{-L/(2L_{\text{att}})}$ times the repetition rate. The protocols in each link are asynchronous. This implies, for instance, that when the first entangled pair is ready at one of the links, say Alice-Charlie, the other will have nothing. Since we have assumed perfect memories, this is not a problem: the link Alice-Charlie keeps the pair stored and continues producing additional pairs. Once the other link, Charlie-Bob, produces its first pair, Charlie uses the entangled pair with Bob to teleport his half of the entangled pair with Alice. Alice and Bob end up with an entangled pair, as desired.

The rate at which this repeater protocol produces entangled pairs is equal to the rate at which the short links produce entangled pairs, i.e. $10^{-L/(2L_{\text{att}})}$ times the repetition rate. This rate is equivalent to the one that we would obtain if the attenuation length had doubled or if the length of the link had halved!
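The gain from a single midpoint repeater can be sketched numerically (assuming, as in the text, perfect memories, so the end-to-end rate equals the rate of one half-length link; function names are mine):

```python
def protocol_b_rate(length_km, rep_rate_hz=1e9, attenuation_km=50.0):
    # Direct multiplexed transmission (protocol B) over the full fibre.
    return rep_rate_hz * 10 ** (-length_km / attenuation_km)

def one_repeater_rate(length_km, rep_rate_hz=1e9, attenuation_km=50.0):
    # With Charlie at the midpoint and perfect memories, the end-to-end
    # rate is the rate of a single half-length link.
    return protocol_b_rate(length_km / 2, rep_rate_hz, attenuation_km)

print(protocol_b_rate(1500))    # ~1e-21 pairs/s without a repeater
print(one_repeater_rate(1500))  # ~1e-06 pairs/s with Charlie in the middle
```

A single ideal repeater thus buys fifteen orders of magnitude on the Delft-Madrid link.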

This idea can obviously be generalized to a larger number of intermediate stations. We can place repeaters between Alice and Charlie and between Charlie and Bob. In this idealized world, combining multiplexing (protocol B) with arbitrarily many quantum repeaters between Alice and Bob, it is possible to completely eliminate the problem of losses in optical fibre.
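In the same idealized spirit, evenly spaced repeaters simply shorten the segment each photon has to survive (a loss-only sketch; the function name and the chosen segment counts are mine):

```python
def ideal_repeater_rate(length_km, n_segments, rep_rate_hz=1e9,
                        attenuation_km=50.0):
    # n_segments - 1 evenly spaced repeaters cut the fibre into n_segments
    # pieces; ideally the end-to-end rate equals the rate of one piece.
    return rep_rate_hz * 10 ** (-(length_km / n_segments) / attenuation_km)

for n in (1, 2, 4, 30):
    print(f"{n - 1:2d} repeaters: {ideal_repeater_rate(1500, n):.1e} pairs/s")
```

With 29 repeaters, one every 50 km, the Delft-Madrid rate climbs to about $10^{8}$ pairs per second in this loss-only model.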

You can imagine that we have ignored many parameters that render implementing quantum repeaters extremely challenging. What are the most important ones? To name a few: quantum memories can at present store only a limited number of qubits, hence multiplexing cannot be fully exploited. Moreover, memories do decohere and the entangled pairs produced are noisy, which means that entanglement needs to be purified. There are also additional sources of losses; notably, the efficiencies for storing incoming qubits or for transferring them into the quantum channel are far from perfect. This implies that the entanglement distribution rate increases with the number of repeaters only up to a certain point, beyond which adding more repeaters decreases the rate! But, if it were easy, it would not be fun, right?

David Elkouss is an Assistant Professor at QuTech. He is developing tools, such as entanglement purification protocols, novel error correction codes, quantum network protocol benchmarks or quantum network simulators that will enable the implementation of quantum networks. ~~In his free time,~~ he has a small daughter.