22.02.2018

Looking back at the DiVincenzo criteria


by David DiVincenzo

The first time that I heard that there were “DiVincenzo criteria” was when Richard Hughes of Los Alamos contacted me in the fall of 2001, telling me that ARDA (predecessor of IARPA – a funding agency of the US intelligence services) had commissioned him to form a roadmap committee to forecast the future of quantum information technology [1]. Before that, I just thought of them as a list that I showed in various talks and wrote down in a few essays. So the fact that they have become a “thing” is basically because some government bureaucrats found them a handy way to draw up metrics for the progress of their quantum computing programs.

The real origin of the “criteria” I can trace to an event seven years earlier: the Villa Gualino meeting in the fall of 1994 (see the conference photo). This group is a pretty large fraction of all the people in the world who were interested in quantum computing in 1994. Two of those in the photo came especially that day to tell us about their latest work: Peter Zoller (just behind Asher Peres) and Ignacio Cirac (mostly obscured by Peter Shor) [2].

Participants of the Villa Gualino meeting, 1994. Do you recognise everyone? See the end of this blogpost for the names.

In the hour before, Cirac and Zoller had given a presentation of their scheme for quantum computation with ion traps, a direct response, as I understand it, to ideas about quantum algorithms that they had learned from Artur Ekert (at the left of the photo) at a big international conference earlier that year. These ideas were far ahead of anything that the rest of us had conceived of; they painted a thrilling picture. My response was that their ideas, particular to their atomic physics/quantum optics setting, were an instance of what could be a more general paradigm, one that could work in many other fields of quantum physics.

When I began writing down the “criteria” in the next few years, several other things were going on in the field. First, papers began to appear in various specialties that paid lip service to the glory of quantum computing and then presented some small result (often an experiment) that represented at most only the tiniest part of the picture. For those authors, I aimed the criteria at shaking them up, and often scaring them off – don’t pretend that your impressive little experiment is actually a big step towards the physical realization of a quantum computer! Second, a big new paradigm was developed, NMR quantum computing. I felt that this was much more nearly on target, but that it would not ultimately be scalable because of its reliance on natural molecules and, more fundamentally, its scheme for holding quantum information in highly mixed states. Thus, especially in my “canonical” version of the criteria [3], there are warning phrases here and there about “scalable” and “fiducial pure state” that were pointed at the NMR crowd.

I think that I was perhaps not completely right about the fundamental possibilities of scaling mixed-state QC; but I certainly felt that there were approaches that could be developed that would more directly get us on the right track. Coming from solid state physics, I heartily endorsed [4] superconducting and quantum-dot approaches, and it was very pleasing for me to watch over the ensuing decade as milestones directly related to the “criteria” were achieved in these systems. The box shows the seven criteria as applied to spin qubits.

Where are we today with the “criteria”? I have heard various statements, first from the ion-trap community and then from those working with the most advanced solid-state techniques, that “all criteria are now satisfied”. And still, we don’t seem to have a scalable quantum computer yet. Were the “criteria” then a failure? Partially, yes; but Richard Hughes knew this would be true in 2001. In the QIST roadmaps they were referred to as “promise criteria”, meaning that achieving them all would bring us to a highly promising situation for going on further towards the technology. The “promise criteria” only get us to components that work the way they need to, in such a way that they could potentially be stuck together to form a system. But, as we see pretty clearly now, building the system is itself a monumental task. The guides for these steps are quite well laid out in the seven “complexity steps” of Devoret and Schoelkopf [5], and I think that these will be the more pertinent guiding principles for the years ahead.

Participants of the Villa Gualino meeting, 1994, with names.


About the Author:
David DiVincenzo

David DiVincenzo is a theoretical physicist, still working on the physical implementation of quantum computers. Definitely looking 25 years older.

References

[1] The “QIST” study continued for quite some years after that, but came to an end around 2009. The roadmap reports can still be found here.

[2] Cirac has told me that he thinks that he was sort of “hiding” in the photo because he found some of the crowd a bit weird (particularly Tommaso Toffoli, who didn’t manage to be in the photo).

[3] The Physical Implementation of Quantum Computation, D.P. DiVincenzo, Fortschr. Phys. 48 (2000)

[4] Quantum Computation, D.P. DiVincenzo, Science 270 (1995), 5234

[5] Superconducting Circuits for Quantum Information: An Outlook, M.H. Devoret and R.J. Schoelkopf, Science 339 (2013), 6124

[6] Strong spin-photon coupling in silicon, N. Samkharadze, G. Zheng, N. Kalhor, D. Brousse, A. Sammak, U. C. Mendes, A. Blais, G. Scappucci and L. M. K. Vandersypen, Science doi: 10.1126/science.aar4054 (2018)

[7] A coherent spin–photon interface in silicon, X. Mi, M. Benito, S. Putz, D. M. Zajac, J. M. Taylor, Guido Burkard and J. R. Petta, Nature doi:10.1038/nature25769 (2018)
