In August the Google Quantum AI team posted a blog entry, "How to compare a noisy quantum processor to a classical computer," on measuring progress in building quantum computers.
So far quantum advantage (a term that has thankfully seemed to replace quantum supremacy) has focused on approximating quantum distributions, such as Boson Sampling or the random quantum circuits described in the blog post. These results are messy: it's hard to measure success, hard to pin down the classical complexity of the problems, hard to explain to the public, and the results seem to have little to no practical value.
The poster child for quantum computing has always been integer factoring, which has many attractive properties:
- Factoring is theoretically easy to solve on a quantum computer via Shor's algorithm.
- While we don't know the classical hardness of factoring, decades of considerable effort have yet to produce even a subexponential-time algorithm.
- It is classically easy to check the factors.
- It is classically easy to generate hard-to-factor numbers.
- You can explain factoring to anyone with even a moderate interest in mathematics.
- Factoring is critical to a number of cryptographic protocols, most notably RSA.
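The "easy to check" and "easy to generate" properties can be illustrated in a few lines. The sketch below uses a standard Miller–Rabin probabilistic primality test and demo-sized 256-bit primes (real RSA moduli are larger); the function names and sizes are illustrative choices, not part of any particular protocol.

```python
import random

def is_probable_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # definitely composite
    return True

def random_prime(bits):
    """Sample random odd candidates with the top bit set until one is prime."""
    while True:
        c = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(c):
            return c

# Generating a hard-to-factor number takes milliseconds...
p, q = random_prime(256), random_prime(256)
N = p * q

# ...and checking a claimed factorization is a single multiplication.
assert N == p * q
print(f"generated and verified a {N.bit_length()}-bit semiprime")
```

The asymmetry is the whole point: producing N and verifying a factorization are cheap, while recovering p and q from N alone is believed to be classically hard.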
This is a "hard" measure of progress in quantum computing. It can't be hyped! Do you think billions of dollars will pour into quantum computing if hard measures are used to judge potential and progress?
I agree.
"It doesn't mean factoring is the wrong measure of the success of physical quantum computers, we're just not ready to do so."
It is the right measure, and "21" reveals how much PR nonsense the quantum computing field is generating to get money thrown at it.
Maybe Regev's recent improvement of Shor's circuit could lead to progress? (arXiv: 2308.06572)
Regarding property 2 of factoring: subexponential integer factoring algorithms have been known for almost 50 years. See Crandall and Pomerance, Prime Numbers: A Computational Perspective, 2nd ed., 2005, chapters 6 and 7.
Quoting Crandall and Pomerance: "The quadratic sieve and number field sieve are direct descendants of the continued fraction factoring method of Brillhart and Morrison [1975], which was the first subexponential factoring algorithm on the scene."
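These algorithms' heuristic running times are usually written in L-notation, L_n[u, c] = exp((c + o(1)) (ln n)^u (ln ln n)^(1-u)): subexponential because u < 1, yet superpolynomial because u > 0. A rough sketch of what this means for concrete key sizes, using the textbook exponents for the quadratic sieve (u = 1/2, c = 1) and the general number field sieve (u = 1/3, c = (64/9)^(1/3)), with the o(1) terms and all constant factors ignored, so the numbers are only indicative:

```python
import math

def L(n_bits, u, c):
    """Heuristic L-notation cost exp(c * (ln n)^u * (ln ln n)^(1-u)),
    ignoring the o(1) term, for an integer n of the given bit length."""
    ln_n = n_bits * math.log(2)
    return math.exp(c * ln_n**u * math.log(ln_n)**(1 - u))

for bits in (512, 1024, 2048):
    qs = L(bits, 1 / 2, 1.0)               # quadratic sieve exponent
    nfs = L(bits, 1 / 3, (64 / 9) ** (1 / 3))  # number field sieve exponent
    print(f"{bits}-bit n: QS ~ 2^{math.log2(qs):.0f} ops, "
          f"NFS ~ 2^{math.log2(nfs):.0f} ops")
```

Even with these crude estimates, the gap between the two exponents shows why the number field sieve displaced the quadratic sieve for large moduli, and why both remain far from polynomial time.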
I wonder if busting RSA is as important as everyone claims?
My understanding is that there were quantum-safe encryption protocols even before Shor, that more have been designed since, and that an implementation of at least one is ready to use should a QC that can factor big numbers actually appear.
I suppose stored files that were encrypted with RSA or the like will be problematic once such a QC becomes available, but not new transactions. System managers will have to install the new software and re-encrypt their password files, and, as with every security update, some will be slow to do so, causing problems.
But, I submit, we need something other than breaking RSA to sell quantum computing...
https://metriq.info/ is a platform for tracking and sharing quantum technology benchmarks. Users can submit results showing the performance of different methods, on different platforms, against standard tasks.
Below the fault tolerance threshold of the surface code, the probability of an error in a logical operation scales as the probability of an error in a physical gate raised to the power of the square root of the number of physical qubits. Will improvements continue to be made to both the physical gates and the number of physical qubits? If so, the future looks bright.
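A back-of-the-envelope sketch of that scaling: take a distance-d surface-code patch with roughly d^2 physical qubits and a below-threshold logical error rate suppressed as (p/p_th)^((d+1)/2). The threshold value, the 0.1 prefactor, and the exact exponent here are illustrative assumptions, not measured numbers.

```python
P_TH = 1e-2  # assumed fault-tolerance threshold (illustrative)

def logical_error_rate(p_phys, n_qubits):
    """Heuristic surface-code model: code distance d ~ sqrt(#qubits),
    logical error rate ~ 0.1 * (p/p_th)^((d+1)/2) below threshold."""
    d = int(n_qubits ** 0.5)
    return 0.1 * (p_phys / P_TH) ** ((d + 1) / 2)

for n in (25, 100, 400):
    for p in (1e-3, 1e-4):
        print(f"n={n:4d} qubits, p={p:.0e}: p_L ~ {logical_error_rate(p, n):.1e}")
```

The table this prints makes the comment's point: improving either the physical gate fidelity or the qubit count suppresses the logical error rate, and improving both compounds exponentially.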
Using factoring as a progress indicator would lead to a graph of progress against time with sudden jumps in it. It would be like assessing the performance of your national soccer (a.k.a. football) team by looking at the stage at which they were knocked out of the most recent World Cup. This would be fine for someone with little interest in sport. However, if you were part of the coaching set-up, you might want to advocate for a Moneyball (i.e., data science) approach and dive into all the nitty-gritty details.
The power output of a fusion facility has never been higher than its total power input, and fusion scientists are subject to jokes about fusion always being thirty years away. There is, however, an extremely important property of a fusion experiment called the triple product that has improved dramatically over the years (see Figure 1 of https://www.fusionenergybase.com/article/measuring-progress-in-fusion-energy-the-triple-product ). By concentrating too much on big-picture indicators (that are almost like milestones), you can miss lots of genuine progress.
Just for wider context, atomic clocks are a form of quantum technology. From the 1950s to the present day, their accuracy has improved by about an order of magnitude per decade.