Thursday, October 12, 2023

Measuring Quantum Progress

In August the Google Quantum AI team posted a blog post, "How to compare a noisy quantum processor to a classical computer," on how to measure progress in building quantum computers.

So far, demonstrations of quantum advantage (a term that has thankfully seemed to replace quantum supremacy) have focused on sampling from quantum distributions, as in Boson Sampling or the random quantum circuits described in the blog post. These results are messy: it's hard to measure success, hard to pin down the classical complexity of the problems, hard to explain to the public, and they seem to have little to no practical value.

The poster child for quantum computing has always been integer factoring. Factoring has lots of great properties.

  1. Factoring is theoretically easy to solve on a quantum computer via Shor's algorithm.
  2. While we don't know the classical hardness of factoring, considerable efforts over decades have yet to produce any even subexponential-time algorithms.
  3. It is classically easy to check the factors.
  4. It is classically easy to generate hard-to-factor numbers (a quick sketch of properties 3 and 4 follows this list).
  5. You can explain factoring to anyone with even a moderate interest in mathematics.
  6. Factoring is critical to a number of cryptographic protocols most notably RSA.
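
To see how easy properties 3 and 4 are in practice, here is a minimal Python sketch (assuming the sympy library is available for drawing random primes; any primality routine would do). Checking a claimed factorization is a single multiplication, and generating a hard instance is just multiplying two large random primes, as in RSA key generation.

    import sympy  # assumed available; used only to draw random primes

    def generate_hard_instance(bits=1024):
        """Property 4: multiply two random primes of similar size, as in RSA key generation."""
        p = sympy.randprime(2**(bits - 1), 2**bits)
        q = sympy.randprime(2**(bits - 1), 2**bits)
        return p * q, (p, q)

    def check_factors(n, factors):
        """Property 3: verifying a claimed factorization is a cheap classical computation."""
        product = 1
        for f in factors:
            product *= f
        return product == n and all(f > 1 for f in factors)

    N, (p, q) = generate_hard_instance(bits=256)  # small size so the demo runs quickly
    assert check_factors(N, [p, q])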

We will achieve true quantum advantage when a quantum machine can factor numbers that we cannot factor on classical computers alone.

So why don't we measure progress towards quantum advantage by the size of the numbers that quantum machines can factor? More precisely, factoring via Shor's algorithm for order finding, followed by some classical computation to recover the factors.

Likely because we don't do very well. As far as I can tell, the largest number factored on a quantum computer via Shor's algorithm is 21. Shor's algorithm simply requires a level of entanglement beyond what today's quantum machines can handle, even with error-correction techniques.
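
To make the division of labor concrete, here is a rough Python sketch of that reduction. The quantum machine is only needed for the order-finding subroutine, which is simulated by brute force below; everything around it is classical number theory.

    import math
    import random

    def find_order(a, n):
        """Stand-in for the quantum subroutine: the smallest r with a**r = 1 (mod n).
        Brute force here; this is exactly the step a quantum computer would speed up."""
        r, x = 1, a % n
        while x != 1:
            x = (x * a) % n
            r += 1
        return r

    def shor_factor(n, max_tries=20):
        """Classical wrapper that turns an order-finding oracle into a factoring algorithm."""
        for _ in range(max_tries):
            a = random.randrange(2, n)
            g = math.gcd(a, n)
            if g > 1:
                return g, n // g      # the random base already shares a factor with n
            r = find_order(a, n)
            if r % 2 == 0:
                y = pow(a, r // 2, n)
                if y != n - 1:        # skip the trivial case a**(r/2) = -1 (mod n)
                    f = math.gcd(y - 1, n)
                    if 1 < f < n:
                        return f, n // f
        return None

    print(shor_factor(21))  # e.g. (3, 7)

Factoring 21 this way is of course trivial classically; the entire difficulty is running find_order on numbers with thousands of bits on real quantum hardware.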

It doesn't mean factoring is the wrong measure of the success of physical quantum computers; we're just not ready to use it yet.

5 comments:

  1. This is a "hard" measure of progress in quantum computing. It can't be hyped! Do you think billions of dollars will pour into quantum computing if hard measures are used to judge potential and progress?

    Replies
    1. I agree.

      "It doesn't mean factoring is the wrong measure of the success of physical quantum computers, we're just not ready to do so."

      It is the right measure and "21" reveals how much PR nonsense quantum is doing to get money thrown at it.

  2. Maybe Regev's recent improvement of Shor's circuit could lead to progress? (arXiv: 2308.06572)

  3. Regarding property 2 of factoring: Subexponential integer factoring algorithms have been known for almost 50 years. See Crandall and Pomerance, Prime Numbers: A Computational Perspective (2nd ed., 2005), chapters 6 and 7.

    Quoting Crandall and Pomerance: "The quadratic sieve and number field sieve are direct descendants of the continued fraction factoring method of Brillhart and Morrison [1975], which was the first subexponential factoring algorithm on the scene."
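
    To make "subexponential" concrete, these algorithms have heuristic running times usually written in L-notation,

      L_N[\alpha, c] = \exp\left( (c + o(1)) (\ln N)^{\alpha} (\ln \ln N)^{1-\alpha} \right),

    with the quadratic sieve at roughly L_N[1/2, 1] and the number field sieve at L_N[1/3, (64/9)^{1/3}]: subexponential, but still far from polynomial.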

  4. I wonder if busting RSA is as important as everyone claims?

    My understanding is that there were quantum-safe encryption protocols even before Shor, that more have been designed since, and that there is an implementation of at least one ready to use should a QC that can factor big numbers actually appear.

    I suppose stored files that were encrypted with RSA or the like will be problematic once such a QC becomes available, but not new transactions. System managers will have to install the new software and re-encrypt their password files, and, as with every security update, some will be slow to do so, causing problems.

    But, I submit, we need something other than breaking RSA to sell quantum computing...
