Tuesday, August 08, 2006

Confessions of a Quantum Computing Skeptic

Will we ever have useful quantum computers? Despite the "breakthroughs" we seem to have nearly every month, we are a long way from controlling even a handful of quantum bits, let alone the tens of thousands of qubits one needs for any meaningful computation.

But I'm not a physicist or an engineer, so suppose we can overcome these obstacles and actually build a working machine. Then I can imagine the following conversation in 2025:
Quantum People: We now have a working quantum computer.
Public: Yes after 30 years and 50 billion dollars in total funding. What can it do?
Q: It can simulate quantum systems.
P: I'm happy for you. What can it do for the rest of us?
Q: It can factor numbers quickly.
P: Yes, I know. We've had to change all of our security protocols because of your machine. Does factoring have any purpose other than to break codes?
Q: Not really. But we can use Grover's algorithm for general search problems.
P: That sounds great! So we can really solve traveling salesperson and scheduling problems much faster than before?
Q: Not exactly. The quadratic speed-up we get from Grover's algorithm isn't enough to offset the slow-down of running a quantum computer with error correction. But we can solve Pell's equation, approximate the Jones polynomial and a few other things very quickly.
P: Are those algorithms particularly useful?
Q: No.
P: Then why did you build a quantum computer?
Q: Because we could.
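
The Grover objection in the dialogue can be made concrete with a back-of-the-envelope sketch: a quadratic query advantage, multiplied by a large constant-factor slow-down from fault-tolerant operation, pushes the break-even problem size out enormously. The overhead factor of 10^6 below is an invented illustration, not a measured number.

```python
import math

def classical_queries(n):
    """Expected oracle queries for classical unstructured search over n items."""
    return n / 2

def grover_queries(n, overhead=1e6):
    """Grover's ~(pi/4)*sqrt(n) queries, inflated by an assumed per-query
    fault-tolerance overhead (the 1e6 default is an illustrative guess)."""
    return overhead * (math.pi / 4) * math.sqrt(n)

def crossover(overhead=1e6):
    """Smallest power of two at which Grover-with-overhead beats classical search."""
    n = 2
    while grover_queries(n, overhead) >= classical_queries(n):
        n *= 2
    return n

# With no overhead Grover wins at tiny sizes; with a large assumed
# overhead the break-even point is in the trillions of items.
print(crossover(overhead=1), crossover(overhead=1e6))
```

Under these made-up constants the quadratic advantage only pays off for search spaces of around 2^42 items, which is the dialogue's point in miniature.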

29 comments:

  1. Given that for so many computer scientists, quantum computing is an absurd science-fiction fantasy on a par with time travel, it's heartening to know that for a few enlightened ones, quantum computers would be so humdrum that it's not even worth trying to build them.

  2. I am equally skeptical, if not aggravated, about all of the breakthroughs and other hype concerning quantum computing. It debases the endeavor. Quantum computing is going to take a lot of patience and shouldn't be a Manhattan Project.

    But let's not throw out the baby with the bathwater. If it is one day practical, we certainly should build them "because we can". That is the only way to find out how useful they are. What we have from computer science is some limited but intriguing lower bounds, and some upper bounds in the mid-stratosphere. (It would be great to have an AWPP computer, but...) Scott's comment is much to the point.

  3. It is important for any scientific endeavour to, at certain key points along the way, pause and take stock of what has been accomplished and what hasn't and where to go from here.

    To be fair, the same tough questions that Lance asks of QC could have been asked about PRAM research or even computational complexity a decade ago. It boils down to: is the amount of activity/funding commensurate with the importance and depth of the results obtained, and is steady progress on track for delivery of a breakthrough---even if decades from now---at a reasonable cost/reward ratio?

    I do not know enough QC to even attempt to give an answer. Certainly from the outside looking in, it looks like the track record (thus far) of promises versus deliveries in QC research is not very good. I've heard rumbles from QC people against some of the earlier overpromisers, saying that their overly enthusiastic promises now made the entire field look bad.

  4. It seems to me that the most useful thing about QC is what it says about the general notion of computing. Is it really true that all information is physical? Are there purely physical reasons why strong Church-Turing is true? or false? Does QC challenge the full Church-Turing thesis?

  5. I've heard rumbles from QC people against some of the earlier overpromisers saying that their overly enthusiastic promises now made the entire field look bad.

    No, they always made the field look bad, but it was never the entire field.

    Actually, the whole idea of "promises" in basic research is irritating. Either the research is interesting or it isn't. Quantum mechanics are not like auto mechanics, who promise to fix your car while you spend the day elsewhere.

  6. "Breakthroughs" every month? Whose blog have you been reading? Has Scott been talking about experimental quantum physics again?

    BTW, I don't see the use of these field effect transistors. I mean, sure phone companies will use them, but who else? Hearing aid companies?

    And while we're on the subject what about that laser thing? What the heck am I going to use that for? Light shows?

    I'm also skeptical about this simulating quantum systems thing. I mean, what could that possibly lead to but something like understanding high temperature superconductivity enough to figure out how to raise the critical temperature to room temperature? Surely that is not worth even close to 50 billion dollars. And drug companies will have absolutely no use for quantum simulations because their current methods are certainly good enough. Nor will material scientists or chemists have use for these silly algorithms, of course. No, I think large portions of science will have absolutely no use for quantum computers.

    And of course Grover's speedups are silly, because of error correction. But what is that on the horizon? Oh, we are going to have to perform classical error correction on our classical computers soon? Oh, I hope that doesn't mean Grover speedups might actually be useful. Oh, and what is the difference between polylogarithmic overhead and a quadratic improvement? I'm always getting those two things confused, even though I work in a computer science department.

    :) Sorry, you caught me on a supersarcastic day :)

    Oh, and just to put things in perspective, 50 billion dollars / 30 years = 1 2/3 billion per year, which is off by an order of magnitude as far as I can tell.

  7. If you're going to be so pessimistic, why not imagine the following conversation:

    Complexity people: We have proved P = NP.
    Public: Yes after 30 years and 300 million dollars in total funding. So we can really solve traveling salesperson and scheduling problems much faster than before?
    CP: No, the exponent is too high for realistic problem sizes.
    P: Then why did you bother proving P = NP?
    CP: Well, we had to justify some way to get 300 million dollars in funding.

  8. Actually, if you are concerned about "the public" the conversation is more likely to go like this:

    CP: We proved P = NP.
    P: What's that?
    CP: Well, it means that we can now solve the traveling salesman problem on much larger problem instances.
    P: ???
    CP: We can also schedule airline flights to make optimal use of resources.
    P: Great! So airfares will go down?
    CP: No.

  9. The last anon's comments, of course, discuss economics and not complexity...
    More to the point, there's no doubt that it's OK for some people to research QC. However, the budgeting and hype seem exaggerated to many TOC people.
    One can criticise high energy physics, TCS in general, QC, number theory or any other theory field. Currently, however, TCS funding isn't all that good, and people feel that "if the word Quantum doesn't appear in your grant proposal - no cash for you".
    In addition, it seems many people who know little about QC (but come from neighboring fields) think that QC => P=NP.
    What's more, TCS has developed many conceptual frameworks, and has (despite many people not thinking about it) a good philosophical background. The questions of what we humans can compute, how long it takes, whether working in a group improves the situation, etc., are of stand-alone interest. Mind you, I'm not claiming QC doesn't have interesting questions and is just engineering, only saying that for some of us these questions aren't at all clear.
    The basic mode of thinking in QC is first of all almost physical, while TCS has its background in the philosophy of language.
    Gi.

  10. Heh, looks like Computer Scientists have their very own version of String Theory nonsense.

  11. 1) Someone made the comparison to the PRAM hype. But that raises the question: Was all that research in parallelism and the PRAM model worthwhile in the end? Do we know yet? I am asking this non-rhetorically, non-sarcastically, and perhaps out of ignorance.

    2) The `300 million dollars that went into solving QC/P vs NP' will be a misnomer -- a lot of the money went to overhead for the universities, was actually spent on research in other things as well, and on a variety of other things that have nothing to do with QC or P vs NP. I do not know if this makes that money more of a waste, or less of a waste.

    3) AI has also had a problem with too much hype, not enough results.

    bill gasarch

  12. Was all that research in Parallelism and PRAM model worth while in the end? Do we know yet?

    Funding PRAM or QC displaces funding for other areas. So does the space dedicated to them in top conferences. The question is not if PRAM research will eventually be useful but if its usefulness justified the levels of funding and attention it got.

    In the case of PRAMs, I'd say the answer was an emphatic yes: initially lots of funding was given when it seemed the area was ripe for progress, and when our understanding deepened and it became obvious that our first take had been overly optimistic, people drifted away from it. So in this case the system seems to have worked.

    From what I hear, in other cases such as string theory, particle physics or AI the system has not worked, and substantial amounts of funding continue to be channeled to areas with bad cost/reward ratios.

  13. Currently, however, TCS funding isn't all that good, and people feel that "if the word Quantum doesn't appear in your grant proposal - no cash for you".

    Indeed funding for TCS is absolutely ridiculously low. The numbers I've seen are appalling. I would hope that your above impression is wrong about "quantum" being necessary to get a grant, at least for the NSF, but I cannot be so certain.

    I suspect, however, that the story about TCS and TCS-style quantum computing is more complicated than a zero-sum game of quantum computing taking away TCS money. In particular, those doing TCS-style quantum computing are (1) a very small group (witness the number of STOC/FOCS/SODA papers which are quantum), and (2) funded primarily by groups who are motivated to build a quantum computer for just the reasons Lance poo-poos (the ARO, NSA, DARPA (in the past), and DTO (formerly ARDA)). Most of the funding that TCS-style quantum computing people get is sloughed off the back of a huge experimental physics effort to build a quantum computer. They are not being funded for the same reasons that the government (or private industry) is funding TCS.

    This is similar, I think, to what has occurred in the past for high energy physics theory, which has certainly benefited as a byproduct of the huge funding for high energy experiments. I take it from comments I've seen here and elsewhere that a similar strategy for TCS, i.e. attaching onto more applied work, is not looked upon with a kind eye, however.

  14. "So, Dr Shockley - your team have invented this thing you're calling a 'transistor'. And it's good for what?"

    "Well, we can make really small radios."

    "Will they replace this modern 'rock and roll' rubbish with quality music?"

    "Umm, no, not necessarily."

    "Well then! What good are they?"

  15. You admit to not being a physicist or an engineer, yet your argument relies on your understanding of both fields.

  16. "Well, we can make really small radios."

    "Will they replace this modern 'rock and roll' rubbish with quality music?"

    "Umm, no, not necessarily."

    "Well then! What good are they?"


    Not to put a damper on the party, but the same justification applies to, say, hacky sack. And so far all of my hacky sack-related NSF applications have been rejected.

  17. 1) Someone made the comparison to the PRAM hype. But that raises the question: Was all that research in parallelism and the PRAM model worthwhile in the end? Do we know yet? I am asking this non-rhetorically, non-sarcastically, and perhaps out of ignorance.


    Following David Eppstein's lead on Luca's blog, I found the following very interesting article:

    Performance Scaling in the Multi-Core Era


    Apparently, Intel processors are no longer exhibiting exponential increase of clock frequency (i.e., raw speed), due to power issues. The expected way to get more processing power is to integrate multiple cores on one chip (dozens of them). So all that PRAM stuff could become practical after all.

    Piotr

  18. So all that PRAM stuff could become practical after all.

    Well, yes and no. I gave a talk earlier this year on this subject at Dagstuhl. Among the differences are that the degree of parallelism is much lower, the model is halfway between SIMD and MIMD, and the control over the parallelism is low, most likely at the thread level. Yet the entire point of my talk was that this (and certain other problems) would come to dominate a good portion of future algorithmic research.

    Alex Lopez-Ortiz

  19. The above thread, and in fact many of the more emotional discussions in this forum, is a result of the "TCS-Mathematician syndrome": suffering from this syndrome means that we would like to think of ourselves as mathematicians doing "pure" research, but at the same time we would like to think of our results as having "real life" meaning. This leads to silly debates about whose results are just theoretical and whose results are only theoretical. Why not accept the sad truth that an extremely negligible fraction of STOC/FOCS/SODA results have any practical applications, and given that, judge results according to their intellectual value and not the odds of them having any practical implications? I would think that under this way of thinking, the fact that quantum computers can factor numbers is much more important than a new approximation result running in O(n^8).

  20. Yeah, and there's only a worldwide market for something like 50 digital computers.

    Your lack of confidence is not all I imagined it to be.

    Still, I'd prefer you spent $50B of your own money working on the problem.

  21. This scenario is too good to be true. Most probably, QC will arise, to my distaste.

  22. Why not accept the sad truth that an extremely negligible fraction of STOC/FOCS/SODA results have any practical application

    Most STOC/FOCS/SODA papers have no direct applications, but if you look at an entire chain of papers sequentially improving upon each other, you'll find that many chains have made it into real life, particularly for SODA with its higher algorithmic content.


    judge results according to their intellectual value and not the odds of them having any practical implications.

    That is a sure way to irrelevance and reduced funding.

  23. suffering from this syndrome means that we would like to think of ourselves as mathematicians doing "pure" research,

    Speak for yourself. Many theoreticians value applications and consider pie-in-sky research to be generally undesirable.

  24. Speak for yourself. Many theoreticians value applications and consider pie-in-sky research to be generally undesirable.

    But thankfully, most theoreticians are not excessively bitter, so they actually value both approaches.

  25. The analogy to AI here is a bit unfair. AI has had many, many very useful successes. The problem is that as soon as something becomes useful, it's no longer AI. A brief list:

    Heuristic search
    Natural language recognition (not perfect, but still useful)
    Speech recognition
    Task planning (for manufacturing)
    Robotics

    These were all originally parts of AI...

  26. But thankfully, most theoreticians are not excessively bitter, so they actually value both approaches.

    Given your comment, I think my definition of pie-in-sky research is quite different and a lot looser than you think.

  27. Currently, however, TCS funding isn't all that good, and people feel that "if the word Quantum doesn't appear in your grant proposal - no cash for you".

    Lashing out at quantum computing over the atrocious funding situation in classical TCS reminds me of the classic Far Side cartoon with a bear in a gunsight, pointing smilingly at a different bear.

  28. P: But on the other hand, here in 2025, we see that global prosperity is increasing, the global war on terror (GWOT) is ending, and the global ecosystem is stabilizing. What---if anything---did quantum computing research have to do with that?

    Q: With hindsight, we can see that quantum computing research---especially the insight that quantum model order reduction is geometrically founded upon Kähler manifolds---played a central role in revitalizing the global economy in the post-oil era.

    So nowadays, every engineering school has a Department of Quantum System Engineering. And this is a classic example of how fundamental research can have unexpectedly profound economic and strategic impacts.

    -----------------------

    Historical Background

    Rolling back the clock to the 1970s, Richard J. Barber's history The Advanced Research Projects Agency, 1958--1974 describes a situation that sounds pretty familiar: deadlocked war against an insurgency, economy in the dumper, massive deficits, global struggle against an implacably hostile ideology, increasing unrest at home.

    So Barber's history (published in 1975) is pretty downbeat -- so downbeat that DARPA never again commissioned such a history. Which is a pity, because the Barber Report is based almost wholly on first-person interviews with DARPA Directors and Program Managers (which are dynamite). To read it, you have to order your own copy (NTIS accession no. AD-A154 363; order by phone at 1-800-553-NTIS).

    And yet, as you may have noticed, the subsequent 30 years have turned out pretty good. Great for the USA, in fact.

    So how did we do it? Can we do it again? Will we do it again?

    Reading the Barber Report with the eye of a quantum system engineer, we can see that DARPA was making investments in military technology that---in combination---led to a huge economic and strategic payoff: (1) sensors (the AGILE Program), (2) networks (ARPANET), and (3) computer chips (solid-state physics). And when you put together sensors connected by networks run by chips, you get modern mechatronics, which provided an engine for US and global economic prosperity (as well as the technological foundation of US military security).

    Back in 1975, few people foresaw that the USA had a winning technological hand: the USA was mainly just lucky in being the first nation to put the pieces together.

    So, is there anything on the horizon that can provide a similar engine of prosperity and foundation of security in the coming thirty years?

    Speaking as an engineer, there is an enormous opportunity in the confluence of (1) quantum sensing, (2) nanotechnology, and (3) synoptic biology.

    These confluent technologies have a vital role to play as engines of global job creation. Because IMHO, there will be no end to GWOT until there are good, family-supporting jobs available for every young person who wants one.

  29. Lance: Firstly, quantum simulation algorithms may need far fewer than "thousands" of qubits to solve problems that are presently intractable on classical computers. Hence ion trap proposals in particular, and some others, aren't as far from being useful as you might think.

    Secondly, while theoretical work on quantum algorithms may be somewhat wasteful if quantum computers turn out to be practically useless (which I think is unlikely, but for the sake of argument let's assume it is the case), much of the experimental work is interesting and useful in its own right.
