Thursday, August 26, 2010

Cryptography if P = NP

Bill is at Barriers II in Princeton and promises a full report upon his return.

Ask many computer scientists what happens if P = NP and you'll get the response that it will kill cryptography. Let's ignore solving all optimization problems, solving AI and learning everything, likely cures to many diseases and many more amazing advances in society; what a disaster it will be that we can't send secret messages to each other.

I don't at all believe P = NP, but let's have this hypothetical discussion of what happens to cryptography in that world. We do lose public-key cryptography, the ability for two people who have never met to send encrypted messages to each other electronically. How will I send my credit card info to Amazon if P = NP?

Amazon will send me a USB key containing a gigabyte's worth of one-time pad, enough to encrypt all my transactions for my entire life (even though I will live much, much longer thanks to P=NP). This might work for Amazon, but what about tiny mom-and-pop web stores? We just need a trusted site for a one-time pad registry so my USB key will work for any Internet transaction. Even for public-key cryptography today we need trusted parties to verify sites, so we haven't lost much here.
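
(For concreteness, a minimal one-time pad sketch in Python; the names and sizes here are made up, and a real deployment would also have to track which pad bytes are spent and never reuse them.)

    import secrets

    def otp_xor(message, pad, offset):
        # Encrypt or decrypt by XORing against fresh pad bytes. Security is
        # information-theoretic, so it survives P = NP; the catch is that pad
        # bytes must be truly random, kept secret, and used exactly once.
        chunk = pad[offset:offset + len(message)]
        assert len(chunk) == len(message), "pad exhausted"
        return bytes(m ^ p for m, p in zip(message, chunk))

    pad = secrets.token_bytes(1 << 20)   # a gigabyte in the story; 1 MB here
    ciphertext = otp_xor(b"credit card number", pad, 0)
    assert otp_xor(ciphertext, pad, 0) == b"credit card number"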

Large financial institutions can ship huge amounts of one-time pad data through secure couriers (or just ship the data itself this way). Large amounts of compressed data can have a very small footprint these days.

What about cryptographic protocols? Zero-knowledge is uninteresting if P = NP. Many secure multi-party computations can be done over private channels, which just need shared private keys. Many others, electronic money, secure voting and digital signatures come to mind, seem difficult to do if P = NP. On the other hand, they are also hard to do in practice today, even under the usual assumptions.
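
(To illustrate the private-channel point: additive secret sharing is information-theoretically secure, so it keeps working even against an adversary who can solve NP problems. A toy Python sketch with made-up parameters:)

    import secrets

    P = 2**61 - 1  # a prime modulus

    def share(x, n):
        # Split x into n additive shares mod P; any n-1 shares are uniformly
        # random, so no amount of computational power reveals x from them.
        parts = [secrets.randbelow(P) for _ in range(n - 1)]
        parts.append((x - sum(parts)) % P)
        return parts

    # Three parties share their inputs over private channels, publish only
    # the per-slot sums, and learn the total without seeing any input.
    inputs = [52000, 61000, 48000]
    all_shares = [share(x, 3) for x in inputs]
    slot_sums = [sum(col) % P for col in zip(*all_shares)]
    assert sum(slot_sums) % P == sum(inputs) % P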

Certainly P=NP would make cryptography considerably more painful than it is now, and many things we thought were securely encrypted won't be anymore. But we'll figure out how to meet our crypto needs in any environment. The loss of public-key cryptography shouldn't be our first gut reaction to what happens if P = NP.

25 comments:

  1. Everything you say sounds nice in theory, but I don't buy it in practice.

    So Amazon is going to send me a USB with a huge one-time pad?! You mean the same way my bank securely gives its public key to me on a USB when I sign up for an account? (Hint: they don't.) And I'm going to carry around multiple USBs for every large company I interact with?

    And smaller 'mom-and-pop' stores are going to rely on an Internet-wide key distribution center (KDC) that everyone will trust and is going to send USBs to 75% of the US population?

    And active attacks are going to be prevented how, exactly?

    One technical mistake:

    Large amounts of compressed data can have a very small footprint these days.

    Of course, if you're talking about using a one-time pad then compression is out of the question...
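
    (Quick sanity check that pad material is incompressible:)

        import secrets, zlib

        pad = secrets.token_bytes(1_000_000)   # one-time pad material
        print(len(zlib.compress(pad, 9)))      # ~1,000,000 bytes or more: no savings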

  2. "Let's ignore solving all optimization problems, solving AI and learning everything, likely cures to many diseases and many more amazing advances in society"

    Can you be more specific? Are AI, learning, and disease-curing computations in NP (i.e., is a good solution verifiable by a computer in practice)? To be honest, I don't really see a killer application that will change the world.

  3. I think the credit card companies would be the ones to send you the one time pads. That would resolve the mom and pop issue.

    Each bank would provide a guarantee about security, just as they guarantee you against fraud. [They want to stay in business, after all, which they wouldn't if people were too afraid to use their cards.]

    But what about Pessiland? We could have P!=NP and still have no crypto.

    @Johnathan: Lance's point was about institutions sending regular data to each other, not one-time pads.

  4. Barriers II is shaping up to be a terrific conference from many perspectives, and in particular, I have posted a systems engineer's homage to its themes on Dick Lipton's blog.

    And I would like to say that the praise rendered to Dick Lipton's (and Ken Regan's) weblog posts applies similarly to Lance's and Bill's posts.

    Namely, by broadening our collective mathematical understanding, weblogs like this one are accomplishing (IMHO) a great deal to make every conference better.

    The vital role of a shared, broad mathematical base is pretty plainly evident in every act of this summer's mathematical three-ring circus: Act I being Deolalikar's claimed proof, Act II being the ICM awards, and Act III being the upcoming Barriers II conference.

    All three acts of this wonderful mathematical circus IMHO deserve our enthusiastic applause ... and our sincere thanks to its weblog ringmasters.

    Dick, Ken, Bill, Lance, Terry, Tim, Scott, Dave, Gil ... gosh ... you guys have deservedly emerged this summer as first-name celebrities ... because the role your weblogs are filling is vital and definitely not easy.

  5. Right now we need trust centers only for verification: certifying that Alice owns public key X. A compromised trust center can only let Eve pretend to be Alice.

    But now the trust center has to have knowledge of the private keys themselves. That means that, besides being able to spoof messages, it can decrypt anyone's messages. Using such a trust center, any communication would ultimately be decryptable by the government (it would also be easier to inject messages, as opposed to having the spoofed person uninvolved).

    As to why this is always the first thing brought up about P=NP? Probably because it's so binary: public-key cryptography either works or it doesn't. Everything else comes in gradations: we could optimize things better, we could write smarter AI, we could speed up disease research. These things are all happening already; it's just a matter of degree.

  6. This comment has been removed by the author.

  7. In a more humorous vein, our first "gut reaction to what happens if P=NP" obviously should be "There goes personal privacy!"

    Because in a world where everyone is (in effect) a transcendentally intelligent Sherlock Holmes, the following discussion would be commonplace:

    "Former RNC Chairman Ken Mehlman is secretly gay? That's not news ... heck, my laptop deduced it years ago ... from his choice in ties."

    We thus appreciate that in a world of P=NP (or any capability approaching it), human existence would be reduced to a ruthless battle for information dominance, with privacy effectively nonexistent and every citizen held in thrall to what many folks nowadays praise, but (logically speaking) all should abhor: rational, efficient markets.

    Which ... uhh ... maybe helps us appreciate why computer-assisted trading, in the nominal service of market efficiency, has so very often in recent decades proven to be a terribly bad idea? :)

  8. Is AI in NP?

    I mean… we do tend to think that we have intelligence?

    And this wetware computer I call my brain, given some sensory input, somehow does manage to generate at least some real-time output that seems coherent enough to be mistaken for intelligence?

    Does this then not imply that intelligence is doable in real time?

  9. You don't have to posit that AI is in NP to see the benefit to AI of P=NP. Let's take computational linguistics as an example of an "AI-complete" problem. I have a large corpus of natural language sentences and I need to find the smallest model that explains them (I'm being a bit loose here but I think you know what I mean). The model space is huge; currently I have to use my human intuition about language and the task I'm working on to decide which parts of it to explore in detail. You can look at the proceedings of ACL/COLING to see how different tasks and different people's intuitions lead to different models.

    But if P=NP (with an efficient algorithm for SAT), I can search a much larger subspace of models in much greater detail far more efficiently. The problem may not be in NP, but for all intents and purposes it's solved.
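
    To make that concrete, a hedged sketch of the search loop; encode_fits and sat_solve are hypothetical stand-ins for the CNF encoding and for the fast SAT algorithm that P=NP would supply:

        def smallest_model_size(corpus, max_k, encode_fits, sat_solve):
            # Binary search for the least k such that some model of size <= k
            # explains the corpus. encode_fits(corpus, k) is assumed to return
            # a CNF that is satisfiable iff such a model exists; sat_solve is
            # the assumed polynomial-time SAT decider. O(log max_k) queries.
            lo, hi = 0, max_k
            while lo < hi:
                mid = (lo + hi) // 2
                if sat_solve(encode_fits(corpus, mid)):
                    hi = mid
                else:
                    lo = mid + 1
            return lo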

  10. There's also the possibility that P=NP, but the exponent is large.

    An O(n^3) encryption method which requires Omega(n^30) to crack is secure. So encrypting a credit card would take 2 minutes instead of 2 seconds, but so what? PKE would be fine.
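
    Back-of-envelope, assuming a (made-up) 10^9 operations per second:

        n = 1024
        ops_per_sec = 10**9
        print(n**3 / ops_per_sec)                    # ~1 second to encrypt
        print(n**30 / ops_per_sec / (3600*24*365))   # ~10^73 years to crack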

  11. This speculation seems a little silly to me. What if someone proves P=NP, but the proof is totally non-constructive? Just because P=NP doesn't necessarily magically *give* us polynomial-time algorithms for everything in NP! And even if it did, there's no guarantee they'd be any good in practice: they could run in O(n^999999999999) time, for example.

  12. As a practitioner working with n big enough that O(n^2) algorithms are out of the question, and O(n log n) is pushing it, the proclaimed practical advances seem really overstated here. I feel like computational complexity folks use polynomial time because it is very convenient (machine-independent, etc.) but completely forget how poor a representation of the idea of "tractable" it really is.

  13. Even if P=NP, the reduction is constructive, and the exponent is small, there's still the possibility that the constants that come up are so huge that computational crypto remains perfectly practical. I remember that after Robertson & Seymour proved their celebrated graph minor theorem, some people hoped the proof might extend to P=NP (unfortunately it didn't quite work out). The polynomial-time algorithms that were constructed using the graph minor theorem have such gigantic constants that they are completely impractical already for tiny inputs (or, to put it in a cryptographic context, tiny security parameters).

    krzysztof

    Replies
    1. I was about to write something similar... somehow people think P=NP means that everything is now as easy as sorting. Constants matter, as does the degree of the polynomial. If you have a break of my cryptosystem that runs in N^{10000} time, I'm probably OK.

  14. Eagerly waiting for Bill's report!

    Well, if P=NP, we can write a program to tell us what to do to communicate securely! ;)

    Actually, I have asked this question a few times of cryptographers (half jokingly), and their answer for such an unexpected outcome was "we will move to Quantum Crypto."

  15. Hopefully Bill will weblog too about Terry Tao's near-proof that P=NP.

    Uhhh ... you say you missed this announcement?

    Well, as we all know, this week's ICM saw the award of four Fields medals, the Chern Medal, the Gauss Prize, and the Nevanlinna Prize ... seven major awards in all.

    Regarding these seven awards, Terry modestly remarked on his weblog that "by chance, the work of [6-of-7 ] recipients overlaps to some extent with my own areas of expertise."

    Hmmm ... now what is the probability that a 6-of-7 overlap would occur "by chance"?

    Assuming that the cognitive processes of even Terry Tao are in the class PSPACE and PTIME, it's pretty clear that a 6-of-7 overlap has finite probability only in a world that possesses sufficient mathematical structure that ... well ... P=NP! :)

    Now, the preceding is non-serious (obviously), and yet perhaps there's a complexity-theoretic heuristic in it. Namely, mathematical capabilities that are broadly associated with ideals of mathematical naturality are (empirically) increasing at a pace that allows the understanding of at least *one* mathematician to keep up (reasonably) with a mathematical literature whose global volume is blowing up.

    So, is Terry Tao unique? Or can his capabilities be taught and grasped by mortal teachers ... and their students?

    It's a bit heretical to say so ... but perhaps some of the most significant "Barriers in Computational Complexity" were in our wetware all along ... and now (as in every previous century) we are finding new ways to surmount these barriers.

    The Clay Institute website hosts a page "Interview with Research Fellow Terence Tao" (2003) that is Tao's description of "How I do it" (to borrow a phrase from Young Frankenstein); this page is highly recommended. If Terry were to update this interview, perhaps at intervals of 10 years or so ... well ... that would be terrific.

  16. You are right that the one-time pad may be used for authentication and encryption in some settings even if P=NP, but as Jonathan said, it's going to be quite problematic.

    In particular, the loss of digital signatures will probably hurt more than the loss of public-key encryption. It's not just online commerce but also getting software patches, etc.

    One generic problem with the one-time pad is that if some virus etc. manages to read this USB key, then it can not only read but also modify all communication to you from then on. This is in contrast to the case where your computer contains only the hardwired public key of some certificate authority, and reading that doesn't help in forging communication from anyone.
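
    To illustrate the "modify" part: one-time-pad ciphertexts are malleable, so anyone holding the pad (or even just a good guess at the plaintext) can rewrite a message undetectably. A toy sketch with made-up values:

        import secrets

        def xor(a, b):
            return bytes(x ^ y for x, y in zip(a, b))

        pad = secrets.token_bytes(16)   # what the virus read off the USB key
        msg = b"pay alice   $100"
        ct = xor(msg, pad)
        # XOR in the difference between the old and new plaintexts; if msg is
        # guessable, the attacker doesn't even need the pad for this step.
        forged = xor(ct, xor(msg, b"pay mallory $999"))
        assert xor(forged, pad) == b"pay mallory $999"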

    (On the other hand, maybe computer viruses will be extinct, since software will be perfect...)

  17. I was (and am) under the impression that crypto is presented as the most compelling reason why we want to prove that P is not equal to NP. If we thought P might equal NP, then obviously people would be clamoring to know the breakthrough algorithm. But since we think not, we need much better arguments to justify going to herculean lengths to verify that P is not equal to NP. To play devil's advocate: who really cares? Sure, it would be nice to know, but it seems a tad esoteric and only of theoretical interest.

    Enter crypto as case #1 for the practical benefit of PvsNP: "Provable Security"

    That's my best guess for why things get spun the way they do.

  18. Anonymous asserts: We need much better arguments to justify going to herculean lengths to verify P not equal to NP. To play devil's advocate, who really cares?

    I'm happy to present the case that "Engineers really care about P versus NP".

    As Dick Lipton says this week “I believe that the more ways we can define and view a mathematical topic, the better is our understanding.”

    Dick is of course echoing Terry Tao, Mac Lane, Feynman, von Neumann … he is echoing pretty much everyone who has ever written about the STEM enterprise. And via his weblog, Dick doesn’t just “talk-the-talk”, he “walks-the-walk” … his weblog’s long-standing commitment to mathematical diversity is wholly admirable IMHO.

    Let's take the point-of-view that, broadly conceived, computation is just another name for simulation. Then the question "Is P=NP?" naturally links fundamental mathematics to what is at present the most rapidly expanding—and actively hiring—sector of the global economy, namely, simulation-driven systems engineering.

    Economically speaking, this expansion is at present the over-arching “Moore’s Law” of the STEM enterprise, and so it is prudent to respect it … to take advantage of it … and to accelerate it (if we can).

    Heck ... how else are jobs going to be created?

    Obviously, strict uniformity of opinion is not necessary, even with regard to fundamental issues like “What mathematical questions are natural? What mathematical methods are natural?” Because even if uniformity were locally achieved across any one STEM discipline, the overall consequences would be bad for the global STEM enterprise.

    The point, then, is that engineers do care about P versus NP ... but the answer is the least valuable aspect of the problem. More valuable by far are the emerging proof technologies associated to the question. Most valuable are the evolving ideals of naturality in engineering that we derive by interacting with the mathematical community.

    What’s terrific about the STEM blogosphere (IMHO) is that it’s doing a wonderful job of collegially sustaining these cross-disciplinary creative tensions. We can be optimistic that in the long run, our whole planet is going to benefit immensely from this.

  19. I'd like to second Jeremy Hurwitz's comment:

    > There's also the possibility that P=NP, but the
    > exponent is large.
    > An O(n^3) encryption method which requires
    > Omega(n^30) to crack is secure. So encrypting a
    > credit card would take 2 minutes instead of 2
    > seconds, but so what? PKE would be fine.

    The point bears repeating: polynomial time is too crude a notion to even rule out (or confirm) the existence of practical applications of computational hardness. Of course, if we had a quasi-linear-time algorithm for SAT with reasonable constants (and who's to say there isn't one that runs in time 6n?), then, as Lance points out, we'd be reduced to other techniques. But even if SAT is only hard for sub-quadratic-size circuits, some non-trivial crypto is possible ("Merkle's scheme").
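
    For readers who haven't seen it, a toy sketch of Merkle's scheme (parameters made up; the point is only the gap between honest work ~N and eavesdropping work ~N^2):

        import hashlib, secrets

        N = 200  # number of puzzles; also the brute-force cost per puzzle

        def mask(s, label):
            # Derive an 8-byte mask from the weak secret s and a label.
            return hashlib.sha256(s + label).digest()[:8]

        def xor(a, b):
            return bytes(x ^ y for x, y in zip(a, b))

        # Alice publishes N puzzles; each hides (puzzle_id, session_key)
        # behind a weak secret drawn from a space of size N.
        puzzles, table = [], {}
        for _ in range(N):
            s = secrets.randbelow(N).to_bytes(4, "big")
            pid, key = secrets.token_bytes(8), secrets.token_bytes(8)
            puzzles.append((xor(pid, mask(s, b"id")),
                            xor(key, mask(s, b"key")), mask(s, b"tag")))
            table[pid] = key

        # Bob solves ONE random puzzle by brute force (~N hashes) and then
        # announces the recovered puzzle_id in the clear.
        c_id, c_key, tag = secrets.choice(puzzles)
        for guess in range(N):
            s = guess.to_bytes(4, "big")
            if mask(s, b"tag") == tag:
                pid, key = xor(c_id, mask(s, b"id")), xor(c_key, mask(s, b"key"))
                break

        # Alice looks up the session key; Eve, who saw only the puzzles and
        # the announced id, must grind through ~N/2 puzzles to find it.
        assert table[pid] == key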

    More generally, the set of possible worlds a la Impagliazzo (Cryptomania, Pessiland, and co.) becomes even more varied and complicated if one considers finer-grained complexity classes.

  20. Care to expand on that, Adam? I'm curious about the sub-worlds. :)

  21. Hello, I am an undergraduate CS student, knowing only the fundamentals of complexity theory.

    Let's assume P = NP. But isn't PSPACE the class of problems that, given a solution, can be verified in NP? But now that can be done in P. Therefore PSPACE = NP = P (which also means that LOGSPACE != P).

    This leads us to a magnificent conjecture: "If the result's length is polynomial in the length of the input, it can be computed in polynomial time."

    Protein folding? Solved.
    Stock markets? Even more boring and predictable.
    A vast number of complex systems? Solved.

    So what if it takes 2 years to optimize eye surgery. The job will be done, once and for all. The same goes for any complex system.

    I just can't understand how people can fail to see the benefits, given that P=NP. Those benefits appear so great to me that they make me prejudiced towards equality in this grand problem.

  22. Jeremy writes:

    There's also the possibility that P=NP, but the exponent is large.

    An O(n^3) encryption method which requires Omega(n^30) to crack is secure. So encrypting a credit card would take 2 minutes instead of 2 seconds, but so what? PKE would be fine.

    --

    In my mind, the issue is this. We don't have good reason to believe the Church-Turing thesis respects the exponent: it is not tight up to constants, and we can easily find "reasonable" abstract models of computation that can only simulate each other up to polynomial factors. If P = NP but the exponent is large, for the RAM model say, we wouldn't really have any security guarantee at all. Someone would need to demonstrate conclusively that the Church-Turing thesis holds for substantially more than that exponent, or else we could not rule out that some other bizarre computational medium could actually implement the algorithm in, say, near-linear time. Such an algorithm would mark the start of a new hardware race, where manufacturers try to build computers specifically tuned to running it, possibly using massively distributed nets, possibly using DNA computers, who knows.

    It seems it would essentially turn the problem from a theory problem into an engineering problem, and the only way to prove a meaningful lower bound / security guarantee would be to argue for a tighter Church-Turing thesis.

  23. This was bang on, and hilarious too. Look at all of you complaining and trying to argue with it. Some of you say it wouldn't make any difference, some that it would be a detriment. I say its significance couldn't possibly be overstated. We're talking about a digital philosopher's stone here. Forget about Moore's Law; man would be farther developed in 10 years with it than in 1000 without it. I would say that the top 5 most important discoveries to date would then be: 5. Electricity. 4. The wheel. 3. Soap. 2. Fire. 1. A polynomial-time algorithm for solving NP-complete problems.

    Gideon: Intelligence may be in NP. If it is, that would mean P=NP, and we've all got an NP problem solver innately built into us; we're just not good at adapting it to solving any old NP problem, since it is hardwired to make us good at the things that were necessary for survival, on average, over the last few million years. This, in my opinion, is the only actual logical evidence that P might equal NP: we don't understand the algorithms that run in our own heads. How long was the proof of Fermat's Last Theorem? Pretty long, right? If P!=NP and that was an innately hard problem to solve, finding the proof should have taken far longer than the universe has existed, or astronomical luck. Yet it was solved by a human. An incredibly long proof, easily enough verified, but apparently not impossible to produce, because it was produced. If P!=NP, then how did we do it? We shouldn't have been able to find the proof.
