tag:blogger.com,1999:blog-3722233.post4483403516486740633..comments2024-02-29T15:59:22.700-06:00Comments on Computational Complexity: Do Uniform Lower Bounds Matter?Lance Fortnowhttp://www.blogger.com/profile/06752030912874378610noreply@blogger.comBlogger15125tag:blogger.com,1999:blog-3722233.post-87833215336836119392010-12-11T10:14:07.170-06:002010-12-11T10:14:07.170-06:00Anonymous, one nice thing about our exchange, is...<i>Anonymous</i>, one nice thing about our exchange is the many opportunities it is affording for mentioning classic references relating to uniformity.<br /><br />These references are full of useful information and (sometimes) delightful historical surprises too. <br /><br />With regard to special functions, the polynomial and exponential functions have uniformly convergent power series, and so (naively) we might expect that the Lambert function defined by <i>W(z) exp(W(z)) = z</i> will be uniformly convergent too. But as the classic 1996 article (>1400 citations to date) <i>On the Lambert W Function</i> by Corless, Gonnet, Hare, Jeffrey, and Knuth, reminds us, that's not so ... <i>W(z)</i> has a branch point at <i>z</i> = -1/e, and its power series about the origin has radius of convergence only 1/e.<br /><br />As a bonus, Knuth and his colleagues take the trouble to include an historical appendix briefly describing the life of Johann Heinrich Lambert ... the <i>acutissimi ingenii Lambert</i> (Lambert of the most acute genius) as Euler called him:<br /><br />——-<br />Lambert had dreams [in the 1760s] of building a machine to do symbolic calculation … Thus his wishes anticipated those of Lady Ada Lovelace by roughly one century, and the actuality of symbolic computation systems by roughly two.<br />——-<br /><br />Such history-of-computation is interesting (and fun!) to me and many folks ... I'm grateful that Knuth and his colleagues included it. 
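(A numerical aside, not part of the original comment: the defining identity W(z) exp(W(z)) = z is easy to check in a few lines of Python. The Newton iteration below is a standard textbook scheme for the principal branch, sketched here under the assumption that z ≥ -1/e, the branch point.)

```python
import math

def lambert_w(z, tol=1e-12):
    """Principal branch of Lambert W, via Newton iteration on w*e^w = z.
    A sketch using a standard scheme (not the algorithm of Corless et al.);
    the principal branch is real only for z >= -1/e, the branch point."""
    if z < -1.0 / math.e:
        raise ValueError("principal branch requires z >= -1/e")
    w = 0.0 if z < math.e else math.log(z)   # crude but adequate start
    for _ in range(100):
        ew = math.exp(w)
        step = (w * ew - z) / (ew * (w + 1.0))
        w -= step
        if abs(step) < tol:
            break
    return w

# The defining identity W(z) exp(W(z)) = z holds on the principal branch:
for z in (-0.3, 0.5, 1.0, math.e):
    w = lambert_w(z)
    assert abs(w * math.exp(w) - z) < 1e-9
```

Feeding the iteration a value below -1/e raises an error, which is the numerical face of the branch point noted above.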
<br /><br />And there is a mathematical lesson too: contrary to our instincts, naturality and uniformity do not always go hand-in-hand; thus the naturality of the definition of <i>W(z)</i> non-intuitively <i>requires</i> non-uniformity in its power series representations.<br /><br />By consensus, special functions are defined as "any function about which one can find information in many books on special functions", and it seems plausible (to me) that future books on special functions will include the informatically natural <a href="http://faculty.washington.edu/sidles/Litotica_reading/sqrt(exp).png" rel="nofollow">half-exponential function</a> <i>h(h(z)) = γβ^z</i>. <br /><br />As with the Lambert function, the half-exponential function is constructed from the uniform (entire) functions <i>z</i> and exp(<i>z</i>), and yet <i>h(z)</i> itself has no uniform power series expansion; rather the analytic structure of <i>h(z)</i> turns out to be wholly controlled by the analytic structure of (what else?) <i>W(z)</i>. Thus again we see that respecting mathematical naturality in regard to half-exponential functions requires that we give up some notions of uniformity.<br /><br />Particle theorists too have learned—the hard way—that sometimes naturality requires the sacrifice of uniformity. An instructive document in this regard is Geoffrey Chew's 1970 <i>Hadron bootstrap: triumph or frustration?</i>, which makes a passionate argument for (in essence) the priority of uniformity: <br /><br />---------<br /><b>Chew:</b> <i>I would find it a crushing disappointment if in 1980 all of hadron physics could be explained in terms of a few arbitrary entities [meaning, quarks and gluons].</i><br />---------<br /><br />For better or for worse, though, what Chew feared is (uncannily) what happened.<br /><br />These considerations lead us naturally to enquire: is present-day complexity theory making a mistake similar to that of the <i>S</i>-matrix theorists in the 1960s and 1970s? 
That is, by <i>not</i> distinguishing among algorithms in <i>P</i> on the basis of verifiability, is complexity theory lumping together algorithms that are as different ... and as hard to separate ... as quarks and gluons?<br /><br />With regard to these questions, relating broadly to the proper balance of naturality versus uniformity, it seems good to me that there is (presently) *not* a narrow consensus, but rather a broad spectrum of opinions ... because this encourages everyone to read (and synthesize) wonderful articles in math, science, engineering, and history.<br /><br />Happy holidays, and happy STEM reading to all! :)John Sidleshttp://www.mrfm.orgnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-35751298710405350192010-12-10T14:42:50.680-06:002010-12-10T14:42:50.680-06:00Yes, I might perhaps acknowledge that Peter Shor s...Yes, I might perhaps acknowledge that Peter Shor sometimes also makes comments nearly as interesting as those of John Sidles. But perhaps if John Sidles had his own weblog, it would allow a little more room for Peter and co-luminaries to post on existing blogs, rather than having every discussion sidetracked into P-gluons, compressed sensing, and such. It is sobering to reflect that many of the best give short, terse comments.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-83115016835174306902010-12-10T04:00:18.071-06:002010-12-10T04:00:18.071-06:00Anonymous asserts: Tim Gowers, Gil Kalai, John Pre...Anonymous asserts: <i>Tim Gowers, Gil Kalai, John Preskill, and Geordie Rose also sometimes make nice comments too.</i><br /><br />Surely, it is "a truth universally acknowledged" that names like Michael Nielsen, Peter Shor, Leonid Gurvits, Charles Bennett, and Barbara Terhal (among many) might credibly be added to that list. 
Perhaps our confluent expression of appreciation will encourage these people to post more frequently.<br /><br />When one contemplates the size of our planet's population, the scale of the challenges it faces, and the crucial roles of complexity theory in meeting those challenges ... well ... it is sobering to reflect that this list is rather short.John Sidleshttp://www.mrfm.orgnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-69330988985555782242010-12-09T12:52:06.483-06:002010-12-09T12:52:06.483-06:00I agree with the previous anonymous. John, these ...I agree with the previous anonymous. John, these comments are so long that one has to scroll through pages to get to anyone else's comments, especially on Scott's blog where a few other people, such as Tim Gowers, Gil Kalai, John Preskill, and Geordie Rose, also sometimes make nice comments too. So, I think your comments should be blog posts instead.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-18134293322978729872010-12-09T05:55:10.715-06:002010-12-09T05:55:10.715-06:00Anonymous, the world is abundantly blessed with ou...Anonymous, the world is abundantly blessed with outstanding weblogs ... the world is less abundantly blessed with thoughtful comments on those weblogs. <br /><br />So you and I (at least) can both try to do our best in this regard. As "anonymous", your responsibility is particularly strong ... after all, "anonymous" posts more than everyone else combined.<br /><br />In regard to practical applications of complexity theory, I am old enough to remember, in the 1970s, how agitated some <i>S</i>-matrix theorists were by the new-fangled concept of quarks and gluons. <br /><br />Quantum fields that are never instantiated in Nature as free particles? ... fields that nonetheless are fundamental? This postulate was viewed as preposterous by committed <i>S</i>-matrix theorists. 
<br /><br />From a systems engineering point of view, the class <i>P</i> is similarly a dynamical blend of "<i>P</i>-quarks" and "<i>P</i>-gluons" ... the <i>P</i>-quarks being languages in <i>P</i> that are provably in <i>P</i>, the <i>P</i>-gluons being their complement within <i>P</i>. The dynamical blending comes when we take unions and intersections of languages ... isolated "quarks" thus naturally become dressed by "gluons"!<br /><br />AFAICT, <i>P</i>-gluon algorithms are known to exist, but none have ever been concretely isolated. Thus, to wonder whether <i>P</i>-gluon isolation is feasible is a very natural question—for engineers at least!<br /><br />And if a concrete <i>P</i>-gluon algorithm recognized some natural or even practically useful language ... why ... wouldn't that constitute a tremendous advance, in both our fundamental understanding of complexity theory, and in our capability to apply that understanding?John Sidleshttp://www.mrfm.orgnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-76262078394711010962010-12-08T21:55:07.745-06:002010-12-08T21:55:07.745-06:00John, no offense, but you write so many long comme...John, no offense, but you write so many long comments on so many blogs that I think you should start your own blog and publish them there, not as comments on other people's blogs; your comments are not really comments.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-68944873357817029692010-12-08T13:17:33.685-06:002010-12-08T13:17:33.685-06:00John Sidles, P is a uniform class, and therefore i...John Sidles, <i>P is a uniform class, and therefore is not non-uniform. This is not up for debate.</i><br /><br />Anonymous, your points are absolutely correct ... and similarly, that the criteria for verifying membership in <i>P</i> are <i>not</i> uniform—even in principle—also is not up for debate!<br /><br />Both observations are interesting ... both suggest useful avenues of research in complexity theory ... 
and this I took to be the broad theme of Lance's thought-provoking and highly enjoyable post. :)John Sidleshttp://www.mrfm.orgnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-29190621290010489472010-12-08T11:58:33.346-06:002010-12-08T11:58:33.346-06:00John Sidles, P is a uniform class, and therefore i...John Sidles, P is a uniform class, and therefore is not non-uniform. This is not up for debate, it follows from the definitions of "P", "uniform", and "non-uniform". Non-uniformity of a class has nothing to do with proofs of running time or correctness.Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-62796267801535689682010-12-08T10:52:46.856-06:002010-12-08T10:52:46.856-06:00One possibility is not only that uniform lower bou...One possibility is not only that uniform lower bounds matter but that even uniformity is not sufficiently restrictive and still allows a much-too-wide class of algorithms when modelling "human feasible" algorithms. So it is interesting whether there is an even more restrictive or more uniform notion of "P".<br /><br />This thought motivated this question: http://cstheory.stackexchange.com/questions/1617/stronger-notions-of-uniformizationsGil Kalaihttp://gilkalai.wordpress.com/noreply@blogger.comtag:blogger.com,1999:blog-3722233.post-18690311863519654222010-12-08T07:49:46.500-06:002010-12-08T07:49:46.500-06:00Luke: Well, there always is a proof of any true ar...<b>Luke:</b> <i>Well, there always is a proof of any true arithmetic statement, it's just a question how strong your axioms need to be to reach it.</i><br /><br />To say the same thing in stronger language:<br /><br />-------------<br /><br /><i>P</i> is a nonuniform class; given any fixed, finite axiom set (ZFC for example) <i>P</i> partitions naturally into languages whose membership is provable in that axiom set, versus languages that are in <i>P</i>, but not provably so.<br /><br />-------------<br /><br />One wonders, for which axiom sets can members of *both* 
natural subsets of <i>P</i> be concretely instantiated (via Gödel-type methods, say)?<br /><br />One wonders also, does progress on the vexing question of <i>P</i> versus <i>NP</i> require separate proof technologies for these two (hugely different) subsets of <i>P?</i>John Sidleshttp://www.mrfm.orgnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-31767180129857335432010-12-07T23:36:46.595-06:002010-12-07T23:36:46.595-06:00Well, there always is a proof of any true arithmet...Well, there always is a proof of any true arithmetic statement, it's just a question how strong your axioms need to be to reach it.<br /><br />I believe that for any recursive axiom system, there are infinitely many languages in P without a proof they're in P (Diophantine equations with only finitely many solutions seem like candidates). If you want to put a probability measure on that, you might be able to say it's a small or large percentage.Lukenoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-76068324140706567982010-12-07T08:43:03.361-06:002010-12-07T08:43:03.361-06:00To the best of my belief, anonymous, it is precise...To the best of my belief, anonymous, it is precisely the class <i>P</i> as defined in (for example) Oded Goldreich's text <i>P, NP, and NP-completeness</i>. Alternatively, it is the canonical definition of <i>P</i> in Stephen Cook's formal statement of the Millennium Prize problem.<br /><br />The point (informally) is that a language belongs in <i>P</i> if a Turing machine exists that recognizes it in polynomial time. 
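(A hedged toy sketch in Python, not from the original discussion: the decider below is gated on Goldbach's conjecture, a standard device for separating "runs in polynomial time" from "provably runs in polynomial time". It accepts every string, so the language it decides is trivially in P via a different machine; but proving that *this* machine's running time is polynomial would entail proving the conjecture.)

```python
def is_prime(n):
    """Trial division; fine for the tiny numbers searched here."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def goldbach_holds(m):
    """Is the even number m a sum of two primes?"""
    return any(is_prime(a) and is_prime(m - a) for a in range(2, m // 2 + 1))

def decide(x):
    """Toy decider: spends roughly |x| steps hunting for a Goldbach
    counterexample, and stalls for exponentially many steps only if one
    is found. If the conjecture is true, the machine always halts in
    polynomial time -- but a proof of that time bound would entail a
    proof of Goldbach's conjecture."""
    for m in range(4, 4 + 2 * len(x), 2):
        if not goldbach_holds(m):
            for _ in range(2 ** len(x)):   # exponential stall, never reached
                pass
            break
    return True   # accepts every input

assert decide("any input")
```

The same gating trick works with any open Π₁ statement; Goldbach is just a familiar stand-in.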
But there is <i>no</i> requirement that there exist a <i>proof</i> that the machine runs in polynomial time.<br /><br />Indeed (AFAICT), our present understanding of the complexity class <i>P</i> is entirely consistent with the hypothesis that for most languages in <i>P</i>, there exists *no* (finite length, finite number of axioms) proof—even in principle—that verifies their membership in <i>P</i>.<br /><br />Supposing that this be the case, then by a natural verification criterion the class <i>P</i> is not uniform (and thus, neither is the class <i>NP</i>). Some languages in <i>P</i> have proofs of that membership; others (perhaps the majority?) don't. <br /><br />One wonders whether more rapid progress on the vexing question of separating <i>P</i> from <i>NP</i> might be achieved by "splitting" these classes?<br /><br />We engineers care very deeply about these concrete verification and validation issues ... because for us, oracles that validate membership in <i>P</i> cannot be ordered from any catalog.<br /><br />In the event I am completely misguided with regard to the complexity (or even undecidability) of verifying membership in <i>P</i>, perhaps one of the big-name complexity theorists will blog about it. <br /><br />Such a discussion doubtless would be very interesting and illuminating to me and many! :)John Sidleshttp://www.mrfm.orgnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-49389165140200270262010-12-07T08:41:07.269-06:002010-12-07T08:41:07.269-06:00If I have a non-uniform upper bound, I want to kno...If I have a non-uniform upper bound, I want to know whether there's a uniform upper bound. If I have a uniform lower bound, I still want to know if there's a non-uniform lower bound. 
Both models are important; it's not that one makes the other irrelevant.<br /><br />Russell ImpagliazzoAnonymousnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-74290183182367407462010-12-07T07:30:23.469-06:002010-12-07T07:30:23.469-06:00Is the class P that you talk about in this the sam...Is the class P that you talk about in this post the same "P" as in P versus NP or just a poorly chosen name for a new class that you are talking about?Anonymousnoreply@blogger.comtag:blogger.com,1999:blog-3722233.post-28757092779712941502010-12-06T09:50:27.679-06:002010-12-06T09:50:27.679-06:00These broad considerations regarding uniformity ar...These broad considerations regarding uniformity arise specifically in the study of half-exponential functions, that is, functions satisfying the composition relation f(f(x)) ∝ exp(x) (this is called a fractional iteration relation).<br /><br />For complexity-theoretic purposes, it would be very convenient if (1) the coefficient of proportionality could be trivially renormalized to unity, (2) the normalized equation had a unique "naturally best" solution, and (3) that natural solution had a uniformly convergent power series.<br /><br />The set of exponential functions has all of these properties of naturality and uniformity, yet counter-intuitively (and somewhat unfortunately for complexity theory) the set of half-exponential functions has none of them.<br /><br />Of course, most of the special functions of mathematical physics lack uniform series expansions; entire functions like polynomials and exp are rare exceptions, not the generic case. 
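(A numerical aside sketched in Python, not part of the original comment: the growth sandwich behind f(f(x)) ∝ exp(x). No polynomial can be half-exponential, since composing a polynomial with itself stays polynomial, and no exponential can be, since composing an exponential with itself is doubly exponential; `poly` below is a hypothetical sample, not from the thread.)

```python
import math

def poly(x):
    return x ** 3   # sample polynomial; any fixed power behaves the same

x = 50.0

# poly(poly(x)) = x^9 already falls short of exp(x) at modest x,
# so no polynomial closes the gap from below:
assert poly(poly(x)) < math.exp(x)

# exp(exp(x)) overflows a float here, so compare in log space:
# log(exp(exp(x))) = exp(x) vastly exceeds log(exp(x)) = x,
# so no exponential closes the gap from above:
assert math.exp(x) > x
```

A genuine half-exponential must thread between these two regimes, which is one way to see why it cannot inherit the uniform series of either family.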
<br /><br />Thus, in practical functional computations one enthusiastically embraces non-uniform algorithms, and this can be accomplished systematically, for example, by exploiting Kummer relations among hypergeometric functions.<br /><br />Does the practical computation of half-exponential functions convey broader lessons relating to the roles of naturality and uniformity in complexity theory and quantum information theory? Under Scott Aaronson's recent topic "Anti-Complexitism" I've been arguing that <a href="http://scottaaronson.com/blog/?p=474#comment-56425" rel="nofollow">the answer is "yes"</a>.John Sidleshttp://www.mrfm.orgnoreply@blogger.com