Comments on Computational Complexity: "The Beauty of Computation" (Lance Fortnow)

I would also like to disagree somewhat with Lance. It is easier to do this if one adopts a Platonic stance: the great conceptual ideas exist out there. What we want are precise definitions that capture them. This is analogous to the physicists' desire to capture the universal laws that the world obeys.

-- CSProf (2017-03-13)

Surely Lisa's comment was intended to draw a distinction between science and the arts: when scientists make unsubstantiated claims or express hunches, they should be clear about it, especially when deviating substantially from what others in their field believe. I haven't read Rovelli's book, nor would I recognize such digressions, but it seems he took liberties that fall more on the artistic side of the divide than any of us would be comfortable with.

> Maybe that's what separates us from the physicists.

To the contrary, leaps of faith and the desire to find general truths pervade physics. An example is the concept of "universality," which takes notions like the equivalence of all reasonable computational models, which you write about, to a whole new level. Yet we all can (and should) be precise and accurate about distinguishing what is known, believed, hoped for, or plausible.
-- Dana Randall (2017-03-12)

Stephen Fenner and Lance Fortnow recently published a short, sweet paper of barely 10 pages titled "Compression Complexity". In Section 2 (Preliminaries) they define a function BB(m) and a program p_m of length at most m which encodes BB(m). For me, the way they use the program p_m as a practically usable encoding of the natural number BB(m) was very illuminating; it goes beyond the ideas I had previously associated with the busy beaver function.

Their encoding directly uses the time (i.e., the number of steps) until the Turing machine halts. But if we want to encode all natural numbers (not just BB(m)) in such a practically usable and compact way, it seems helpful to modify the definition of the Turing machine slightly: give it an additional NOP operation (a no-op from the perspective of the machine itself) which tells us to switch between counting steps and not counting them, with counting as the default before the first NOP is encountered. It is then easy to produce encodings of N+1, N+M, N*M, and "1 if N<=M and 0 otherwise" from encodings of N and M. Even encodings of logical formulas involving natural numbers and bounded quantification, giving 1 if the formula is true and 0 otherwise, should be possible (if I am not mistaken).

But this elaboration already tries to leave the domain of computation and enter the realm of logic, so it may be no surprise that precise definitions become desirable there.

-- Jakito (2017-03-10)
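Jakito's toggled step-counting idea can be illustrated with a toy interpreter. This is a loose sketch, not the Fenner-Fortnow construction: the op names `STEP`/`TOGGLE` and the list-of-ops "programs" are invented stand-ins for a Turing machine extended with the NOP described above, and `mul`/`leq` cheat by decoding their arguments first (which the "practically usable" property permits, since one can always run a program to recover its number).

```python
# Toy model (assumed, for illustration): a program is a list of ops.
# The encoded number is the count of steps executed while counting is
# switched on; TOGGLE (Jakito's extra NOP) flips counting on/off.
STEP, TOGGLE = "STEP", "TOGGLE"

def decode(program):
    """Run the program and return the natural number it encodes."""
    counting, count = True, 0   # counting is on by default
    for op in program:
        if op == TOGGLE:
            counting = not counting
        elif counting:
            count += 1
    return count

def encode(n):
    """The trivial encoding of n: n counted steps."""
    return [STEP] * n

def add(p, q):   # encoding of N+M: run p, then q
    return p + q

def succ(p):     # encoding of N+1
    return p + [STEP]

def mul(p, q):   # encoding of N*M: repeat q once per counted step of p
    return q * decode(p)

def leq(p, q):   # encoding of (1 if N <= M else 0)
    return [STEP] if decode(p) <= decode(q) else []

# Work wrapped in TOGGLEs is "free" bookkeeping: it does not change
# the encoded value, mirroring uncounted Turing-machine steps.
padded = [TOGGLE] + [STEP] * 100 + [TOGGLE] + encode(5)
assert decode(padded) == 5
assert decode(add(encode(5), encode(3))) == 8
assert decode(mul(encode(5), encode(3))) == 15
assert decode(leq(encode(3), encode(5))) == 1
```

The point of the toggle is visible in `padded`: arbitrary uncounted work can be interleaved with the counted steps, so composite programs can do bookkeeping without disturbing the number they encode.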
Very nice, succinct, and articulate essay! I am mostly with you.

One question, though: what you've said works nicely in the realm of computability (the Church-Turing thesis) or even feasible computability (the polynomial-time version of the C-T thesis). Does it work equally well when we speak of efficient computability? Our notion of "efficient" has probably undergone changes over the past couple of decades, with the underwhelming progress in massively parallel (think SIMD) computing, the surprising progress in "simple" distributed computing (MapReduce- and Hadoop-style systems), streaming and sketching computations, etc. If somebody asks me "is maximum matching efficiently computable?" I am not sure I have a clear answer. Is that due to a lack of sharpness in our models? I don't know.

-- D. Sivakumar (2017-03-09)