Comments on Computational Complexity: Favorite Theorems: Efficient Computation (Lance Fortnow, 10 comments)

Jeffrey Shallit (2010-04-01):
There is some biographical information about Cobham <a href="http://recursed.blogspot.com/2010/04/alan-cobham.html" rel="nofollow">here</a>.

Anonymous (2005-03-14):
Researchers are well aware of the target of achieving quasilinear-time algorithms, and there has been a small amount of interesting work in the area with that as the goal. However, in contrast to the value of the polytime concept, fixing this as a target has not particularly led to breakthroughs.

There are more restrictive criteria that have led to interesting breakthroughs; it's just that the quasilinear-time concept hasn't yet been one of them. (Linear-time targets have led people to such breakthroughs, e.g. linear-time coding.)

An even more restrictive criterion, one-pass algorithms using small space (so-called streaming algorithms), has also led to very interesting results.

Anonymous (2005-03-13):
<I>The value of any concept is its impact on how it changes your thinking. The notion of polynomial time fundamentally altered the way people thought about computational problems.</I>

My point is that it is time for yet another such change in thinking.
For a while the idea that polynomial = efficient worked well for us, but as our understanding has deepened, it is now time for an update to the definition of what is and isn't efficient.

Such adjustments happen every so often; it's not such a big deal. For example, we have now come to accept the word RAM as a perfectly valid and realistic model, along with its sharper bounds.

Anonymous (2005-03-11):
Some of this discussion completely misses the point. The value of any concept is its impact on how it changes your thinking. The notion of polynomial time fundamentally altered the way people thought about computational problems. Of course, once we have polynomial-time algorithms we can try to reduce the running time from n^2 to n log n or whatever, but the key point is that there is something fundamentally different about polynomial-time versus exponential-time approaches to problems.

New ways of thinking about problems have much bigger impacts in the long run than precisely which polynomial time bound we get. A major reason that the sizes of the problems we try to solve in practice have grown so much these days is the improvement in algorithmic approaches due to this polynomial-time thinking.

(BTW: if one is really worried about the practicality of polynomial-time algorithms, Robertson and Seymour have a family of n^3 algorithms for minor-closed families where the enormous constant factors are the main impediment.)

Anonymous (2005-03-10):
"We can bring it down to n^4"...
I've heard that one before, and it still doesn't fly. Current input sizes in real-life applications where we care about performance are usually in the thousands to the millions. In that range an n^4 algorithm would take somewhere between a trillion operations and a septillion operations.

Say, on an input of size 10,000, the algorithm would take about a year to run on a desktop PC. This is hardly what one would call an efficient algorithm in any practical sense of the word.

Anonymous (2005-03-09, 21:26):
The reason why n^80 should still be efficient is that if we come up with an n^80 algorithm for something, experience tells us that if we work a bit harder, we can always improve it to n^3 or n^4... or at least n^42.

Anonymous (2005-03-09, 15:40):
Yes, Joe O'Rourke and G. Tewari have a polynomial-time O(n^42) algorithm (yes, 42) for partitioning orthogonal polygons into fat rectangles. This type of decomposition is often studied in the context of VLSI.
The authors claimed to have tried to reduce the complexity to even n^41 and failed, meaning that the n^42 algorithm was their best effort and not just the result of sloppy analysis or design.

Anonymous (2005-03-09, 13:55):
Is there a natural polynomial-time solvable problem for which the best known algorithm takes time more than n^10?

Anonymous (2005-03-09, 13:09):
Actually, there are much earlier published references than Cobham on polynomial efficiency. Jeff Shallit at Waterloo tracked a published reference going back to the late nineteenth century, in the context of algorithms for factoring, I think.

At any rate, a better question is whether this definition still makes sense. For one, most practical computational models are within a polylogarithmic factor of each other.

As well, to argue that an n^80 algorithm is efficient is nonsense. Is it time for a new definition of efficiency?

Anonymous (2005-03-08):
Peter Bendix was a sophomore in Knuth's class. More information can be found on p. 320 of "All Questions Answered", Notices of the AMS, Volume 49, Number 3, pp. 318-324.

<A HREF="http://www.ams.org/notices/200203/fea-knuth.pdf" REL="nofollow">fea-knuth.pdf</A>
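The back-of-envelope n^4 figures quoted above (a trillion to a septillion operations for inputs in the thousands to millions) are easy to check with a short script. This is only a sketch: the rate of 10^9 simple operations per second for a desktop PC is an assumed round number, not a measurement, and real constants vary.

```python
# Rough running time of an O(n^4) algorithm at an assumed rate of
# 1e9 simple operations per second (an illustrative figure only).

OPS_PER_SEC = 1e9

def n4_runtime_days(n: int) -> float:
    """Days needed to perform n**4 operations at OPS_PER_SEC."""
    return n**4 / OPS_PER_SEC / 86_400

for n in (1_000, 10_000, 1_000_000):
    print(f"n = {n:>9,}: {n**4:.0e} operations, ~{n4_runtime_days(n):,.1f} days")
```

At n = 1,000 this is 10^12 operations (a trillion) and at n = 1,000,000 it is 10^24 (a septillion), matching the comment; at n = 10,000 it comes to roughly 116 days at this assumed rate, i.e. months to a year on a mid-2000s desktop, which is the commenter's point.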