Comments on Computational Complexity: Slowest Sorting Algorithms (Lance Fortnow)

Christopher Wilson (2021-01-09):
Foundational paper for the field:
"Pessimal Algorithms and Simplexity Analysis", Broder and Stolfi, ACM SIGACT 16(3), 1984
https://www.mipmip.org/tidbits/pasa.pdf

Lewis Baxter (2020-12-28):
Here is an exponentially slow Prolog program that reverses a list.

    rev([],[]).
    rev([A],[A]).
    rev([A|XB],[B|YA]) :-
        rev(XB,[B|Y]),
        rev(Y,X),
        rev([A|X],YA).
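The Prolog clauses above translate nearly line-for-line into Python. The sketch below is my translation (the name `slow_reverse` is mine, not from the thread): three recursive calls per level give the recurrence T(n) = 2T(n-1) + T(n-2), which grows like (1+√2)^n.

```python
def slow_reverse(xs):
    # Base cases: rev([],[]) and rev([A],[A]).
    if len(xs) <= 1:
        return list(xs)
    # rev([A|XB],[B|YA]) :- rev(XB,[B|Y]), rev(Y,X), rev([A|X],YA).
    a, rest = xs[0], xs[1:]
    b, *y = slow_reverse(rest)   # b: last element of rest; y: rest[:-1] reversed
    x = slow_reverse(y)          # undo that inner reversal, recovering rest[:-1]
    ya = slow_reverse([a] + x)   # reverse once more, with a prepended
    return [b] + ya
```

Already around n = 25 the call count is in the billions, so only small inputs are practical.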
gasarch (2020-12-26):
I think I have an answer to this. If you only compare pairs whose relative order you don't already know, then the algorithm has to stop within (n choose 2) comparisons.

So is there an algorithm whose worst case is (n choose 2)? Yes: bubblesort.

https://www.sparknotes.com/cs/sorting/bubble/section1/

For the average case there may be a different answer.

Marzio De Biasi (2020-12-26):
@dom my idea was to select each time an unused Turing machine (for which there is a proof that it is a sorter).

... not so natural.

To make it more natural, you can simply consider the length n of the input, start the search from the n-th Turing machine, and go backward until you find a Turing machine for which there is a valid proof (of size less than n) that it is a sorter, then simulate it on the original input. In this case the "inefficient" backward loop leads to a simulation of all sorters on an infinite number of inputs.

The running time is not bounded by any computable function.

dom (2020-12-26):
I think this question would be more interesting for non-redundant simple comparison trees. By this I mean that in every step we can compare two elements whose relation we do not yet know. How badly can we do for the best input, i.e., how long is the shortest branch?
Since every input ends up in a different leaf, sometimes we are done with fewer than n log n comparisons.

dom (2020-12-26):
This is not that bad at all. The size (description length) of the TM that discards inputs of length less than n and sorts the ones of length at least n is of order log n, so you only need to go through poly(n) short strings, because these all have short proofs. The only question is which TM comes first in your encoding, but this might even be quite a fast algorithm.

Marzio De Biasi (2020-12-25):
You can give a "logic twist" to the problem.

Given an unordered sequence of (input) length n:
- for all Turing machines Ti of size less than n
  - for all proofs Pj (in Peano arithmetic) of size less than n
    - check if Pj is a proof that Ti is a valid sorter
      - check that Ti has not been used for inputs of length less than n (in other words, for each n try to use a different Turing machine)
      - simulate the first unused Ti on the input
      - if no new Ti is found, use the last one

What is the running time of this sorting algorithm ... ? :-)

[Re-posted due to "less than" symbols being interpreted as HTML characters.]
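As a concrete check on gasarch's (n choose 2) worst case above, here is an instrumented bubble sort (my sketch; the function name and counter are mine, not from the thread). This version never exits early, so it performs exactly n(n-1)/2 comparisons on every input; on a reverse-sorted input, no pair of elements is ever compared twice, so none of those comparisons is redundant.

```python
def bubble_sort_counted(xs):
    """Bubble sort with a comparison counter.

    The inner loop shrinks by one each pass but never exits early,
    so the total comparison count is always n*(n-1)//2.
    """
    xs = list(xs)
    n = len(xs)
    comparisons = 0
    for i in range(n - 1):
        for j in range(n - 1 - i):
            comparisons += 1
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs, comparisons
```

For example, `bubble_sort_counted(list(range(10, 0, -1)))` returns the sorted list together with 45 = 10·9/2 comparisons.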
Igor (2020-12-24):
How about designing a really slow algorithm that doesn't sort the array correctly? Is that even better?

Lance Fortnow (2020-12-24):
For circuits, if you take the OR of n-input ANDs, one AND for each x such that f(x) = 1, this will give a large circuit for any balanced function that fits your formulation (modulo the NOTs on the input).
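The OR-of-ANDs construction can be written out directly; the builder below is my sketch (the name `or_of_ands` and the representation of AND gates as minterms are mine, not from the comment). For a balanced function on n bits it spends 2^(n-1) AND gates.

```python
from itertools import product

def or_of_ands(f, n):
    """Brute-force circuit for f: one n-input AND per satisfying
    assignment (a "minterm"), OR-ed together. Returns the circuit
    as a callable plus its number of AND gates."""
    minterms = [x for x in product([0, 1], repeat=n) if f(x)]
    def circuit(x):
        # x satisfies the circuit iff it matches some minterm exactly.
        return int(any(all(xi == mi for xi, mi in zip(x, m))
                       for m in minterms))
    return circuit, len(minterms)

# Majority on 3 bits is balanced: 4 of the 8 inputs satisfy it,
# so the construction uses 4 AND gates.
maj = lambda x: int(sum(x) >= 2)
```

This agrees with f on all 2^n inputs but is exponentially larger than a hand-built circuit for the same function.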
rgrig (2020-12-24):
That's some impressively slow algorithms!

One thing I was wondering is whether one can give a formal distinction between artificially slow and genuinely slow. My first instinct was that the concepts "artificial"/"genuine" are too vague to be made formal. But in that Twitter thread I made a proposal that possibly makes sense. It is for circuits rather than algorithms.

In the meantime, I think I have a simpler reformulation: each gate's output must be used, each gate must decrease entropy, and each gate must depend on all its inputs. The size of the circuit is the number of gates.

I'll illustrate what I mean by "decrease entropy" on an example. We want to compute the majority function on three variables, called 0, 1, 2. This circuit does the job:

    3 := 0 and 1
    4 := 1 and 2
    5 := 2 and 0
    6 := 3 or 4
    7 := 6 or 5

Let's check that the gate (6 := 3 or 4) decreases entropy. Variable 6 is set for 3 inputs (011, 110, 111), so its entropy is 3/8 log(8/3) + 5/8 log(8/5) = 0.954. Variables 3 and 4 jointly are 00 for 5 inputs (000, 001, 010, 100, 101), 01 for 1 input (011), 10 for 1 input (110), and 11 for 1 input (111). So their entropy is 5/8 log(8/5) + 3 (1/8) log(8) = 1.549. Since 1.549 > 0.954, the gate decreased entropy.

The constraints rule out gates with one input: IDENTITY and NOT do not decrease entropy; ZERO and ONE do not depend on their input. Gates with more than 2 inputs are not ruled out explicitly, but they are probably not desirable, since you want many gates.

I wonder if this makes sense. If so, does it generalize to algorithms?

stu (2020-12-24):
Steve Mahaney described this problem to me in '83, while we were at a conference at the University of New Hampshire.
Michael Ingrassia and I took up the challenge. Ingrassia proposed "generate all permutations, and look for the sorted one." I contributed the idea of finding the sorted permutation by sorting the list of permutations in lexicographic order. The recursive approach was obvious, as was the problem with it -- that the "sub-problems" were larger than the problem you're trying to solve. We dealt with this by simply limiting the number of recursive steps to the length of the original list, then falling back on quickSort ;-).

This version has the property that sorting an n-element list ultimately results in a call to quickSort on a list whose length is the n-fold iterate of factorial applied to n. Thus, sorting a 2-element list applies quickSort to a 2!! = 2! = 2 element list, whereas sorting a 3-element list applies quickSort to a 3!!! = 6!! = 720! element list.

Of course, you can make this worse still by limiting the recursion depth to f(n) for a suitably defined f, e.g., a computable function that is total but grows faster than any provably total computable function. We called the result Ingrassia-Kurtz sort, but it never caught on.
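A toy rendering of the scheme described above (my reconstruction; the name `ingrassia_kurtz_sort` and Python's built-in `sorted` standing in for quickSort are my substitutions). It relies on the fact that the lexicographically smallest permutation of a list is the sorted one.

```python
from itertools import permutations

def ingrassia_kurtz_sort(xs, depth=None):
    # To sort a list: form the list of all its permutations and recursively
    # sort THAT list lexicographically; its first element is the sorted
    # original. The recursion depth is capped at the length of the original
    # list, after which we fall back on an ordinary sort.
    if depth is None:
        depth = len(xs)
    if depth == 0 or len(xs) <= 1:
        return sorted(xs)  # the quickSort fallback
    perms = [list(p) for p in permutations(xs)]
    return ingrassia_kurtz_sort(perms, depth - 1)[0]
```

Only inputs of length at most 2 are feasible in practice: a 3-element input already asks the fallback to sort a 720!-element list, exactly as the comment predicts.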