Comments on Computational Complexity: Average Case versus Worst Case
Lance Fortnow, blogger.com (feed retrieved 2024-04-13)

A bottleneck in average-case analysis is that it is not sufficient to know the input distribution: distributions need to be tracked throughout the computation. When composing two algorithms P1;P2 sequentially on inputs from a collection I, the average time of the composition is the sum of the average times of the components, T(P1)(I) + T(P2)(P1(I)), where P1(I) is the output multiset produced by P1 on I. Even when the distribution of I is known, the distribution of P1(I) generally is not.

This has made the exact average-case analysis of a number of algorithms, such as Heapsort, unknown or hard. A language whose operations support tracking data distributions (represented as random bags) in a modular way is discussed in http://www.springer.com/computer/theoretical+computer+science/book/978-0-387-73383-8.

The approach led to new algorithms, such as Percolating Heapsort, addressing the open problem of the exact average time of Heapsort variants. The language MOQA (MOdular Quantitative Analysis) and the timing tool Distri-Track support static average-case analysis by tracking distributions and summing the average times of program components. Both worst-case and average-case measures can be derived for MOQA programs, including standard deviation and higher moments. Research is ongoing into linking smoothed analysis to this approach.
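The drift described above can be illustrated with a toy sketch (a hypothetical example, not tied to MOQA or Distri-Track): take I to be the uniform distribution over the permutations of three keys and let P1 be a single bubble-sort pass. The output multiset P1(I) is visibly non-uniform, so the average time of any P2 that follows cannot be read off from the original input distribution.

```python
from itertools import permutations
from collections import Counter

def one_bubble_pass(xs):
    """One left-to-right bubble pass: swap each adjacent out-of-order pair."""
    xs = list(xs)
    for i in range(len(xs) - 1):
        if xs[i] > xs[i + 1]:
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
    return tuple(xs)

# Uniform input distribution I: all 6 permutations of {0, 1, 2}, equally likely.
inputs = list(permutations(range(3)))
outputs = Counter(one_bubble_pass(p) for p in inputs)

# The output multiset P1(I) is concentrated on two permutations:
print(outputs)  # Counter({(0, 1, 2): 4, (1, 0, 2): 2})
```

Even this one-pass example shows why a second component's average time must be computed over P1(I) rather than over I.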
www.ceol.ucc.ie
-- Ton Lew, 2010-11-16

"What examples do we have of problems that are easy from the average-case point of view but hard (provably or otherwise) from the smoothed-analysis point of view?"

Barany, Vempala, and Vetta proved that the average-case complexity of two-player Nash equilibrium is polynomial. In contrast, Chen, Deng, and Teng proved that if the smoothed complexity of two-player Nash equilibrium is polynomial, then every PPAD problem can be solved in randomized polynomial time.
-- Anonymous, 2008-04-07

I have always felt that a good distribution with respect to which one should measure average-case complexity should come from an application. For example, a good input distribution for various congestion-minimizing routing algorithms could be random preferential-attachment graphs.
-- Anonymous, 2008-02-14

"What examples do we have of problems that are easy from the average-case point of view but hard (provably or otherwise) from the smoothed-analysis point of view?"

Despite appearing natural, this is not the right question. In both cases one needs a language or optimization problem and a distribution (in the average case on the inputs, in the smoothed case on the perturbation).
One can easily pick distributions that "separate" the two: for example, if both distributions are supported on single points, the former refers to one input for each input size, while the latter can become worst-case complexity.
-- Anonymous, 2008-02-14

What examples do we have of problems that are easy from the average-case point of view but hard (provably or otherwise) from the smoothed-analysis point of view?
-- Anonymous, 2008-02-14

kids say the darndest things
-- Luca, 2008-02-13

How important is smoothed analysis (http://www.cs.yale.edu/homes/spielman/SmoothedAnalysis/framework.html)?
-- Anonymous, 2008-02-13

"I worry though that these average-to-worst-case reductions imply less that the average-case complexity of these problems is harder than we expected and more that the worst case is easier than we thought."

In a sense, this is exactly right, and even partially understood. For example, the shortest vector problem (SVP) is NP-hard (even to approximate to within any constant). But lattice-based crypto is based on the hardness of approximating SVP to within a factor of, say, n (the dimension of the lattice). This problem is known to be in coNP, so it's "easier" than exact SVP. Is it still hard for poly-time algorithms?
One assumes.....
-- Anonymous, 2008-02-13

From a non-theorist point of view, complexity results seem mainly useful as tools for suggesting whether a particular approach to solving a problem is feasible. For example, when attacking an open problem it is quite valuable to know at least whether it is NP-hard, NEXPTIME-hard, non-elementary, or undecidable.

My view of average-case complexity was fundamentally shaken after learning that even undecidable problems, like the group word problem, can have low average-case complexity.
-- Anonymous, 2008-02-13

Why does worst-case complexity cover any distribution? Can't the bad case occur with measure 0?
-- Anonymous, 2008-02-13

Whoops. Fixed.
-- Lance Fortnow, 2008-02-13

Is "worst case complexity" supposed to appear twice in your quote?
(If so, I don't understand how studying the same thing before and after the 80s made you think that something happened in the 80s.)
-- rgrig, 2008-02-13

I think worst-case analysis helps to understand the problem, while average-case analysis helps to solve the problem.
-- Anonymous, 2008-02-13
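The earlier question about the bad case occurring with measure 0 can be made concrete with a minimal numeric sketch (the cost function below is made up for illustration): worst-case time is a supremum over inputs, so a single exponentially bad input dominates it even though, under the uniform distribution, that input has probability 2^-n and adds almost nothing to the average.

```python
n = 20
N = 2 ** n  # number of n-bit inputs, under the uniform distribution

def cost(x):
    """Made-up running time: one exponentially bad input, cheap elsewhere."""
    return N if x == 0 else n

worst = max(cost(x) for x in range(N))
average = sum(cost(x) for x in range(N)) / N

print(worst)    # 1048576: the single bad input sets the worst case
print(average)  # ~21: the bad input contributes only 1 to the average
```

This is why worst-case bounds "cover" every input distribution: they bound the cost pointwise, whereas an average-case bound can hide an input whose probability is negligible (or zero) under the chosen distribution.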