And what do we find under our theory Christmas tree? Our first present is a new randomized polynomial-time algorithm for linear programming from Kelner and Spielman. The card reads

In this paper, we present the first randomized polynomial-time simplex method. Like the other known polynomial-time algorithms for linear programming, the running time of our algorithm depends polynomially on the bit-length of the input. We do not prove an upper bound on the diameter of polytopes. Rather we reduce the linear programming problem to the problem of determining whether a set of linear constraints defines an unbounded polyhedron. We then randomly perturb the right-hand sides of these constraints, observing that this does not change the answer, and use a shadow-vertex simplex method to try to solve the perturbed problem. When the shadow-vertex method fails, it suggests a way to alter the distributions of the perturbations, after which we apply the method again. We prove that the number of iterations of this loop is polynomial with high probability.

Our next present is a list-decodable code from Guruswami and Rudra. Back in October I posted about the Parvaresh-Vardy code that list decodes a 1-ε fraction of errors using a code of rate O(ε/log(1/ε)). Guruswami and Rudra, for any constant δ>0, create a code with rate ε-δ.
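To get a feel for how much of an improvement the new rate is, here is a rough back-of-the-envelope comparison of the two bounds. This is purely illustrative: it ignores the hidden constant in the O(·) for Parvaresh-Vardy and picks an arbitrary δ for Guruswami-Rudra.

```python
import math

# For list decoding a 1-ε fraction of errors:
#   Parvaresh-Vardy: rate on the order of ε/log(1/ε)  (constant factor ignored)
#   Guruswami-Rudra: rate ε - δ for any constant δ > 0
delta = 0.001  # an arbitrary small constant δ, for illustration only

for eps in (0.1, 0.01):
    pv_rate = eps / math.log(1 / eps)  # up to an unknown constant factor
    gr_rate = eps - delta
    print(f"ε={eps}: Parvaresh-Vardy ≈ {pv_rate:.5f}, Guruswami-Rudra = {gr_rate:.5f}")
```

For small ε the Guruswami-Rudra rate stays within an additive δ of the information-theoretic limit ε, while the earlier bound loses a log(1/ε) factor.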

Many more presents under the tree, presumably many STOC and Complexity submissions. No proofs (at least correct ones) that P≠NP this year but maybe Santa will come through for us in time for next Christmas.

The Kelner-Spielman paper is great, but I think people should know the following. There are two main reasons for people's interest in poly-time simplex-type algorithms. The first is to explain why simplex performs so well on many real-world problems. Second, simplex seems to be the only candidate that could lead to a strongly polynomial-time algorithm for LP, this being the main theoretical motivation. Unfortunately, the Kelner-Spielman paper does several things in addition to running the shadow-vertex simplex that cause a dependence on numbers. They have to avoid degeneracy, which is explicitly introduced by their approach, by doing perturbation. They need to sample implicitly from an affine transformation that makes the polytope rounded. Both steps cause the algorithm to depend on the bits of the matrix, and it is unclear how this can be avoided by their approach. It is a nice, cute paper following some geometric ideas in the Spielman-Teng smoothed analysis for LP. However, there is some danger here that people might think that the main problem is solved. It doesn't seem that way (IMHO). It would be nice if other more knowledgeable folks could add to the discussion.

I think Christmas and Chanukah land on the same day every 19 years.

"...but maybe Santa will come through for us in time for next Christmas"

Then it won't be Christmas, it will be Armageddon.

Happy Holidays Lance,

Thanks for a great year of blogging!

Amar

Isn't the Kelner-Spielman algorithm closer to ellipsoid than it is to simplex? I mean, in the ellipsoid algorithm, you guess a bound on the optimal solution (which is where the dependence on input numbers comes in) and then check for feasibility. Here, (I think) they guess a bound on the optimal solution and then check for feasibility via boundedness. A real "simplex" algorithm should have the property that it takes some series of steps along a single polytope. I see how their algorithm is related to a simplex like algorithm, but I really don't see how it can be called a simplex algorithm.

I think there are some misconceptions in the above posts about the Kelner-Spielman algorithm that ought to be remedied.

1. The perturbation to avoid degeneracy that they use does not introduce any additional dependence on the precision of the inputs. As noted in Megiddo's original paper, it can be performed purely formally in terms of a symbolic parameter. The only reason that their algorithm isn't strongly polynomial is that it has to repeatedly revise the probability distribution that it uses for its simplex pivots, and the number of times that it needs to do so depends on the bit-length of the input. It is not obvious that this dependency can be eliminated, but it is also not obvious that it cannot be (by a different pivot rule). This seems to be real progress towards a strongly polynomial algorithm. In particular, it provides a combinatorial algorithmic approach that can work without proving the Hirsch conjecture. This removes a big roadblock...
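To see why symbolic perturbation adds no precision dependence, here is a minimal sketch of the standard lexicographic trick (this is the generic textbook construction, not the paper's exact setup; the function name and the choice of exponents are illustrative). Each right-hand side b_i is replaced by b_i + ε^i for an infinitesimal ε > 0, and every comparison the algorithm needs reduces to a purely combinatorial rule.

```python
# Symbolic perturbation: replace b_i with b_i + ε^i for an infinitesimal ε > 0.
# For 0 < ε < 1 we have ε^i > ε^j exactly when i < j, so comparisons reduce
# to a lexicographic rule on the pair (b_i, i) and no numeric ε is ever chosen.

def perturbed_less(b_i, i, b_j, j):
    """Decide whether b_i + ε^i < b_j + ε^j for every sufficiently small ε > 0."""
    if b_i != b_j:
        return b_i < b_j  # the ε terms are dominated by any real difference
    return i > j          # equal b's: a larger exponent means a smaller ε^i

# Two constraints with identical right-hand sides are now strictly ordered,
# so the degenerate tie disappears without touching the input numbers.
print(perturbed_less(3.0, 2, 3.0, 5))  # → False: ε² beats ε⁵, constraint 2 is larger
```

Since the tie-breaking uses only the constraint indices, the bit-length of the input never enters, which is the poster's point about where the real dependence (the revised pivot distributions) comes from.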

2. Their algorithm really is a simplex algorithm. They don't walk along the *feasible* vertices of a single polytope, but they do walk along the potential (feasible or infeasible) vertices of a single polytope. Many other simplex algorithms attempt to do this too (e.g., the self-dual simplex method). It's just that nobody had previously been able to design such an algorithm to run in poly time. The geometric stuff that the poster said looked like ellipsoid appeared in the analysis, not in the algorithm--the algorithm really is just a sequence of combinatorial pivots.

Independently of the question of strong polynomiality, it has been a very long open question whether any simplex method can run in poly time. Their paper resolves that, which is really exciting.

"All I Want for Christmas is a Proof that P≠NP"

For 2006 ... be nice. I have several elves randomly typing on PCs. Other elves have designed a Turing enumerator for generating proofs. What else can I do? Help me help you!

Dr. S. Claus.

I'd like to begin my question with many apologies.

I am an electrical engineering student dealing with optimization in my project. I need to find the best method for solving a quadratic programming problem with constraints. So far I have found the active set method, simplex, and interior point. From a complexity point of view, it seems interior point is the best one. Is that true? Is there another type of algorithm?