Monday, March 07, 2016

When do we care about the constants?

I've been reading two books recently: Asymptopia by Joel Spencer (he turns 70 soon, and there's a workshop for it! My nephew thinks that celebrating your birthday with a workshop would be... odd) and The Joy of Factoring by Samuel Wagstaff. In terms of content they are on two different topics. In terms of practicality they are also different: Asymptopia is clearly a pure math book (there is one chapter on algorithms, but the rest is really pure math), whereas The Joy of Factoring is very practical in that it focuses on real algorithms for the practical problem (important for cryptography) of factoring. However, there is one thing the books have in common: they both often care about multiplicative constants.

Example from Asymptopia: Spencer gives better and better lower bounds on the Ramsey numbers R(k):

(1) R(k) ≥ (1+o(1)) (k/(e sqrt(2))) 2^{k/2}, which is roughly (1+o(1)) (0.26) k 2^{k/2}

(2) R(k) ≥ (1+o(1)) (k/e) 2^{k/2}, which is roughly (1+o(1)) (0.37) k 2^{k/2}

(3) R(k) ≥ (1+o(1)) (sqrt(2) k/e) 2^{k/2}, which is roughly (1+o(1)) (0.52) k 2^{k/2}

(In case that's hard to read: the o(1) is little-o, a term that goes to 0 as k gets large.)

The first lower bound uses the probabilistic method, and you, the reader, have probably seen it or could probably derive it yourself. Probably. The second lower bound uses probability plus a clever way of coloring and then tossing out some vertices. The third lower bound uses the Lovász Local Lemma.
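For readers who haven't seen the first bound, here is a sketch of where its constant comes from; this is the standard first-moment calculation, not necessarily Spencer's exact presentation:

```latex
% Color each edge of K_n red or blue independently with probability 1/2.
% The expected number of monochromatic K_k's is at most
\[
  \binom{n}{k} 2^{1-\binom{k}{2}}
  \;\le\; \left(\frac{en}{k}\right)^{k} 2^{1-\binom{k}{2}} .
\]
% If this is < 1 then some coloring has no monochromatic K_k, so R(k) > n.
% Taking k-th roots, it suffices (up to lower-order terms) to have
\[
  \frac{en}{k}\, 2^{-(k-1)/2} < 1 ,
  \qquad\text{i.e.}\qquad
  n < \frac{k}{e}\, 2^{(k-1)/2} = \frac{k}{e\sqrt{2}}\, 2^{k/2} ,
\]
% which is exactly where the constant 1/(e sqrt(2)) = 0.26... comes from.
```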

Note that for this problem Joel Spencer cared about the constant.

Example from The Joy of Factoring: Since many (but not all!) factoring algorithms do not have rigorously proven run times (Number Theory is Hard!) it's harder to give clean examples here. The book often refers to tricks to get constants down, and the notion that constants matter permeates the book. Here is one rigorous example of caring about constants:

Fermat's difference-of-squares algorithm goes as follows: We want to factor N. Let x = floor(sqrt(N)). Test each of the following numbers for being a perfect square, and stop when you get one: x^2 - N, (x+1)^2 - N, (x+2)^2 - N, etc. When you find an r such that (x+r)^2 - N = y^2, then you have (x+r-y)(x+r+y) = N. Almost surely this is a nontrivial factorization of N. (This algorithm is worse than the trivial sqrt(N) trial-division algorithm in some cases; however, it has some of the ideas needed for more sophisticated algorithms, including the Quadratic Sieve.) Of course, one might be looking for the right r for a long time. How long?
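Here is a minimal Python sketch of the method as just described (my code, not the book's). It uses math.isqrt for exact integer square roots, starts at ceil(sqrt(N)) since the floor value gives a negative x^2 - N, and also counts how many values of x it tests, which we will use below:

```python
from math import isqrt

def fermat_factor(n):
    """Sketch of Fermat's difference-of-squares method for odd n.

    Tests x^2 - n for x = ceil(sqrt(n)), ceil(sqrt(n)) + 1, ... until
    it is a perfect square y^2; then n = (x - y)(x + y).  Returns the
    two factors and the number of values of x tested.
    """
    x = isqrt(n)
    if x * x < n:          # start at ceil(sqrt(n))
        x += 1
    steps = 1
    while True:
        d = x * x - n
        y = isqrt(d)
        if y * y == d:     # found x^2 - n = y^2, so n = (x-y)(x+y)
            return x - y, x + y, steps
        x += 1
        steps += 1
```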

Let a be the largest divisor of N that is ≤ sqrt(N), and let k = a/sqrt(N). Then the search will take

1 + (1-k)^2 sqrt(N)/(2k)

iterations.

Again note that there are no hidden multiplicative constants.
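(For the record, the count comes from the fact that the search stops at x + r = (a + N/a)/2.) As a sanity check, here is a hypothetical run of the sketch above on a product of two primes; the specific primes are my choice, not the book's:

```python
from math import sqrt

# 1,000,003 (the smallest prime above 10^6) and 1,299,709 (the
# 100,000th prime) are both prime, so a = 1,000,003 is the largest
# divisor of N that is <= sqrt(N).
N = 1_000_003 * 1_299_709
a = 1_000_003
k = a / sqrt(N)
predicted = 1 + (1 - k) ** 2 * sqrt(N) / (2 * k)

p, q, steps = fermat_factor(N)   # fermat_factor from the sketch above
print(p, q)                      # 1000003 1299709
print(steps, round(predicted))   # 9807 9807
```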

So when do we care about constants and why?

1) If you are working on an algorithm for a problem people really want to solve then you need the constants to be small.

2) If you can get good bounds on the exact constants then you should.

3) If you have a technique and try it out, you might end up just improving the constant. Even so, you have shown that the technique has merit.

4) Improving the constant may show progress which will later lead to more important improvements.

5) Chicken and Egg: Here is an example from Asymptopia where he didn't care about the constant: Fix ε. Given three points chosen at random in the unit square, what is the probability that the triangle they form has area ≤ ε? He showed it's Θ(ε). This proof is very nice. Tracking the constants used in his proof looks tedious. In order to care about the constants perhaps we need an interesting proof about them. To look for a proof technique that applies to them perhaps we need to care in the first place. Chicken and Egg?
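To get a feel for the Θ(ε) behavior (an experiment, not a proof; the setup is mine, not Spencer's), one can sample random triangles and watch the estimate scale linearly with ε:

```python
import random

def tri_area(p, q, r):
    """Area of the triangle with vertices p, q, r (shoelace formula)."""
    return abs((q[0] - p[0]) * (r[1] - p[1])
               - (r[0] - p[0]) * (q[1] - p[1])) / 2

def prob_small_triangle(eps, trials=200_000):
    """Estimate Pr[area <= eps] for three uniform points in the unit square."""
    hits = 0
    for _ in range(trials):
        pts = [(random.random(), random.random()) for _ in range(3)]
        if tri_area(*pts) <= eps:
            hits += 1
    return hits / trials

# If the probability is Theta(eps), halving eps should roughly halve
# the estimate.
for eps in (0.02, 0.01, 0.005):
    print(eps, prob_small_triangle(eps))
```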



4 comments:

  1. "Then the search will take 1+(1-k2\sqrt(N)/2k iterations."

    There seems to be a typo here as there are unbalanced parentheses and the expression doesn't make sense.

  2. I think "Local Lovasz Lemma" is generally written as "Lovasz Local Lemma"

  3. What about median finding being between (2+ε)n and 2.95n?
