Throughout this post I ignore polylog factors.

It is trivial to factor N in time N^{1/2}. Pollard's rho algorithm (see my write up here or the Wikipedia entry) for factoring does better: expected time N^{1/4}. Or does it? It works well in practice but has not been proven to work well in theory. (If I missed some paper that DID prove it works well in theory, please leave a polite comment.)
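The trivial N^{1/2} bound is just trial division: check every candidate divisor up to sqrt(N). A minimal sketch (the function name is mine):

```python
def trivial_factor(n):
    """Trial division: test each d with d*d <= n.
    Takes O(N^{1/2}) divisions, ignoring polylog factors."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d  # smallest nontrivial factor
        d += 1
    return n  # no divisor up to sqrt(n), so n is prime
```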

Here we state a conjecture that, if true, will show that Pollard's algorithm runs in (randomized) time N^{1/4}.

Let p be a prime. Let c be in {2,...,p-1}. Let f_{c}(x) = x^{2} + c mod p. x_{1} will be specified in the conjecture. x_{i} is f_{c}(x_{i-1}).

Conjecture: For at least half of the elements x_{1} in {2,...,p-1} and at least half of the elements c in {2,...,p-1}, the sequence x_{1}, x_{2}, ... will have a repeat within the first O(p^{1/2}) items.

This is thought to be true since the sequence is thought to be random enough for the birthday paradox to apply: a truly random sequence mod p has a repeat within O(p^{1/2}) terms with high probability. Still... no proof.
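To see why the repeat matters, here is a sketch of Pollard's rho for factoring N, iterating x^{2} + c mod N with Floyd's tortoise-and-hare cycle detection. A repeat mod an unknown prime factor p of N shows up as gcd(|x - y|, N) > 1, and if the conjecture holds, that repeat arrives after about p^{1/2} <= N^{1/4} steps. This is a standard textbook formulation; the function name and random choice of x_{1} and c are my conventions, not from the post.

```python
from math import gcd
import random

def pollard_rho(n, seed=None):
    """Find a nontrivial factor of composite n via Pollard's rho.
    Iterates f_c(x) = x^2 + c mod n; the tortoise takes one step
    per round, the hare two, so they meet once the sequence cycles
    mod some prime factor p -- conjecturally within O(p^{1/2}) steps."""
    if n % 2 == 0:
        return 2
    rng = random.Random(seed)
    while True:
        c = rng.randrange(2, n - 1)       # random c in {2,...,n-2}
        x = y = rng.randrange(2, n - 1)   # random starting point x_1
        d = 1
        while d == 1:
            x = (x * x + c) % n           # tortoise: one application of f_c
            y = (y * y + c) % n
            y = (y * y + c) % n           # hare: two applications of f_c
            d = gcd(abs(x - y), n)
        if d != n:                        # d == n: unlucky (x_1, c); retry
            return d
```

Note that the gcd is what converts a repeat mod p (which we cannot observe directly, since we don't know p) into a factor of N.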

In some algorithms papers the interest is in getting an algorithm whose run time you can PROVE. By contrast, in papers on factoring, discrete log, and other problems used to break crypto systems, the interest is more in getting something that actually works. I have to learn to stop thinking ``but they haven't proven that!'' and start thinking ``Oh, yes, that would work''. And to be fair, for Pollard's algorithm and others (e.g., the quadratic sieve and the number field sieve, which do better in practice than Pollard for large enough numbers) there are REASONS to think they will work well.

More generally, theoretical and applied work may need different mentalities.