Actually this is an idea due to Levin. Let p_{1},
p_{2}, ... be a list of all programs. On some input m simulate
program p_{1} for half of your computation time, p_{2}
for a quarter of the time, p_{3} for an eighth of the time,
etc., until one of these programs outputs a number that divides m (a claimed factor is easy to check with a single division). If
p_{i} is the fastest algorithm for factoring then our
algorithm will run in time at most 2^{i} times the running
time of p_{i}. The multiplicative factor 2^{i} is
independent of m but unfortunately could be quite large.
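The scheduling above can be sketched in a few lines. This is only an illustration, not Levin's actual construction: instead of enumerating all programs, it takes a small hand-picked list, modeled as generators that yield once per computation step. The names `levin_search`, `trial_division`, and `useless` are made up for this sketch.

```python
def trial_division(m):
    """A deliberately slow factoring 'program': one yield per step."""
    d = 2
    while d * d <= m:
        if m % d == 0:
            yield d          # found a nontrivial factor
            return
        d += 1
        yield None           # one step of computation, no answer yet
    yield m                  # m is prime; report m itself


def useless(m):
    """A 'program' that runs forever without finding anything."""
    while True:
        yield None


def levin_search(programs, m):
    """Run program p_i with a time share halving in i, as in Levin search.

    In round t, p_i is advanced 2^(t-i) steps (for t >= i), so each
    program gets half as many steps per round as the one before it;
    over all rounds p_i receives roughly a 2^-(i+1) fraction of the time.
    """
    runs = [p(m) for p in programs]
    t = 0
    while True:
        for i, run in enumerate(runs):
            if run is None:          # this program already halted
                continue
            steps = 2 ** (t - i) if t >= i else 0
            for _ in range(steps):
                try:
                    out = next(run)
                except StopIteration:
                    runs[i] = None
                    break
                # Verify any claimed factor before trusting the program.
                if out is not None and m % out == 0:
                    return out
        t += 1


print(levin_search([useless, trial_division], 91))  # prints 7
```

Note that `useless` sits first in the list, yet the search still terminates: the slower program only costs a constant multiplicative overhead, which is the whole point of the construction.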

Marcus Hutter gives another algorithm with a multiplicative factor of only 5, but at the cost of a large additive constant. The trick is to spend some of your time searching for a proof that an algorithm is correct and runs in a certain amount of time. You then only need to simulate the provably fastest algorithm found so far.

Hutter's algorithm works only as fast as the provably best algorithm with a provable running time. It could very well be the case that there is some good heuristic for factoring that does not have a provable running time or proof of correctness. Levin's technique will capture this case.

Of course, neither of these papers gives a practical algorithm, as the constants involved go beyond huge. Nevertheless it is still interesting to see the theoretical possibilities of universal search.
