a) GN was an upper bound on a problem in Ramsey theory. There are now better upper bounds, see here. These better upper bounds are still large (Hales-Jewett large), but that's already much smaller than the original GN.

b) Other proofs now have numbers even larger than GN. For example, Harvey Friedman's work on the finite versions of Kruskal's Tree Theorem. (There may be other cases; if you know some then let me know in the comments.)

Since my dept recently moved buildings, I found old papers that I had not looked at in years. One of them was

*Old and New Problems and Results in Combinatorial Number Theory*

by Erdos and Graham

(see here)

So I began reading it and came across a result of Graham from 1964 that used large numbers. Nowhere near as large as GN, but I found it interesting that Graham was involved with large numbers way back then.

Here is the problem:

A *Lucas sequence* is a sequence that obeys

a(n) = a(n-1) + a(n-2).

Clearly such a sequence is determined by a(0) and a(1).

QUESTION: Do there exist a(0) and a(1) that are relatively prime such that the sequence has only composite numbers?

By ingenuity and some computing power Graham found that the answer is YES. For how he got the numbers see here. The numbers are of course in the paper, and how he got them is interesting, but I present them here anyway. Hope I don't make a typo:

a(0) = 1786772701928802632268715130455793

a(1) = 1059683225053915111058164141686995
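As a sanity check on the numbers above (assuming I transcribed them correctly), here is a sketch that spot-checks the first 20 terms of the sequence with a probabilistic Miller-Rabin test. This is of course not Graham's argument, which shows that *every* term is composite; it only confirms no small primes sneak in.

```python
import random
from math import gcd

def is_probable_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin: False means definitely composite; True means
    prime with overwhelming probability."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    # Write n - 1 = d * 2^s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True

a0 = 1786772701928802632268715130455793
a1 = 1059683225053915111058164141686995

# First 20 terms of the sequence a(n) = a(n-1) + a(n-2).
terms = [a0, a1]
while len(terms) < 20:
    terms.append(terms[-1] + terms[-2])

print(gcd(a0, a1))                               # the seeds should be relatively prime
print(any(is_probable_prime(t) for t in terms))  # no (probable) primes among the terms
```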

The paper Old and New... says it's open whether there is a smaller pair of numbers; I do not know if it is still open. If you know, let us all know in the comments!

These numbers seem small today since we have modern computers that can store and manipulate them easily. Were they considered large numbers in 1964? They were never called Graham Numbers, which is probably just as well since that honor lay ahead.

I think that definition requires some qualification. For instance, it's not uncommon in computability theory to just say things like "let s be greater than the largest use of any function on any inputs below a number mentioned so far in this proof." Usually these are just one part in an infinite construction so don't really count, but I bet if you looked carefully you'd find a proof where some lemma in computability theory was implicitly using something that was effectively applying the busy beaver function to the busy beaver of some huge number.

I mean morally it shouldn't really count, because whenever it's really a specific number being used it's usually a result of pure laziness. Ehh, it just needs to be finite, and I've only executed finitely many partial functions on finitely many inputs, so let's just say the bound is the max of the results of Turing machines with index less than the largest number used so far in this proof, with input less than the largest number used so far in this proof.

AH- Vinogradov's theorem:

Every SUFF LARGE odd number can be written as the sum of three primes.

I asked a number theorist how large suff large was. He said `prob 10^{80} but I haven't bothered to check'

I wonder if anyone had bothered to check.

Later it was shown for all odds \ge 5.

But YES- there could be a theorem like this where the number, if someone bothered to find it, would be bigger than GN or other large numbers in math.
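For the small cases, a brute-force sketch (the 10,000 cutoff is arbitrary, just for illustration) confirms that every odd number from 7 up has a three-prime representation, while 5 does not, matching the correction in the comments below.

```python
def primes_upto(limit):
    """Sieve of Eratosthenes: all primes <= limit, in order."""
    sieve = [True] * (limit + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = [False] * len(sieve[i * i :: i])
    return [i for i, is_p in enumerate(sieve) if is_p]

def three_prime_rep(n, primes, prime_set):
    """Return (p, q, r) with p <= q <= r, all prime, p+q+r == n, else None."""
    for p in primes:
        if 3 * p > n:          # need p <= n/3 for p <= q <= r
            break
        for q in primes:
            if q < p:
                continue
            if p + 2 * q > n:  # need q <= r = n - p - q
                break
            r = n - p - q
            if r in prime_set:
                return (p, q, r)
    return None

LIMIT = 10_000
primes = primes_upto(LIMIT)
prime_set = set(primes)

failures = [n for n in range(7, LIMIT + 1, 2)
            if three_prime_rep(n, primes, prime_set) is None]
print(failures)  # [] -- no odd counterexamples from 7 to 10,000
```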

Bill, the link that you gave to the Erdos-Graham paper is off.

:)

Try this: http://www.math.ucsd.edu/~ronspubs/79_09_combinatorial_number_theory.pdf

Thanks.

Fixed.

>Later it was shown for all odds \ge 5.

Surely that should be all odds > 5.

The smallest sum of three primes is 6...