## Tuesday, July 31, 2012

### A natural function with very odd properties

Last time I posted some questions. Today I post the answers that I know.

1. Is there a subset of [0,1] that is uncountable and has measure 0?  YES- take the Cantor Set.  Many readers knew this.
2. Is there such a set that is natural? Some commenters thought the Cantor Set was natural. I disagree; however, this is a matter of opinion and taste.
3. Is there a function that is continuous everywhere and differentiable nowhere?  YES- take the Weierstrass Function.  Many readers knew this.
4. Is there such a function that is natural? Some think the Weierstrass Function is natural. I disagree because it was constructed for the sole purpose of being continuous everywhere and differentiable nowhere.  But again, there are those who think it is natural and this is a matter of opinion.
5. Is there a function that has derivative 0 at almost every point of [0,1] even though it is strictly increasing? YES- take the Cantor Function, also known as the Devil's staircase. Some readers knew this.
6. Is there a natural such function? YES- and this was the motivation for the post. (I also didn't want the example I read about mentioned in the comments of the last post, which is why I wanted all comments to be non-anonymous- so if I blocked one I could email the author WHY I blocked it.)

The function I will talk about came about because of a real problem having nothing to do with finding weird functions. Hence I think it is unambiguously natural.

I base my exposition on the exposition in How to Gamble if you Must by Kyle Siegrist. The original source is the classic book Inequalities for Stochastic Processes: How to Gamble if you Must by Dubins and Savage. Classic or not, unless that book is free online the future will credit Siegrist with the result (even though Siegrist references Dubins and Savage).

Consider the following game: Let 0 < p < 0.5 and 0 ≤ x ≤ 1. We will view p as a probability and x as your initial fortune.
You place a bet (which has to be ≤ what you have).
A coin is flipped with prob of WIN being p and of LOSS being 1-p.
If you win you double your bet. If you lose you lose your bet.
You repeat until you have 0 or you have 1.

You could play TIMIDLY: bet a fixed small amount every time.
This is analyzed in the paper and is interesting.

You could play BOLDLY: if you have < 1/2 then bet all you have,
and if you have ≥ 1/2 then bet so that if you win you'll have 1
and can stop.

If you play BOLDLY then
what is the probability that you will end with 1?
We will fix p and let F(x) be the prob if you begin with x.
(We assume 0 ≤ x ≤ 1.)
F satisfies the following recurrence:

F(x) = pF(2x) if x ∈ [0,0.5]

F(x) = p + (1-p)F(2x-1) if x  ∈ [0.5,1].

F(0)=0, F(1)=1

This is enough to define F at every dyadic rational in [0,1] (use the expansion in base 2).
Then use continuity to define F on all of [0,1].
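The recurrence is easy to evaluate numerically by peeling off binary digits of x. Here is a minimal Python sketch (the function name and the truncation depth are my own choices, not from the paper):

```python
def bold_play_win_prob(x, p, depth=60):
    """Prob. of reaching fortune 1 under bold play, starting with fortune x.

    Evaluates the recurrence F(x) = p*F(2x) on [0,0.5] and
    F(x) = p + (1-p)*F(2x-1) on [0.5,1], with F(0)=0 and F(1)=1,
    by following the binary expansion of x for `depth` digits.
    """
    prob, scale = 0.0, 1.0
    for _ in range(depth):
        if x == 0.0 or x == 1.0:
            prob += scale * x      # base cases F(0)=0, F(1)=1
            break
        if x < 0.5:
            scale *= p             # F(x) = p * F(2x)
            x = 2 * x
        else:
            prob += scale * p      # F(x) = p + (1-p) * F(2x-1)
            scale *= 1 - p
            x = 2 * x - 1
    return prob
```

For example, with p = 1/4 the recurrence gives F(1/2) = p = 1/4 and F(1/4) = p·F(1/2) = 1/16, and the function above agrees; plotting it on a fine grid shows the strictly increasing, flat-almost-everywhere shape.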

The paper claims that F has derivative 0 at almost every point of [0,1]
even though it's strictly increasing.
I leave this for the readers (translation into English: I don't know the proof).
However, I emailed the author, who emailed me the following:

There are proofs in the book Probability and Measure by Patrick
Billingsley. In the third edition, it's Example 31.1 on page 407.  The fact
that F is continuous and strictly increasing on [0, 1] is fairly easy to
see, based on the definition of the underlying random variable.  The fact
that F has derivative 0 is not as easy to see intuitively--the proof is
more of a hard analysis type proof.

The paper assumes that money is continuous and our lives are indefinitely long. There is a paper, How to Gamble if you're in a Hurry, that looks at these problems from (to quote its abstract) a purely discrete, finitistic, and computational viewpoint.

## Monday, July 30, 2012

### Six Questions about unnatural and natural mathematical objects

Today (Monday) I pose some questions. In my next post (Tuesday) I will post the answers
that I know (some I do not know). Some questions are a matter of opinion,
depending on what you consider natural.

1. Is there a subset of [0,1] that is uncountable and has measure 0?
2. Is there such a set that is natural?
3. Is there a function that is continuous everywhere and differentiable nowhere?
4. Is there such a function that is natural?
5. Is there a function that has derivative 0 at almost every point of [0,1] even though it is strictly increasing?
6. Is there a natural such function?

I request that all comments be NON-anonymous.  (I'll tell you why when I post the answers I know on Tuesday.)

## Wednesday, July 25, 2012

### The Combinatorics of Batman

(I wrote this post about a year ago but waited until the new Batman
movie came out to post it. I haven't seen the movie yet so
there may be more possibilities to add to this.)

There have been many versions of the BATMAN story:
comic books (and within that there are several versions),
many movies, and a TV series.

So how many ways can the BATMAN story go?
This is NOT A QUIZ- all of the answers are correct
for some version of BATMAN.

1) As a child Bruce Wayne saw his parents gunned down.  What were they doing before this happened?

a) Watching the movie Zorro.
b) Watching the movie The Lone Ranger.
c) Watching an Opera.

2) Who shot Bruce Wayne's parents?

a) Joe Chill- a low level mugger.
b) Joe Chill- hired by the mob since Thomas Wayne (Bruce Wayne's father) had once foiled a crime. The orders were to leave the boy alive so it would look like a low level mugging.
c) The gangster who would later become the Joker.

3) Who was the Joker before he became the Joker?

a) A completely innocent chemist-turned-comedian.  He was a good man but a bad comedian. He was just going to do one theft to make ends meet. Things went wrong and he fell into a batch of chemicals and became The Joker.  He is now a bad man but a good comedian.
b) A gangster who was sleeping with his boss's girlfriend.  The boss arranged for him to be killed, but instead the gangster fell into a batch of chemicals and became The Joker.
c) Hey, it's just makeup. But he has scars, which may have come from his father, himself, or who knows?
d) There are surely other versions I do not know.  One could probably write a bad PhD on this topic.

NOTE: I don't think the Joker's name is known in any of the versions.
(Contrast: The Riddler's name is Edward Nigma.)

4) Where is the Joker now?

a) In Arkham Asylum.
b) Gee, he really seemed to die, but we just know he'll turn up again.

5) Who is Two-face?

a) Harvey Dent, the DA, who was a good man. During a court trial a criminal threw acid on his face. Now half of his face is scarred. This drove him insane and he is now a criminal.
b) Harvey Dent, the DA, who was a good man. Dent and his girlfriend Rachel Dawes are kidnapped. Batman and Commissioner Gordon save him, but (1) Rachel dies, and (2) Dent has half his face disfigured. This drives him mad.
c) There are about 5 other people who took on this role. One of them was named Harvey Kent. No relation to Clark Kent.

6) Who else lives in Wayne Manor?

a) Robin, Alfred the Butler, and Aunt Harriet.
b) Robin and Alfred the Butler.
c) Just Alfred the Butler.

7) How well do Batman and Superman get along?

a) Superman is never mentioned.
b) They get along.
c) They don't get along.
d) They REALLY don't get along!

This leads to a combinatorics question and a Batman question.
COMBINATORICS: Assuming that all of these options are independent, how many versions of the
Batman Legend could there be? This should be easy.
BATMAN: Of all of these, how many have actually been realized?
This might take a Batman scholar to figure out.
And I haven't even talked about Robin (nor will I).

There are also variants of the COMBINATORICS question if you disallow certain
combinations. For example, if Aunt Harriet exists then Superman does not.
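The counting is a one-line product, and the constrained variant is a short filter. Here is a Python sketch (the encoding of the Aunt Harriet constraint is just my reading of the example above):

```python
from itertools import product

# Number of listed options for questions 1 through 7 above.
options = [3, 3, 4, 2, 3, 3, 4]

# Unconstrained count: the product of the option counts.
total = 1
for n in options:
    total *= n                      # 3*3*4*2*3*3*4 = 2592

# Constrained count: "if Aunt Harriet exists then Superman does not."
# Question 6 option (a) (index 0) includes Aunt Harriet; question 7
# option (a) (index 0) is "Superman is never mentioned."
constrained = sum(
    1 for combo in product(*(range(n) for n in options))
    if not (combo[5] == 0 and combo[6] != 0)
)                                   # 2592 - 3*3*4*2*3*1*3 = 1944
```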

I'm sure there are more options I don't know about.
If you know any, comment!

## Monday, July 23, 2012

### CCC12- post 4 of 4- Misc Info.

The business meeting and other observations.

1. Programming committee info:
1. There were 119 submissions, of which 18 were junk (more on that later). This is pretty large- Paris 2009 had 113 submissions. The junk papers were not so much papers that claim they proved P=NP or P ≠ NP; they were papers where it is hard to know what they are claiming. (Hmmm- there are valid papers where, after you go through all the definitions, it's not clear what they are claiming.)
2. 34 papers were accepted, second only to Paris's 37. (In both Paris and this year there was no Rump Session- that could be why: no time for one.)
3. Andrew Drucker won the Best Student Paper award for Limitations of Lower-Bound Methods for the Wire Complexity of Boolean Operators.
2. There were roughly 60 people at the conference. Here is a list of all attendance figures up through 2008 (if you know of the attendance in the later years let me know and I will add them).
3. Steering committee:
1. Johan Hastad and Manindra Agrawal have been on the committee but their term has expired so they are now off of it.
2. The steering committee put Madhu Sudan on.
3. An election by people at the meeting put Venkat Guruswami on.
4. Peter Bro Miltersen is stepping down as chair (his term expired) and Dieter is the new chair. Peter will stay on one more year (I think).
4. Future Conferences:
1. 2013: the conference will be in Palo Alto, CA, co-located with STOC: June 1-4 STOC, June 4-7 CCC. I WILL BE THERE!
2. 2014: the conference will be in Vancouver, with Valentine Kabanets as local arrangements chair. There were no competing bids (are there ever?). I WILL BE THERE!
3. 2015: Depends on if there is an FCRC and if we join it. In any case I WILL BE THERE! The Steering committee will decide these things later.
5. This year there were electronic proceedings (I think for the second time). That's good, and I'm glad we don't have paper, but I would prefer if the proceedings were available on a public website before the conference (SODA has done that) or at least after (if it IS available and I missed that, leave an intelligent comment about it). On the other hand this might not matter much since most of the papers are available online anyway.
6. My posts on the content of the conference didn't get many comments, nor did I think they would. I hope they were useful to you. Forcing myself to do it was very useful to me, and for that I thank you, the readers. (For abstracts of ALL of the papers without my opinions, see here and press on abstracts (not on the left column).)
7. On my Honeymoon in 1991 I went on a cruise and we were cut off from ALL news. When we got back the first thing I heard was The crisis is over, the tanks are leaving Moscow. This was somewhat alarming. They were referring to the 1991 Soviet Coup d'etat attempt. By contrast, in 2012 in Portugal, I heard about the Supreme Court decision on health care within an hour of when it happened. I missed the earlier incorrect accounts. Is it good or bad that it's harder to get away from news coverage? Lance posted on a related topic a while back.
8. I was gone for a total of 20 days and did not check email once. (I had a vacation message on.) I came home to 474 emails, of which 20 needed to be responded to. None of them were crucial.

## Thursday, July 19, 2012

### CCC 2012- Post 3 of probably 4

Post 3 of n on CCC 2012. I still don't know what n is.
I summarize the third and fourth day of the conference.
(The fourth day was only a half-day).

Thursday June 27 Morning Session:

1. Matrix Lie Algebra Isomorphism by Grochow. This paper was called (and the link above still uses the old name) Lie Algebra Conjugacy. Some isomorphism problems for Lie Algebras are equivalent to Graph Isomorphism. In general, harder math does not mean harder computationally.
2. On Sunflowers and Matrix Multiplication by Alon, Shpilka, Umans. Erdos and Rado made a conjecture about sunflowers. Coppersmith and Winograd made a conjecture in combinatorics which would, if true, yield a better matrix multiplication algorithm. This paper shows (roughly) that if the ER-conjecture is true then the CW-conjecture is false. Neither conjecture sounds that intuitive to me, so I don't know what to make of this.
3. Algebras of minimal multiplicative complexity by Blaser and Chokaev. I couldn't find a link- if you know one email it to me.
4. Invited Talk Prospects for Geometric Complexity Theory by Peter Burgisser. So what are the prospects? Mixed. I was happy to hear that there are people working on this, but the talk was not that optimistic.
Thursday June 27 Afternoon Session:
1. A Strong Direct Product Theorem for Quantum Query Complexity by Lee and Roland. Direct Product Conjectures say (roughly) that if a problem takes T blahs to do, then doing k instances of it given to you all at once takes kT blahs. There are many of these depending on your interest. This paper shows something stronger: if you want to compute just the XOR of the k instances, it will take roughly kT quantum queries. (That is not quite right- there are a lot of details I left out.)
2. A Strong Parallel Repetition Theorem for Projection Games on Expanders by Raz and Rosen. A parallel repetition theorem usually says that if the prob of (say) winning one game is p, then the prob of winning n games is p^n (that can't be quite right, but that's the idea). The real question is how fast the prob goes down. This paper adds to our knowledge on this.
3. Share Conversion and Private Information Retrieval by Beimel, Ishai, Kushilevitz. I couldn't find an online version. If you know of one, please email me the link. The paper claims to unify and simplify PIR protocols!
4. Nondeterministic Circuit Lower Bounds from Mildly Derandomizing Arthur-Merlin Games by Aydinlioglu and van Melkebeek. There have been theorems along the lines of Lower bounds imply Derand. There have also been some (and this is one of them) along the lines of Derand implies Lower Bounds. As the authors point out, when you first hear the result you may think one of two things: (1) GOOD NEWS- to get lower bounds, all I have to do is derandomize! (2) BAD NEWS- to derandomize I HAVE to prove lower bounds, which are hard to prove. They claim a third interpretation of their paper: if you can derandomize, then you can do so in a nice way.
5. Hitting Set Generators for Sparse Polynomials over any Finite Fields by Lu. I could not find a link- if you know one email it to me.
6. Pseudorandom Generators for Read-Once ACC0 by Gavinsky, Lovett, and Srinivasan. Or see the talk on video from IAS.
Friday June 28 Morning Session
1. Non-Malleable Extractors with Short Seeds and Applications to Privacy Amplification by Cohen, Raz, and Segev. An extractor (informally) takes two strings and outputs a string that looks random. Strong Extractors and non-malleable extractors are stronger notions. Non-malleable extractors produce strings that look random, even to an adversary who is trying to make trouble. Hence they are good for Privacy Amplification. This paper has unconditional constructions of them!
2. Better Condensers and New Extractors from Parvaresh-Vardy Codes by Ta-Shma and Umans. I lose track of condensers vs extractors vs expanders vs dispersers.
3. List Decoding Barnes-Wall Lattices by Grigorescu and Peikert. List decoding is when, instead of finding what a string decodes to, you find a small set of candidates, one of which is correct. What is a Barnes-Wall lattice? The paper does a good job explaining this. I wonder if the original paper by Barnes and Wall does as good a job?
4. Space-efficient algorithms for reachability in surface-embedded graphs by Stolee and Vinodchandran. One of the consequences of their results is log-space algorithms for some graph-reachability problems. Maybe L=NL!
5. Space Complexity in Polynomial Calculus by Filmus, Lauria, Nordstrom, Thapen, Zewi. Proof complexity. While lower bounds on Resolution are well understood, what about more powerful proof systems? This paper is a good start on that.
6. Combinatorial PCPs with Short Proofs by Or Meir.
I wish the PCP theorem itself had a short proof. Perhaps you can prove that it does.

## Wednesday, July 18, 2012

### CCC 2012: Post 2 of n

Post 2 of n on CCC 2012. I still don't know what n is.
I summarize the second day of the conference.

Wednesday June 27 Morning Session:

1. A Satisfiability Algorithm and Average-Case Hardness for Formulas over the Full Binary Basis by Seto and Tamaki.
2. Approximating AC^0 by Beame, Impagliazzo, Srinivasan.
3. DNF Sparsification and Faster Deterministic Counting by Goplan, Meka, Reingold.
All three of the above papers solve a hard problem in slightly better than the usual worst case. All three of them (I think) want to use this to obtain lower bounds. (Recall that this is ultimately how Ryan got his lower bound- via an algorithm. I want to see a SODA paper that gets an algorithm via a lower bound.) Something bothers me about this. If we are getting algorithms that beat (albeit, just barely) brute force search, perhaps we should be yelling MAYBE P = NP instead of LET'S USE THIS TO SHOW THAT P ≠ NP. Just a thought.
4. Invited talk Communication Complexity and Information Cost: Foundation and New Directions by Toniann Pitassi.
This was an excellent talk. I didn't realize it was an hour-long invited survey talk instead of a 20-minute normal talk. I kept waiting for (a) her results, and (b) the talk to end, even though I was enjoying it.
ADDED LATER: Toni emailed me her slides. They are here.
Wednesday June 27 Afternoon Session:
1. Gaussian Noise Sensitivity and Fourier Tails by Kindler and O'Donnell. They first define a new kind of sensitivity, Rotation Sensitivity. They then show that this type of sensitivity is subadditive. This is used to obtain simple (or at least simpler) proofs of the Gaussian Isoperimetric Inequality in certain cases, and a lower bound on approximating max-cut of .8787 (not the best known, but very close, with a simpler proof).
2. Junto-Symmetric Functions, Hypergraph Isomorphism, and Crunching by Fischer, Garcia-Soriano, Matsliah. Two Boolean functions are isomorphic if they are the same up to a relabelling of the variables. Given two Boolean functions, can we tell if they are isomorphic? One way is to obtain the truth table for both of them and then see if some permutation of the variables makes them the same. Even if all we care about is queries, this seems like a lot. Can we do this with fewer queries? That is, we are asking property testing questions. What if we restrict the functions? Then what? Read the paper to find out!
3. Complexity lower bounds through Balanced Graph Properties by Moshkovitz.
This looks hard! They mention Tensors!
4. Limitations of Lower-Bound Methods for the Wire Complexity of Boolean Operators by Andrew Drucker. First there were pathetic lower bounds on wire complexity. Then, after what seemed like half a century, there were (just recently) some slightly less pathetic lower bounds. Even so, that was promising. This paper shows NO, don't get too excited: the new technique will not get better results. Gee Andy Drucker, couldn't you have let us dream just a bit longer? The most depressing talk I have ever seen. (It was a GREAT talk and a GREAT result, but still depressing.)
5. Limits on Alternation-Trading Proofs for Time-Space Lower Bounds. Fortnow, Lipton, van Melkebeek, and Viglas showed that any algorithm for SAT that uses log space must take time at least n^1.61 (the exponent being the golden ratio). Since the golden ratio is such a natural constant (not literally a natural number), it was plausible that current techniques could not improve that. But Ryan Williams got n^c where c is 2cos(π/7), which is roughly 1.801. This can't possibly be the best we can do, since it's just such a funny-looking number. Ryan Williams wrote a program to try to fine-tune the constants and do better, but he couldn't. So he said NOBODY can do better!!!! His advisor Manuel Blum believed him, but nobody else did. NOW he has shown, with help from Sam Buss, that NO, nobody can do better. I am impressed that he and Sam Buss were able to formalize the technique in a way that really does pin down ALL prior proofs. Had this paper been written first, there would only be one paper. This talk was NOT depressing since, this being a new problem, I can believe that Ryan or Sam or someone can look at what the old techniques were, come up with new ones, and break the (I can't believe I am writing this) 2cos(π/7) barrier! And then what? A paper showing that that bound can't be improved, and then an improvement of it. How far can this go? I've heard that really, really, we won't be able to show quadratic lower bounds. We'll see.
6. The Hardness of Being Private. Common problem in crypto: Alice has x, Bob has y; they want to compute f(x,y), but neither can know anything about the other's input except what they can learn from their own input and f(x,y). Are there any functions that people really want to compute like this? YES- the functions involved in a Vickrey auction. In a Vickrey auction, Bob and Alice bid x and y privately. The winner pays the LOSING bid. So if Alice bids 1000 and Bob bids 100, Alice gets to buy it for 100. Formally, f(x,y) contains both WHO won and what they paid. The winner does know what the loser bid, but the loser should not know what the winner bid. This paper is about computing this function approximately-privately, in both the worst case and the average case. The most practical paper at the conference, which isn't saying much. (Can the winner use Quantum Money?)
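The Vickrey function is tiny to write down, which is part of what makes its privacy properties interesting. Here is a sketch of it in Python (the tie-breaking convention is my own assumption, not from the paper):

```python
def vickrey(x, y):
    """Second-price (Vickrey) auction outcome for bids x (Alice) and y (Bob).

    Returns (winner, price): the higher bidder wins and pays the LOSING bid.
    The output reveals the losing bid to the winner, but not the winning
    bid to the loser, which is the asymmetry discussed above.
    """
    if x > y:
        return ("Alice", y)
    if y > x:
        return ("Bob", x)
    return ("tie", x)  # tie-breaking rule is unspecified; assumed convention
```

For the example in the text, `vickrey(1000, 100)` says Alice wins and pays 100.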

## Monday, July 16, 2012

### CCC12: Post 1 of n

(Post 1 of n on CCC 2012. I don't know how large n is yet.)

I will discuss the papers in the order they were presented.

June 26, 2012. Morning
1. Amplifying Circuit Lower Bounds Against Polynomial Time with Applications by Richard Lipton and Ryan Williams. Of course, what we mean by Applications may differ from what other people call applications. Here is one of the two main results:

Let c,d ≥ 1 and e < 1 be such that c < (1-e+d)/d. Either QBF is not solvable in time n^c, or the Circuit Eval problem cannot be solved with circuits of size n^d and depth n^e.

This is one of those odd results where it's an OR of two things that we think are both true. But there are two unconditional lower bounds that can be derived from it:

1. QBF does not have O(n^1.618)-time uniform circuits of depth n^o(1).
2. For all ε > 0, QBF does not have O(n^(2-ε))-time uniform circuits of depth n^o(1).
In his talk Ryan said that he needed the kind of complexity that only people like Dick Lipton still remember. (ADDED LATER: This is a misquote; see Ryan Williams's comment below.)
2. Is the Valiant-Vazirani Isolation Lemma Improvable? By Holger Dell, Valentine Kabanets, Dieter van Melkebeek. Probably not. Oh well.

3. Parallel Approximation of min-max problems with application to classical and quantum zero-sum games by Gus Gutoski and Xiaodi Wu. In this paper they developed a better solution to a classical problem and then applied it to a quantum problem. I don't think this is common- do you know of any other examples?

4. On the complexity of the Separable Hamiltonian Problem by Andre Chailloux and Or Sattath. (The first name is Or.) This paper looks hard!

5. Quantum Money with Classical Verification by Dmitry Gavinsky. Quantum Money is a funny topic in that it may be feasible in 100 years, but by then we will all have e-money.

Afternoon:
1. The Usefulness of Predicates by Per Austrin and Johan Hastad. There is an hour-long version of the talk done on a blackboard here.

2. Reductions between Expansion Problems by Prasad Raghavendra, David Steurer, and Madhur Tulsiani. The authors claim that the Small Set Expansion Hypothesis (SSEH) is a natural hypothesis. (It has been discussed in earlier papers.) This paper shows that SSEH is equivalent to a subcase of the UGC. The authors claim that SSEH is the combinatorial heart of the UGC. WOW!

3. On Problems as Hard as CNF-SAT by Marek Cygan, Holger Dell, Daniel Lokshtanov, Daniel Marx, Jesper Nederlof, Yoshio Okamoto, Ramamohan Paturi, Saket Saurabh, and Magnus Wahlstrom. The Strong Exponential Time Hypothesis (SETH) states that for every ε > 0 there exists k such that k-SAT cannot be solved in time 2^((1-ε)n). This paper assumes SETH and proves from it lower bounds on HITTING SET, SET SPLITTING, and NAE-SAT.

4. Testing List H-Homomorphisms. A list H-homomorphism is (informally) a homomorphism that satisfies given constraints. Given the constraints and given oracle access to an alleged list H-homomorphism, we want to determine if it IS one or is FAR FROM being one (if it's not one but close, we don't care). As usual in property testing we are concerned with O(1) queries and sublinear queries. Both are characterized.

5. A Dichotomy for Real Weighted Holant Problems by Sangxia Huang and Pinyan Lu. Recall that in Valiant's paper Accidental Algorithms he showed that for monotone planar 3-CNF formulas where each variable appears twice, (1) finding the number of solutions mod 2 is parity-P complete, but (2) finding the number of solutions mod 7 is in P. Since then there has been a wave of research trying to classify which counting problems are hard (likely complete somewhere) and which are in P. The paper by Huang and Lu is a big jump in this field.

## Friday, July 13, 2012

### North Carolina tries to legislate science

(I will post on CCC 2012 next week. I am still recovering from jet lag and going
through the 472 emails that piled up over 2.5 weeks, of which 22 were relevant.)

You may have heard the story of the Indiana Pi Bill:
in 1897 the state of Indiana wanted to legislate a value of pi. Before reading the true story, here are two versions of it:
1. They wanted to legislate pi=3 since the Bible says that pi=3. (The Bible does have a passage that seems to say pi=3, though that may not be the right interpretation. See here.) This version of the story puts them in a bad light.
2. They wanted to legislate pi=3 since this makes calculations easier. They didn't really say pi=3, just that when measuring wheat in silos (or some such) everyone should use pi=3 to make everyone's calculations uniform and easy. (This was in the 1890s- they didn't have Wolfram Alpha.) This version of the story puts them in a good light.

Neither version is true. Look up the link to see what really happened.  Could this happen in America today?
On two issues it certainly could: Evolution and on Climate Change.

The North Carolina House has passed a law saying HOW to predict sea-level rise rates.
The method they require predicts less sea-level rise than... anyone else's. And only the
Division of Coastal Management is allowed to put out such numbers.  So independent scientists are not allowed to use... the scientific method.  (Not sure how they will enforce that.) See here for details.

Who wins, who loses?
1. Colbert gets a nice bit about it (which is how I learned about it): here
2. I get a blog out of it.
3. Scientists have to publish less, thus fewer trees get cut down, so the environment wins.
4. North Carolina looks more ridiculous than Indiana did back in 1897. But there are two differences:
(1) North Carolina understood what the bill was saying and still passed it; Indiana did not.
(2) North Carolina's bill has actual policy consequences; Indiana's pi-bill wouldn't have.
Fortunately the North Carolina Senate did not pass the bill, so it's not law. Yet.

## Tuesday, July 03, 2012

### Why is it called the Turing Award?

The ACM Turing Award represents the highest honor in the computing research community, the closest we have to a Nobel prize in computer science. In this centenary celebration of Alan Turing, it seems obvious that the award should be named after him.

But back in 1966 when Alan Perlis won the first Turing award, Turing's influence on the field was far less obvious. Turing's model had only a few years earlier found its way into computer science literature, still dominated by automata, parsing and circuits. In 1966, what is now the Foundations of Computer Science conference was just renamed Switching and Automata Theory from Switching Circuit Theory and Logical Design.

So why name the award after Turing back then? This question came up during one of the panel sessions at the ACM Turing Celebration where there were some vague guesses. I asked around during the breaks and nobody seemed to know. I even put the question to Turing biographer Andrew Hodges in Cambridge and he similarly didn't know.

The Knuth prize aside, most awards are named after people who have passed away. Back in 1966 there weren't very many dead computer scientists. Even among Turing's generation, Church, Post, Kleene, even Gödel were alive and kicking. Turing would have only been 54 had circumstances been different. John von Neumann died young from cancer in 1957 but wasn't thought of primarily as a computer scientist. There were also Charles Babbage and Ada Lovelace but I'm not sure they were in people's minds back then.

I'm guessing that Turing was just the default choice for the name of the award and the ACM just got extraordinarily lucky for picking the right person in spite of themselves.