Monday, August 08, 2022
[Image: Juris Hartmanis, 1928-2022]
On Friday, July 29th, I was in the immigration line at an airport in Mexico. My phone rang with Bill Gasarch on the Caller ID, but since I was starting my vacation I declined the call. The voicemail gave me the sad news that Juris Hartmanis, the person who founded computational complexity and brought me into it, had passed away earlier that day. I messaged Bill and told him to write an obit and that I'd follow with something personal when I returned.
[Image: Hartmanis and Stearns in 1963]
[Image: Juris Hartmanis (center) being toasted by Janos Simon]
- "We all know P is different than NP. We just don't know how to prove it." - Still true.
- "I only make mistakes in the last five minutes of the class." - Sometimes he made a mistake with ten minutes left but would only admit it in the last five minutes.
- "Primality is a problem not yet known to be in P but is hanging on by its fingernails with its grip continuing to loosen each day." - Juris Hartmanis said this in 1986, with primality hanging on for another 16 years.
- In 1981 Hartmanis wrote a personal article for the Annals of the History of Computing describing the birth of computational complexity and his role in it.
- A CACM interview with Hartmanis gives a personal story of his path from Latvia to Kansas City to Pasadena to Schenectady to Ithaca and DC.
- Obits by Cornell, by Ryan Williams, and by Dick Lipton and Ken Regan.
- My old blog posts celebrating Hartmanis' 90th birthday and the 50th anniversary of Hartmanis-Stearns.
- And the many retrospectives, remembrances, special issues and conferences to come.
Sunday, August 07, 2022
Dan Spielman asked me to blog about the Held Prize. I first present what he sent me, and then add some thoughts of my own.
Nominations are now being accepted for the National Academy of Sciences’ 2023 Michael and Sheila Held Prize. The Held Prize honors outstanding, innovative, creative, and influential research in the areas of combinatorial and discrete optimization, or related parts of computer science, such as the design and analysis of algorithms and complexity theory. This $100,000 prize is intended to recognize recent work (defined as published within the last eight years). Additional information, including past recipients, eligibility requirements, and more, can be found here.
All nominations must be submitted online. Unless otherwise stated, the following materials must be submitted:
A letter from the nominator describing the candidate's work and why he or she should be selected for the award. No more than three (3) pages.
Curriculum vitae. No more than two (2) pages (similar to CVs included with NSF proposals).
Bibliography listing no more than twelve (12) of the nominee's most significant publications.
Suggested citation. A 50-word summary stating why the nominee should be considered for this award.
Two letters of support. Support letters must be written by individuals from institutions outside both the nominator's and the nominee's institution. Up to three letters of support are accepted.
Nominations will be accepted through Monday, October 3, 2022. Please help spread the word that the nomination process is underway.
1) The scope seems rather broad (Dan confirmed this in private email) in that it's Comb Opt AND Discrete Opt, OR related fields like algorithms and complexity theory.
2) The research has to be Outstanding AND Innovative AND Creative AND Influential. That seems hard to do :-( If they made it an OR instead of an AND I might ask someone to nominate me for my Muffin Work. It does use 0-1 programming!
3) The past winners are, of course, very impressive. But there is one I want to point out to emphasize that the scope is broad: Amit Sahai won in 2022, and here is what the webpage says about it:
For a leading role in development of cryptographic Software Obfuscation and its applications, starting from initial conception of "Indistinguishability Obfuscation" and culminating in new constructions based upon well-founded cryptographic assumptions. These breakthroughs highlight how computational complexity can enable secrecy while computing in insecure environments.
4) Comb Opt and Discrete Opt seem to be Operations Research. So this inspires the following question:
What are the similarities and differences between Operations Research and Research on Algorithms?
I tend to think of Operations Research as being more tied to the real world- but is that true?
5) Not enough 2-letter combinations for what you want to say: I had to use the term Operations Research instead of the abbreviation OR since I was using OR for or. Oh well.
Saturday, July 30, 2022
Juris Hartmanis, one of the founders of Complexity Theory, passed away on July 29 at the age of 94. The Gödel's Lost Letter blog has an obit posted here. Scott Aaronson has some words and a guest post by Ryan Williams here. When other bloggers post obits I will update this paragraph to point to them.
Hartmanis and Stearns shared the 1993 Turing Award for the paper On the Computational Complexity of Algorithms (see here for the paper and here for his Turing Award talk). In that paper they defined DTIME(T(n)) for Turing machines. They also proved some theorems about how changes to the model (1-tape, 2-tape, 1-dimensional, 2-dimensional, and others) change the notion of DTIME(T(n)), so they were well aware that the definition was not robust. They also proved some theorems about computable numbers.
We are used to the notion that you can measure how long a computation takes by counting the number of steps on a Turing machine. Before the Hartmanis-Stearns paper this was not how people thought of things. Knuth (I suspect independently) was looking at what we might now call concrete complexity: how many operations does an algorithm need? Hartmanis and Stearns were beginning what is now called Complexity Theory.
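The step-counting view can be made concrete with a toy simulator. The sketch below is my own illustration (the machine, its states, and its encoding are not from the Hartmanis-Stearns paper): a single-tape Turing machine deciding {0^n 1^n : n >= 0} by crossing off matching 0s and 1s, which takes a number of steps roughly quadratic in n on one tape. A 2-tape machine can decide the same language in linearly many steps, echoing their observation that the model affects DTIME.

```python
# Toy single-tape Turing machine simulator that counts steps, in the spirit of
# measuring DTIME(T(n)). Illustrative only; the machine below is my own example.

def run_tm(delta, tape, state="q0", blank="_", max_steps=100_000):
    """Run a TM given delta[(state, symbol)] = (new_state, write, move).
    Returns (accepted, number_of_steps)."""
    cells = dict(enumerate(tape))
    head, steps = 0, 0
    while state not in ("accept", "reject"):
        sym = cells.get(head, blank)
        state, write, move = delta[(state, sym)]
        cells[head] = write
        head += {"L": -1, "R": 1, "S": 0}[move]
        steps += 1
        if steps > max_steps:
            raise RuntimeError("step budget exceeded")
    return state == "accept", steps

# Decide {0^n 1^n}: mark a leading 0 as X, walk right, mark the first 1 as Y,
# walk back, repeat; then verify only Ys remain.
delta = {
    ("q0", "0"): ("q1", "X", "R"),      # cross off a 0, go find a 1
    ("q0", "Y"): ("q3", "Y", "R"),      # all 0s used up; verify the rest
    ("q0", "_"): ("accept", "_", "S"),  # empty string is in the language
    ("q0", "1"): ("reject", "1", "S"),
    ("q1", "0"): ("q1", "0", "R"),
    ("q1", "Y"): ("q1", "Y", "R"),
    ("q1", "1"): ("q2", "Y", "L"),      # cross off the matching 1
    ("q1", "_"): ("reject", "_", "S"),  # a 0 with no matching 1
    ("q2", "0"): ("q2", "0", "L"),
    ("q2", "Y"): ("q2", "Y", "L"),
    ("q2", "X"): ("q0", "X", "R"),      # back at the frontier; repeat
    ("q3", "Y"): ("q3", "Y", "R"),
    ("q3", "_"): ("accept", "_", "S"),
    ("q3", "0"): ("reject", "0", "S"),
    ("q3", "1"): ("reject", "1", "S"),
}

ok, steps = run_tm(delta, "000111")
print(ok, steps)  # accepts; the step count grows roughly quadratically in n
```

Counting the `steps` value for inputs of increasing length is exactly the DTIME-style measure; running the same language on a 2-tape simulator would show the count drop to linear.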
Recall that later, the Cook-Levin Theorem used Turing Machines.
If Hartmanis-Stearns had not started the process of putting complexity on a rigorous mathematical basis, how might complexity theory have evolved? It is possible we would not have the Cook-Levin Theorem or the notion of NP. It is possible that we would ASSUME that SAT is hard and use that to show other problems are hard, and also do reverse reductions to show some problems equivalent to SAT. Indeed, this IS what we do in other parts of theory, where we assume the following problems are hard (for various definitions of hard): 3SUM, APSP, Unique Games. And in crypto: Factoring, DL, SVP, and other problems.
Hartmanis did other things as well. I list some of them that are of interest to me - other people will likely list other things of interest to them.
0) He had 21 PhD Students, some of them quite prominent. The list of them is here.
1) The Berman-Hartmanis Conjecture: all NP-complete sets are polynomial-time isomorphic. It seems true for all natural NP-complete sets. Still open. This conjecture inspired a lot of work on sparse sets, including the result that if a sparse set S is btt-hard for NP, then P=NP (proven by Ogiwara-Watanabe).
2) The Boolean Hierarchy: we all know what NP is. What about sets that are the difference of two NP sets? What about sets of the form A - (B - C) where A, B, C are all in NP? These form a hierarchy. We of course do not know if the hierarchy is proper, but if it collapses then PH collapses.
3) He helped introduce time-bounded Kolmogorov complexity into complexity theory, see here.
4) He was Lance Fortnow's undergraduate advisor.
5) There is more but I will stop here.
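The shape of the Boolean hierarchy definitions in item 2) can be illustrated with a toy example. This is only a sketch: the small finite sets A, B, C below are my own arbitrary stand-ins for NP languages, so it shows nothing about NP itself, just how the levels are built from differences.

```python
# Toy illustration of Boolean hierarchy levels, with finite sets standing in
# for NP languages (A, B, C are arbitrary examples, not actual NP sets).

U = set(range(20))                   # a small toy universe
A = {x for x in U if x % 2 == 0}     # stand-in for an NP set
B = {x for x in U if x % 3 == 0}     # stand-in for an NP set
C = {x for x in U if x % 5 == 0}     # stand-in for an NP set

dp_set = A - B            # level 2: a difference of two "NP" sets (DP-style)
level3 = A - (B - C)      # level 3: the form mentioned in the post

# The identity A - (B - C) = (A - B) ∪ (A ∩ C) shows that level-3 sets are
# Boolean combinations of the underlying sets:
assert level3 == (A - B) | (A & C)
print(sorted(dp_set), sorted(level3))
```

For real NP sets none of this is effective, of course; the point is only that each level adds one more difference to the Boolean combination.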
Sunday, July 24, 2022
The issue IS that many of the books on the list are not on Number Theory.
Astronomical Algorithms by Meeus (Algorithms for Astronomy)
Too many to name, so I will name categories (Not the type Riehl talks about)
WHAT OF ALL THIS?
Wednesday, July 20, 2022
In my last post I wrote:
Until then I would like you to work on it, untainted by what I know.
I will now say what is known and point to the open problems column, co-authored with Emily Kaplitz and Erik Metz.
If M \equiv 0 (mod 4) then there is no n such that a_n \equiv 0 (mod M).
Monday, July 18, 2022
Until then I would like you to work on it, untainted by what I know.
Monday, July 11, 2022
A while back I reviewed A Map that Reflects the Territory, which is a collection of essays posted on the lesswrong forum. My review is here. I posted it to both this blog and to the lesswrong forum; in both cases I posted a link to it. My post to lesswrong is here.
On the lesswrong post many of the comments, plus some private emails, told me NO BILL, don't post a link, post it directly as text. It was not clear how to do that, but I got it done with help.
On complexityblog nobody commented that this was a problem. Then again, nobody commented at all, so it's not clear what to make of that.
Meta Question: Is posting a link worse than posting direct text? Note that the book review was 12 pages long and looked great in LaTeX.
Meta Question: Why did lesswrong care about the format but complexityblog did not? (Probable answer: complexityblog readers did not care at all, whereas lesswrong readers cared about what I thought about lesswrong.)
Another Question, not Meta. One of the comments was (I paraphrase)
When I open a pdf file I expect to see something in the style of an academic paper. This is written in a very chatty, free-flowing blog-post style, with jokes like calling neologisms ``newords'', so the whole thing felt more off-kilter than was intended. The style of writing would prob work better as an HTML blog post (which could then be posted directly as a lesswrong post instead of hosted elsewhere and linked).
I think it's interesting that the format of an article telegraphs (in this case incorrectly) what type of article it will be. Is this a common problem? I have had the experience of reading a real academic paper and being surprised that some joke or cultural reference is in it, though I do not object to this.
Another comment and question
I was surprised the post only had 11 karma when I saw it (William had sent me an advance copy and I'd really liked reading it) but when I saw that it was a link post, I understood why.
I find this hilarious: they have some way the posts are rated! First, Lance told me very early on to never worry about comments, and I don't. Second, it reminds me of the Black Mirror episode Nosedive.
ANYWAY, I have reviewed another collection of essays for lesswrong, this one called The Engines of Cognition. I am posting it as a link here, and I will post it on lesswrong as full text (with help) in a few days.
I am posting it so I can get comments before I submit it to the SIGACT News book review column. But this is odd since I think this blog has more readers than SIGACT News has subscribers, so perhaps THIS is its real debut, not that. And of course the lesswrong forum is a place where more will read it, since it's about them.
So- I appreciate comments to make it a better review!