Monday, August 15, 2022

A non-controversial question about the Documents Donald Trump had in his house

This is a non-partisan post. In the interest of disclosure I will divulge that I do not think private citizens should have top secret government documents in their house.

One phrase I kept hearing in the reporting was (I paraphrase and may have the number wrong)

The FBI removed 15 boxes of documents

Documents? Like--- on paper?

a) Are the documents also online someplace? Perhaps they intentionally are not so that they can't be hacked.

b) Is having top secret documents only on paper safer than having them in electronic form? Normally I would think so. Donald Trump having them is a very unusual case.

c) Having to store all of those documents on paper would seem to create storage problems. I can imagine someone with NO bad purposes making copies and taking them home since they are tired of only being able to read them in a special room.

d) A problem with having them ONLY on paper is that if an accident happens and they get destroyed there is no backup. Or is there? Are there copies somewhere? That leads to twice the storage problems.

e) There is a tradeoff of security and convenience. Having the documents only on paper is an extreme point on the tradeoff, but it may be the right one. It may depend on how important it is to keep the documents secret.

f) I've heard (apocryphal?) stories about some top secret document also being available in public through quite legal sources (e.g., a physics journal that discusses nuclear stuff). Does the government classify too much? If so then the problem arises of people not taking the classification seriously and getting careless. I doubt that is what happened here.

g) The question I am most curious about is why did he take them? For most of his other actions his motivations are clear (e.g., he is pushing STOP THE STEAL since he wants to be president). But for this one it's not clear. Unfortunately, I doubt we'll ever find out. Maybe the answer is in some documents, either on paper or electronic.

Monday, August 08, 2022

The Godfather of Complexity

 Juris Hartmanis 1928-2022

On Friday, July 29th, I was in the immigration line at an airport in Mexico. My phone rang with Bill Gasarch on the Caller ID but, as I was starting vacation, I declined the call. The voicemail gave me the sad news that Juris Hartmanis, the person who founded computational complexity and brought me into it, had passed away earlier that day. I messaged Bill and told him to write an obit and I'd follow with something personal when I returned.

 Hartmanis and Stearns in 1963

In November 1962 Hartmanis, working with Richard Stearns at the GE Labs in Schenectady, determined how to use Turing machines to formalize the basic idea of measuring resources, like time and memory, as a function of the problem being solved. Their classic Turing Award-winning paper On the Computational Complexity of Algorithms not only gave this formulation but showed that increasing resources increases the problems one can solve. The photo above, from a post celebrating the 50th anniversary of the paper, shows Hartmanis and Stearns with the main theorem of their paper on the board.

Twenty-one years later, a junior at Cornell University still trying to find his way took undergraduate theory from the man himself. Juris brought the topics to life and I found my passion. At the beginning of the class, he said the highest grade usually went to an undergrad followed by the grad students in the class. I was a counterexample, as I had the second highest grade. Never did find out who beat me out.

In spring of my senior year, 1985, I forwent the traditional senior-slump course Wines for graduate complexity with Juris. He focused the course around the isomorphism conjecture he developed with his student Len Berman, which implies P≠NP; Hartmanis believed the conjecture might lead to settling P vs NP. He offered an automatic A to anyone who could prove the isomorphism conjecture. I guess any other proof of P≠NP only warranted a B?

I would later be obsessed by the isomorphism conjecture as an assistant professor, coming up with not one but two oracles making it true. The isomorphism conjecture didn't end up settling P vs NP, but then again neither did any other approach.

It wasn't just me; there is a reason that many of the great American complexity theorists, including Ryan Williams, Scott Aaronson and my own PhD advisor Michael Sipser, were undergrads at Cornell. Many more were PhD students of Hartmanis.

Juris Hartmanis had a certain gravitas in the community. Maybe it was his age, the way he dressed up, his seminal research in the field, or just that Latvian accent. He founded the CS department at Cornell in the 60s and served as head of the CISE directorate at the National Science Foundation in the 90s. His 60th birthday party at the 3rd Structures in Complexity conference (now the Computational Complexity Conference) was the only time I've seen complexity theorists in ties.

 Juris Hartmanis (center) being toasted by Janos Simon

A few of my favorite Hartmanis quotes.
• "We all know P is different than NP. We just don't know how to prove it." - Still true.
• "I only make mistakes in the last five minutes of the class." - Sometimes he made a mistake with ten minutes left but only admitted it in the last five minutes.
• "Primality is a problem not yet known to be in P but is hanging on by its fingernails with its grip continuing to loosen each day." - Juris Hartmanis said this in 1986, with primality hanging on for another 16 years.
Thanks Juris for creating the foundations of our field and inspiring so many people, yours truly included, to dedicate ourselves to it.

Sunday, August 07, 2022

The Held Prize for comb. opt. AND Disc Opt AND Alg AND Complexity theory AND related parts of CS.

Dan Spielman asked me to blog about the Held Prize. I first present what he sent me, and then have some thoughts.

FROM DAN:

--------------------------------------------------------------------------------------------------

Nominations are now being accepted for the National Academy of Sciences' 2023 Michael and Sheila Held Prize. The Held Prize honors outstanding, innovative, creative, and influential research in the areas of combinatorial and discrete optimization, or related parts of computer science, such as the design and analysis of algorithms and complexity theory. This $100,000 prize is intended to recognize recent work (defined as published within the last eight years). Additional information, including past recipients, eligibility requirements, and more, can be found here.

All nominations must be submitted online. Unless otherwise stated, the following materials must be submitted:

A letter from the nominator describing the candidate's work and why he or she should be selected for the award. No more than three (3) pages.

Curriculum vitae. No more than two (2) pages (similar to CVs included with NSF proposals).

Bibliography listing no more than twelve (12) of the nominee's most significant publications.

Suggested citation. A 50-word summary stating why the nominee should be considered for this award. (Citation examples)

Two letters of support. Support letters must be written by individuals from institutions outside both the nominator's and the nominee's institution. Up to three letters of support are accepted.

----------------------------------------------------------------------------------------------------

1) The scope seems rather broad (Dan confirmed this in private email) in that it's Comb Opt AND Discrete Opt OR related fields like algorithms and complexity theory.

2) The research has to be outstanding AND innovative AND creative AND influential. That seems hard to do :-( If they made it an OR instead of an AND I might ask someone to nominate me for my Muffin Work. It does use 0-1 programming!

3) The past winners are, of course, very impressive. But there is one I want to point out to emphasize that the scope is broad: Amit Sahai won in 2022, and here is what the webpage says about it:

For a leading role in development of cryptographic Software Obfuscation and its applications, starting from initial conception of "Indistinguishability Obfuscation" and culminating in new constructions based upon well-founded cryptographic assumptions. These breakthroughs highlight how computational complexity can enable secrecy while computing in insecure environments.

4) Comb Opt and Discrete Opt seem to be Operations Research. So this inspires the following question:

What are the similarities and differences between Operations Research and Research on Algorithms?

I tend to think of Operations Research as being more tied to the real world, but is that true?

5) Not enough 2-letter combinations for what you want to say: I had to use the term Operations Research instead of the abbreviation OR since I was using OR for or. Oh well.

Saturday, July 30, 2022

Juris Hartmanis passed away on July 29 at the age of 94

Juris Hartmanis, one of the founders of Complexity Theory, passed away on July 29 at the age of 94.  The Gödel's Last Letter blog has an obit posted here.  Scott Aaronson has some words and a guest post by Ryan Williams here. When other bloggers post obits I will update this paragraph to point to them.

Here is one non-blog obit: here.  For an interview with Juris see here.

Hartmanis and Stearns shared the 1993 Turing Award for the paper On the Computational Complexity of Algorithms (see here for the paper and see here for his Turing Award talk). In that paper they define DTIME(T(n)) for Turing Machines. They also proved some theorems about how changes to the model (1-tape, 2-tape, 1-dim, 2-dim, and others) change the notion of DTIME(T(n)), so they were well aware that the definition was not robust. They also have some theorems about computable numbers.

We are used to the notion that you can measure how long a computation takes by counting the number of steps on a Turing Machine. Before the Hartmanis-Stearns paper this was not how people thought of things. Knuth (I suspect independently) was looking at what we might now call concrete complexity: how many operations does an algorithm need? Hartmanis-Stearns were beginning what is now called Complexity Theory.

Recall that later, the Cook-Levin Theorem used Turing Machines.

If Hartmanis-Stearns had not started the process of putting complexity on a rigorous mathematical basis, how might complexity theory have evolved? It is possible we would not have the Cook-Levin Theorem or the notion of NP. It is possible that we would ASSUME that SAT is hard and use that to show other problems hard, and also do reverse reductions to show some problems equivalent to SAT. Indeed, this IS what we do in other parts of theory, assuming the following problems are hard (for various definitions of hard): 3SUM, APSP, Unique Games. And in Crypto: Factoring, DL, SVP, and other problems.

Hartmanis did other things as well. I list some of them that are of interest to me - other people will likely list other things of interest to them.

0) He had 21 PhD Students, some of them quite prominent. The list of them is here.

1) The Berman-Hartmanis Conjecture: All NP-complete sets are poly isomorphic. Seems true for all natural NP-complete sets. Still open. This conjecture inspired a lot of work on sparse sets, including the result that if a sparse set S is btt-hard for NP, then P=NP (proven by Ogiwara-Watanabe).

2) The Boolean Hierarchy: we all know what NP is. What about sets that are the difference of two NP sets? What about sets of the form A - (B-C) where A, B, C are all in NP? These form a hierarchy. We of course do not know if the hierarchy is proper, but if it collapses then PH collapses.

3) He helped introduce time-bounded Kolmogorov complexity into complexity theory, see here.

4) There is more but I will stop here.

Sunday, July 24, 2022

100 Best Number Theory books of all Time---except many are not on Number Theory

I was recently emailed this link:

That sounds like a good list to have! But then I looked at it.

The issue IS NOT that the books on it are not good. I suspect they all are.

The issue IS that many of the books on the list are not on Number Theory.

DEFINITELY NOT:

A Mathematician's Apology by Hardy

The Universe Speaks in Numbers by Farmelo (looks like Physics)

Category theory in Context by Riehl

A First Course in Mathematical Logic and Set Theory by O'Leary

Astronomical Algorithms by Meeus (Algorithms for Astronomy)

Pocket Book of Integrals and Math Formulas by Tallarida

Entropy and Diversity by Leinster

BORDERLINE:

Too many to name, so I will name categories (Not the type Riehl talks about)

Logic books. Here Number Theory seems to mean Peano Arithmetic, and they are looking at what you can and can't prove in it.

Combinatorics books:  Indeed, sometimes it is hard to draw the line between Combinatorics and Number Theory, but I still would not put a book on Analytic Combinatorics on a list of top books in Number Theory.

Discrete Math textbooks: Yes, most discrete math textbooks have some elementary number theory in them, but that does not make them number theory books.

Abstract Algebra, Discrete Harmonic Analysis, other hard math books: These have theorems in Number Theory as an Application.  But they are not books on number theory.

WHAT OF ALL THIS?

Lists like this often have several problems

1) The actual object of study is not well defined.

2) The criteria for being good are not well defined.

3) The list is just one person's opinion. If I think the best math-novelty song of all time is William-Rowan-Hamilton (see here) and the worst one is the Bolzano--Weierstrass rap (see here) that's just my opinion. Even if I were the leading authority on Math Novelty songs and had the largest collection in the world, it's still just my opinion. (Another contender for worst math song of all time is here.)

4) Who is the audience for such lists? For the Number Theory Books is the audience ugrad math majors? grad math majors? Number Theorists? This needs to be well defined.

5) The list may tell more about the people making the list than the intrinsic quality of the objects. This is more common in, say, the ranking of presidents. My favorite is Jimmy Carter since he is the only one with the guts to be sworn in by his nickname Jimmy, unlike Bill Clinton (sworn in as William Jefferson Clinton, a name only used by his mother when she was mad at him) or Joe Biden (sworn in as Joseph Robinette Biden, which I doubt even his mother ever used). My opinion may seem silly, but it reflects my bias towards nicknames, just as the people who rank presidents use their bias.

Wednesday, July 20, 2022

What is known about that sequence

In my last post I wrote:

---------------------------

Consider the recurrence

a_1=1

for all n\ge 2, a_n = a_{n-1} + a_{n/2}.

For which M does this recurrence have infinitely many n such that a_n \equiv  0 mod M?

I have written an open problems column on this for SIGACT News which also says
what is known (or at least what I know is known).  It will appear in the next issue.

I will post that open problems column here on my next post.

Until then I would like you to work on it, untainted by what I know.
----------------------------------------

I will now say what is known and point to the open problems column, co-authored with Emily Kaplitz and Erik Metz.

If  M=2 or M=3 or M=5 or M=7 then there are infinitely many n such that a_n \equiv 0 mod M

If M\equiv 0 mod 4 then there are no n such that a_n \equiv 0 mod M

Empirical evidence suggests that

If M \not\equiv 0 mod 4 then there are infinitely many n such that a_n\equiv 0 mod M

That is our conjecture. Any progress would be good, for example proving it for M=9. M=11 might be easier since 11 is prime.
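For readers who want to check the conjecture empirically (say for M=9 or M=11), here is a minimal Python sketch, under the floor convention n/2 = ⌊n/2⌋; the function name and the choice of cutoff are mine, not from the column:

```python
def zeros_mod(M, limit):
    """Return the n <= limit with a_n ≡ 0 (mod M), where a_1 = 1 and
    a_n = a_{n-1} + a_{floor(n/2)}. Only residues mod M are stored,
    since the a_n themselves grow very quickly."""
    a = [0, 1 % M]  # a[0] is a placeholder so that a[n] holds a_n mod M
    for n in range(2, limit + 1):
        a.append((a[n - 1] + a[n // 2]) % M)
    return [n for n in range(1, limit + 1) if a[n] == 0]

print(zeros_mod(3, 50))    # nonempty, consistent with the M=3 result above
print(zeros_mod(4, 1000))  # should be empty, per the M ≡ 0 (mod 4) result
```

Working mod M throughout keeps the arithmetic on small integers, so pushing the limit into the millions is cheap.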

The article that I submitted is HERE

Monday, July 18, 2022

An open question about a sequence mod M.

In this post n/2 means floor{n/2}

Consider the recurrence

a_1=1

for all n\ge 2, a_n = a_{n-1} + a_{n/2}.

For which M does this recurrence have infinitely many n such that a_n \equiv  0 mod M?

I have written an open problems column on this for SIGACT News which also says
what is known (or at least what I know is known).  It will appear in the next issue.

I will post that open problems column here on my next post.

Until then I would like you to work on it, untainted by what I know.
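If you want to experiment before reading what is known, here is a short Python sketch that generates the sequence (with n/2 meaning floor, as above); the function name is mine:

```python
def seq(limit):
    """Return [a_1, ..., a_limit] where a_1 = 1 and a_n = a_{n-1} + a_{floor(n/2)}."""
    a = [0, 1]  # a[0] is a placeholder so that a[n] holds a_n
    for n in range(2, limit + 1):
        a.append(a[n - 1] + a[n // 2])
    return a[1:]

print(seq(10))  # [1, 2, 3, 5, 7, 10, 13, 18, 23, 30]
```

From there it is easy to reduce the terms mod various M and look for zeros.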

ADDED LATER: I have now posted the sequel which includes a pointer to the open problems column. To save you time, I post it here as well.

Monday, July 11, 2022

Review of The Engines of Cognition: Essays From the LessWrong Forum/Meta question about posts

A while back I reviewed A Map that Reflects the Territory, which is a collection of essays posted on the lesswrong forum. My review is here. I posted it to both this blog and to the lesswrong forum. In both cases I posted a link to it. My post to lesswrong is here.

On the lesswrong post many of the comments, plus some private emails, told me NO BILL: don't post a link, post it directly as text. It was not clear how to do that, but I got it done with help.

On complexityblog nobody commented that this was a problem. Then again, nobody commented at all, so it's not clear what to make of that.

So

Meta Question: Is posting a link worse than posting direct text? Note that the book review was 12 pages long and looked great in LaTeX.

Meta Question: Why did lesswrong care about the format but complexityblog did not? (Probable answer: complexityblog readers did not care at all, whereas lesswrong readers cared about what I thought about lesswrong.)

Another Question, not Meta. One of the comments was (I paraphrase)

When I open a pdf file I expect to see something in the style of an academic paper. This is written in very much a chatty, free-flowing blog post style with jokes like calling neologisms "newords", so the whole thing felt more off-kilter than was intended. The style of writing would prob work better as an HTML blog post (which could then be posted directly as a Lesswrong post here instead of hosted elsewhere and linked).

I think it's interesting that the format of an article telegraphs (in this case incorrectly) what type of article it will be. Is this a common problem? I have had the experience of reading a real academic paper and being surprised that some joke or cultural reference is in it, though I do not object to this.

Another comment and question

I was surprised the post only had 11 karma when I saw it (William had sent me an advance copy and I'd really liked reading it) but when I saw that it was a link post, I understood why.

I find this hilarious: they have some way the posts are rated! First, Lance told me very early on to never worry about comments, and I don't. Second, it reminds me of the Black Mirror episode Nosedive.

ANYWAY, I have reviewed another collection of essays from lesswrong, this one called The Engines of Cognition. I am posting it here as a link: here, and I will post it on lesswrong as full text (with help) in a few days.

I am posting it so I can get comments before I submit it to the SIGACT News book review column. But this is odd since I think this blog has more readers than SIGACT News has subscribers, so perhaps THIS is its real debut, not that. And of course the lesswrong forum is a place where more will read it since it's about them.

So- I appreciate comments to make it a better review!