Tuesday, July 16, 2019

Guest post by Samir Khuller on attending The TCS Women 2019 meeting

(I will post the solution to the problem in the last blog later in the week---probably Thursday. Meanwhile, enjoy these thoughts from Samir Khuller on the TCS Women 2019 meeting.)

Guest Post by Samir Khuller:

"Am I even allowed here?" was the first thought that crossed my mind when I entered the room. It was packed with women (over 95%), though a few minutes later several men had trickled in. I was at the TCS Women spotlight workshop on the day before STOC. Kudos to Barna Saha, Sofya Raskhodnikova, and Virginia Vassilevska Williams for putting this grand (and long-needed) event together, which serves as a role model and showcases some of the recent work by rising stars. The Sunday afternoon workshop was followed by both an all-women panel and a poster session (which I sadly did not attend).

The rising stars talks were given by Naama Ben-David (CMU), Andrea Lincoln (MIT), Debarati Das (Charles University) and Oxana Poburinnaya (Boston U). After a short break, the inspirational talk was given by Ronitt Rubinfeld from MIT. Ronitt's talk was on the topic of Program Checking, but she made it inspirational by putting us in her shoes as a young graduate student, three decades back, trying to make a dent in research by working on something that her advisor Manuel Blum and his senior graduate student Sampath Kannan had been working on; I must say she made a pretty big dent in the process! She also related those ideas, in a really elegant manner, to other pieces of work done since, and showed how they led to work on property testing.

I am delighted to say that NSF supported the workshop along with companies such as Amazon, Akamai, Google and Microsoft. SIGACT plans to be a major sponsor next year.

The full program for the workshop is here.

Sunday, July 14, 2019

Two infinite hat problems and a question about what is ``well known''

This is a joint post with David Marcus. You will see how he is involved in my next post.

Two infinite hat problems based on one scenario. I am also curious if they are well known.

1) There are an infinite number of people, numbered 1,2,3,... There are 2 colors of hats. They can all see everyone's hat but their own.

2) The adversary is going to put hats on all the people. Everyone will then guess their own hat color simultaneously.

3) The people can discuss strategy ahead of time, but must use a deterministic strategy and the adversary knows the strategy.

4) The people want to minimize how many they get wrong.

5) The adversary puts on hats to maximize how many they get wrong.

I ask two questions and one meta-question:

Q1: Is there a solution where they get all but a finite number of the guesses right? (I have blogged about a variant of this one a while back.)

Q2: Is there a solution where they get all but at most (say) 18 wrong? (My students would say the answer has to be YES or he wouldn't ask it. They don't realize that I work on upper AND lower bounds!)

Q3: How well known is problem Q1 and the solution? Q2 and the solution? I've seen Q1 and its solution around (not sure where), but the only source on Q2 that I know of is CAN'T TELL YOU IN THIS POST, WILL IN THE NEXT POST. So, please leave a comment telling me if you have seen Q1 or Q2 and solutions. And if so then where.

Feel free to leave any comments you want; however, I warn readers who want to solve it themselves to not look at the comments, or at my next post.

Thursday, July 11, 2019

Degree and Sensitivity

Hao Huang's proof of the sensitivity conjecture that I posted on last week relied on a 1992 result of Gotsman and Linial. Let's talk about that result.

Consider the set S={-1,1}^n. The hypercube of dimension n is the graph with vertex set S and an edge between x = (x_1,…,x_n) and y = (y_1,…,y_n) in S if there is exactly one i such that x_i ≠ y_i. Every vertex has degree n.

We say a vertex x is odd if x has an odd number of -1 coordinates, even otherwise. Every edge joins an odd and even vertex.

Let f be a function mapping S to {-1,1}. The sensitivity of f on x is the number of i such that f(x_1,…,x_i,…,x_n) ≠ f(x_1,…,-x_i,…,x_n). The sensitivity of f is the maximum over all x in S of the sensitivity of f on x.
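To make the definition concrete, here is a small brute-force sketch (my own illustration, not from the post) that computes the sensitivity of a function on {-1,1}^n; the function names are mine.

```python
from itertools import product

def sensitivity(f, n):
    # Max over x in {-1,1}^n of the number of coordinates i
    # whose flip changes f's value.
    best = 0
    for x in product([-1, 1], repeat=n):
        s = sum(1 for i in range(n)
                if f(x) != f(x[:i] + (-x[i],) + x[i+1:]))
        best = max(best, s)
    return best

# PARITY changes value on every flip, so its sensitivity is n.
parity = lambda x: 1 if x.count(-1) % 2 == 0 else -1

# OR (here: f(x) = -1 iff some coordinate is -1) also has sensitivity n,
# witnessed at the all-ones point.
f_or = lambda x: -1 if -1 in x else 1

print(sensitivity(parity, 4))  # 4
print(sensitivity(f_or, 4))    # 4
```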

Let g be the same function as f except that we flip the value on all odd vertices. Notice now that the sensitivity of f on x is the number of i such that g(x_1,…,x_i,…,x_n) = g(x_1,…,-x_i,…,x_n).

Let G be the induced subgraph on the vertices x such that g(x)=-1 and H be the induced subgraph on the vertices x such that g(x)=1. The sensitivity of f is the maximum number of neighbors of any vertex in G or H.

Consider f as a multilinear polynomial over the reals. The sensitivity conjecture states there is some α>0 such that if f has degree n then f has sensitivity at least n^α.

Note g(x_1,…,x_n)=f(x_1,…,x_n)·x_1⋯x_n. If f has a degree-n term, the variables in that term cancel out on S (since x_i^2=1) and the coefficient of the degree-n term of f becomes the constant term of g. The constant term is just the expected value, so f has full degree iff g is unbalanced.
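This equivalence is easy to check numerically: the constant term of g equals E_x[f(x)·x_1⋯x_n], which is also the coefficient of f's degree-n monomial. A small sketch (mine, with illustrative function choices):

```python
from itertools import product
from fractions import Fraction
from math import prod

def top_coefficient(f, n):
    # E_x[f(x) * x_1*...*x_n] over {-1,1}^n: the coefficient of the
    # degree-n monomial of f, which equals the constant term of g.
    pts = list(product([-1, 1], repeat=n))
    return Fraction(sum(f(x) * prod(x) for x in pts), len(pts))

parity = lambda x: prod(x)                     # f = x_1 x_2 x_3
majority = lambda x: 1 if sum(x) > 0 else -1   # MAJ on 3 variables

# For parity, g = f * x_1 x_2 x_3 is identically 1: maximally unbalanced.
print(top_coefficient(parity, 3))    # 1
# MAJ_3 = (x_1+x_2+x_3)/2 - x_1x_2x_3/2 has a degree-3 term, so its g is unbalanced.
print(top_coefficient(majority, 3))  # -1/2
```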

GL Assumption: Suppose you have a partition of the hypercube into sets A and B with |A| ≠ |B|, and let G and H be the induced subgraphs on A and B. Then there is some constant α>0 such that some node of G or H has at least n^α neighbors within its own subgraph.

The above argument, due to Gotsman and Linial, shows that the GL assumption is equivalent to the sensitivity conjecture.

Huang proved that given any subset A of the vertices of a hypercube with |A| > 2^n/2, the induced subgraph has a node of degree at least n^(1/2). Since either A or B in the GL assumption has size greater than 2^n/2, Huang's result gives the sensitivity conjecture.

Sunday, July 07, 2019

Fortran is underrated!

(Joint Post with David Marcus who was a classmate of mine at SUNY Stony Brook [now called Stony Brook University]. I was class of 1980, he was class of 1979. We were both math majors.)

David has been reading Problems with a POINT (I'm glad someone is reading it) and emailed me a comment on the following passage which was essentially this post. I paraphrase what I wrote:

I dusted off my book shelves and found a book on Fortran. On the back it said:

FORTRAN is one of the oldest high-level languages and remains the premier language for writing code for science and engineering applications. (NOTE- The back of the book uses Fortran but the spell checker I am using insists on FORTRAN. As a fan of capital letters, I don't mind going along.)

When was the book written?

The answer was surprising in that it was 2012. (The chapter title was Trick Question or Stupid Question. This was a Trick Question.) I would have thought that FORTRAN was no longer the premier language by then. I also need to dust my bookshelves more often.

David Marcus emailed me the following:

Page 201. Fortran. One clue is that it said "Fortran" rather than "FORTRAN". Fortran 90 changed the name from all upper case. Whether it is the "premier language" depends on what you mean by "premier". It is probably the best language for scientific computing. I used it pretty much exclusively (by choice) in my previous job, which I left in 2006. The handling of arrays is better than any other language I've used. Maybe there are some better languages that I'm not familiar with, but the huge number of high-quality scientific libraries available for Fortran makes it hard to beat. On the other hand, I never wrote a GUI app with it (Delphi is best for that).

In later emails we agreed that Fortran is not used that much (there are lists of most-used languages and neither Fortran nor FORTRAN is ever in the top 10). But what intrigued me was the following contrast:

1) David says that it's the BEST language for Scientific Computing. I will assume he is right.

2) I doubt much NEW code is being written in it. I will assume I am right.

So---what's up with that? Some options

OPTION 1) People SHOULD use Fortran but DON'T. If so, why is that? Fortran is not taught in schools. People are used to what they already know. Perhaps people who pick up new things easily and want to use new things would rather use NEW things than NEW-TO-THEM-BUT-NOT-TO-THEIR-GRANDMOTHER things. Could be a coolness factor. Do the advantages of Fortran outweigh the disadvantages? Is what they are using good enough?

OPTION 2) The amount of Scientific computing software being written is small since we already have these great Fortran packages. So it may be a victim of its own success.

CAVEAT: When I emailed David a first draft of the post he pointed out the following which has to do with the lists of most-used programming languages:

The problem with the lists you were looking at is that most people in the world are not scientists, so most software being written is not for scientists. Scientists and technical people are writing lots of new code. If you look at a list of scientific languages, you will see Fortran, e.g., here and here.

There are several Fortran compilers available. One of the best was bought by Intel some time back and they still sell it. I doubt they would do that if no one was using it. Actually, I think Intel had a compiler, but bought the Compaq compiler (which used to be the Digital Equipment compiler) and merged the Compaq team with their team. Something like that. I was using the Compaq compiler around that time.

One quote from the second pointer I find intriguing. (Second use of the word intriguing. It was my word-of-the-day on my word-calendar).

... facilities for inter-operation with C were added to Fortran 2003 and enhanced by ISO/IEC technical specification 29113, which will be incorporated into Fortran 2018.

I (Bill) don't know what some of that means; however, it does mean that Fortran is still active.

One fear: with it not being taught that much, will knowledge of it die out? We'd be like Star Trek aliens:

The old ones built these machines, but then died and we can't fix them!

Tuesday, July 02, 2019

Local Kid Makes History

The blogosphere is blowing up over Hao Huang's just-posted proof of the sensitivity conjecture, which was one of the more frustrating open questions in complexity.

Huang, an assistant professor in the math department at Emory, settled an open question about the hypercube. The hypercube is a graph on N=2^n vertices where each vertex corresponds to an n-bit string and there are edges between vertices corresponding to strings that differ in a single bit. Think of the set of strings of odd parity: N/2 vertices with no edges between them. Add any other vertex and it would have n neighbors. Huang showed that no matter how you place those N/2+1 vertices in the hypercube, some vertex will have at least n^(1/2) neighbors. By an old result of Gotsman and Linial, Huang's theorem implies the sensitivity conjecture.
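For tiny n, Huang's bound can be sanity-checked by brute force over all subsets of N/2+1 vertices. Here is a minimal sketch of mine (not anything from the paper):

```python
from itertools import combinations, product

def hypercube(n):
    # Vertices {-1,1}^n; neighbors differ in exactly one coordinate.
    verts = list(product([-1, 1], repeat=n))
    nbrs = {v: [v[:i] + (-v[i],) + v[i+1:] for i in range(n)] for v in verts}
    return verts, nbrs

def max_induced_degree(A, nbrs):
    # Largest number of within-A neighbors of any vertex of A.
    Aset = set(A)
    return max(sum(u in Aset for u in nbrs[v]) for v in A)

n = 3
verts, nbrs = hypercube(n)
# Check every subset of 2^(n-1)+1 = 5 of the 8 vertices.
worst = min(max_induced_degree(A, nbrs)
            for A in combinations(verts, 2**(n-1) + 1))
# Huang: some vertex always has degree >= sqrt(n), here sqrt(3) ~ 1.73.
print(worst)  # 2
```

(The minimum of 2 is achieved by taking five vertices of an induced 6-cycle in the 3-cube.)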

I won't go through the shockingly simple proof: the paper is well written, or you can read the blogs I linked to above, or even just Ryan O'Donnell's tweet.

I have nothing more to say than wow, just wow.