Tuesday, May 30, 2017

Google Scholar thinks my Hilbert Number is 1

(I want to thank Lane Hemaspaandra for bringing this to my attention.)

When I google:

google scholar  William Gasarch

I get  this

(I wonder if what you get depends on who you are- I describe below what I get.)

The first entry on it is

Methods of  Mathematical Physics by Courant and Hilbert.

The first page that Google brings up has all papers with Hilbert as a co-author, except for the book

Handbook of discrete and combinatorial mathematics edited by Rosen

I have a short article in that book.

The second page has the paper of Hilbert:

Uber die irreducible ganzer rationaler blah blah

which I have a connection to since Mark Villarino, Bill Gasarch, and Ken Regan have a paper explaining this paper in modern terms here. Could this have confused google scholar into thinking I am Hilbert? Seems unlikely.

In fact, the first two pages are all papers with Hilbert as an author. The third page has

Some connections between bounded query classes and non-uniform complexity by Amir, Beigel, Gasarch.

And after that the pages are a mix of Hilbert's papers and mine, though mostly his since he had so many more papers than I did.

This raises some questions

1) How did this happen? Did I hack google scholar? Many theorists at one time in their lives were excellent programmers- however, I am not one of them.

2) How common is it for google scholar to be this far wrong?

3) If I wanted to get this fixed who would I tell?  In the past whenever I've tried to tell a company something FOR THEIR OWN GOOD I get the same run-around as when I am trying to get information not on their website, so I have STOPPED even trying. Same here?

4) Should I want to get it fixed? I am proud to be affiliated with Hilbert!

Thursday, May 25, 2017

Graduation from the Other Side

I've attended many graduations in my time, mostly as faculty, a couple of times as a student or a brother. This last weekend I attended my first university graduation as a parent as my daughter Annie graduated from Brandeis University in Waltham, Massachusetts. Brandeis has a big graduation ceremony with lots of speeches and then different departments or groups of departments have their own diploma ceremonies with their own speakers and where they give out the actual diplomas.

Brandeis gives out a number of honorary doctorates each year and for the first time gave one to a computer scientist, Turing Award winner Leslie Lamport. Lamport received his PhD at Brandeis in math in 1972, before they had a CS department, but now he has an (honorary) PhD in Computer Science. Lamport gave an eight-minute talk in the School of Science ceremony. But when you are a parent the weekend is about your child, and my daughter didn't graduate from the School of Science, so I didn't see the Lamport talk.

In the main ceremony, Brandeis has not only an undergrad give a speech but also a grad student. Sounds like a crazy idea, but Vivekanand Vimal, Neuroscience PhD, gave what could best be described as performance art. Since I can't find the video of Lamport and you probably don't want to see my videos of Annie, enjoy the new Dr. Vimal's ode to the craziness of the PhD and saving society.

Thursday, May 18, 2017

The Optimizers

Last week the Georgia Tech School of Industrial and Systems Engineering honored the 80th birthday of George Nemhauser and the 70th of Arkadi Nemirovski at an event naturally called NemFest. The Nems are powerhouses in the optimization community and this event drew many of the greats of the field.

In theoretical CS we often take NP-complete as a sign to stop searching for an efficient algorithm. Optimization people take NP-complete as a starting point, using powerful algorithmic ideas, clever heuristics and sheer computing power to solve or nearly optimize in many real-world cases.

Bill Cook talked about his adventures with the traveling salesman problem. Check out his British pub crawl and his tour through the nearly 50,000 US historic sites.

Michael Trick talked about his side job, scheduling MLB baseball games, a surprisingly challenging problem. Like TSP, you want to minimize total travel distance, but under a wide variety of constraints. "There's something satisfying about being at a bar, seeing a game on the TV and knowing those two teams are playing because you scheduled them." Can't say I've had that kind of satisfaction in my computational complexity research.

Tuesday, May 16, 2017

If an ugrad asks `is field X worth studying' the answer is almost always yes

An undergraduate Freshman recently emailed me that he was very interested in Quantum Computing and wanted to know

1) Who on the CS faculty works in QC? (Answer: Andrew Childs, though you should ask him about postdocs, grad students, and Physics faculty in the area.)

2) What are good books on QC for a bright ugrad? I said the following:

QC since Democritus by Aaronson
QC-A gentle introduction by Rieffel and Polak
QC for CS by Yanofsky and Mannucci
QC and QI by Nielsen and Chuang
Some of Scott's blog posts.
Ask Andrew Childs for more.

Go to my webpage of book reviews for SIGACT NEWS here and search for Quantum to find some other books; read the reviews and pick one.

On Amazon, type in quantum computing and see what the reviews say, though they might not be reliable.

There are likely other good books but I do not know of them. (You can leave comments.)

3) Is QC a good topic to get into? I said YES of course. My reasoning is that they would of course LEARN something by studying it.

 But this raises the question: When would I say `that field is not worth studying' ?

1) If they really want to actually do RESEARCH (as opposed to learning the topic without wanting to do research) and the topic is either too dead or too hard.

2) If there was nobody around to help them in that topic. Might still be okay if they are both highly motivated and very smart.

3) If the topic was bogus AND they would learn NOTHING from studying it. Are there topics that are bogus but you still learn from studying them? Does studying astrology seriously teach you some astronomy? Some history? How about alchemy and chemistry? Fine if the student KNOWS that astrology is bogus and alchemy is not correct.

The point is that I really do not want to dampen someone's enthusiasm for a topic.

SO- aside from the reasons above, can you think of any other reason to discourage a student from a topic they are interested in? I ask, as always, non-rhetorically.

Sunday, May 14, 2017

William Tutte (1917-2002)

Today we celebrate our mothers of course, but also the 100th anniversary of the birth of Bill Tutte, best known for his role in decrypting the Lorenz cipher used by the Nazi high command. Tutte also made many important advances in graph theory and algorithms.

For this post, let's look at one very powerful concept, the Tutte Polynomial, which has a rather technical looking definition. Fix a graph G with vertex set V and edge set E, with n = |V|. For a subset A of E, let k_A be the number of connected components of the spanning subgraph (V, A), and let k = k_E be the number of connected components of G itself.

The Tutte polynomial T(x,y) is the sum over all subsets A of E of the quantity (x-1)^(k_A - k) (y-1)^(k_A + |A| - n).

What makes this polynomial interesting? For some fixed values of x and y we get various properties of the graph:

T(2,1) is the number of forests of G.
T(1,2) is the number of connected spanning subgraphs (spanning trees are counted by T(1,1) when G is connected).
T(2,0) is the number of acyclic orientations of G.
T(0,2) is the number of strongly connected orientations.
The value (-1)^(n-k) q^k T(1-q,0) counts the number of proper q-colorings of G.

Computing T can be difficult. Counting the number of 3-colorings is #P-complete, equivalent to counting the number of satisfying assignments of a Boolean formula, so by the coloring formula (take q = 3) even computing T(-2,0) for a given graph G is #P-complete. Leslie Goldberg and Mark Jerrum showed that even computing the sign of the Tutte polynomial, just determining whether it is positive, zero or negative, at certain values is still #P-hard.
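The subset-sum definition translates directly into a brute-force evaluator, exponential in the number of edges but fine for sanity-checking the special values above on tiny graphs. The code below is my own illustrative sketch, not something from the talk or paper:

```python
from itertools import combinations

def tutte(n, edges, x, y):
    """Evaluate T(x, y) by summing (x-1)^(k_A - k) (y-1)^(k_A + |A| - n)
    over all subsets A of the edge set."""
    def components(edge_subset):
        # Number of connected components of (V, A) via simple union-find.
        parent = list(range(n))
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        for u, v in edge_subset:
            parent[find(u)] = find(v)
        return len({find(v) for v in range(n)})

    k = components(edges)   # components of G itself
    total = 0
    for r in range(len(edges) + 1):
        for A in combinations(edges, r):
            kA = components(A)
            total += (x - 1) ** (kA - k) * (y - 1) ** (kA + r - n)
    return total

# Sanity checks on the triangle K3:
triangle = [(0, 1), (1, 2), (0, 2)]
print(tutte(3, triangle, 1, 1))   # 3 spanning trees
print(tutte(3, triangle, 2, 1))   # 7 forests
# Coloring check: (-1)^(n-k) q^k T(1-q, 0) with q = 3 gives 6 colorings of K3.
print((-1) ** (3 - 1) * 3 ** 1 * tutte(3, triangle, -2, 0))
```

Both exponents in the sum are nonnegative, so integer arithmetic suffices (Python's 0**0 == 1 handles the boundary terms).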

This is only a sampling of the many applications of the Tutte polynomial. Let's remember Tutte for creating a single function that captures so much information about a graph and helping to defeat the Nazis. Not a bad life. Must have had a good mother.

Thursday, May 11, 2017

How to Solve It

Today a guest post from Periklis Papakonstantinou, coincidentally not unrelated to Bill's post earlier this week. I'll be back with a special post on Sunday.

I'm teaching in an undergrad program that is half computer science and half business at Rutgers, but the CS part taught there is the real thing (I assume for business too). This term I taught a very theoretical course in cryptography and I realized (1) that the students enjoyed it and (2) that they were lacking basic reasoning skills. I ended up teaching for a few weeks how one can structure basic logic arguments. I am not sure if they appreciated things like the hybrid argument, but I believe I convinced them that without rigorous thinking one cannot think clearly.

So, I decided to teach a much more fun class (hopefully next year) titled "How to solve it" -- à la Pólya. The goal is for students to develop rigorous problem-solving skills. At the same time, I'd like to use this course as an excuse to introduce basic concepts in combinatorics, linear algebra, and theoretical stats. I'm not sure whether the original book by Pólya is appropriate for this, and that's why I thought of reaching out to my peers for suggestions. Any ideas and thoughts on possible texts, topics, or notes would be greatly appreciated.

Sunday, May 07, 2017

Students try to memorize rather than understand! Who knew! (everyone)

Discrete Math. Required for CS majors, taken mostly by Sophomores. Goal is to teach them how to think rigorously. Topics are logic, number theory (not much), induction, sets, functions, relations, combinatorics (includes the Pigeonhole Principle, henceforth PHP), prob, countability, uncountability.

We taught the Pigeon Hole Principle and gave MANY examples and HW of the following type:

Let A be a subset of {1,...,50} of size 10. Show there are two subsets of A that have the same sum.

ANSWER: There are 2^{10} = 1024 possible subsets.

MAXSUM is 41+...+50 = (1+...+50) - (1+...+40) = 50*51/2 - 40*41/2 = 455

MINSUM is 0 (the empty set). So the NUMBER OF SUMS is 456

Since 1024 > 456 there are two subsets of A with the same sum.
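The counting argument above can be checked mechanically. This sketch (the particular set A is my own example, not from the HW) verifies the 1024-vs-456 count and then actually finds two equal-sum subsets:

```python
from itertools import combinations

def equal_sum_subsets(A):
    """Return two distinct subsets of A with the same sum.
    Pigeonhole guarantees they exist for any 10-element subset of {1,...,50}."""
    seen = {}
    for r in range(len(A) + 1):
        for S in combinations(A, r):
            if sum(S) in seen:
                return seen[sum(S)], S
            seen[sum(S)] = S
    return None

A = [3, 7, 12, 19, 23, 31, 36, 42, 47, 50]   # an arbitrary example A
assert 2 ** len(A) == 1024    # number of subsets
assert 1024 > 456             # vs. possible sums {0, 1, ..., 455}

S1, S2 = equal_sum_subsets(A)
print(S1, S2, sum(S1))
```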

The EXAM covered PHP, combinatorics, probability, and induction. Hence they should know (n choose k).

On the HW and in class we NEVER did a problem where we only cared about the subsets of a fixed size. Conceptually this is really the same problem but if you had MEMORIZED  the proof template and tried to apply it you would get it wrong.

I asked the following on the exam:

(20 points) Let A be a subset of {1,...,21} of size 8. Show that A has at least two subsets of size 3 which have the same sum.

ANSWER: There are (8 choose 3) = 56 subsets of A of size 3.
MAXSUM = 19+20+21 = 60, MINSUM = 1+2+ 3 = 6,
So the NUMBER OF SUMS is 60-6+1 = 55.

Since 56 > 55 there are two subsets of A of size 3 that have the same sum.
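As with the earlier problem, the exam question can be checked by brute force; the particular 8-element set here is my own example:

```python
from itertools import combinations
from math import comb

A = [1, 4, 6, 9, 11, 15, 18, 21]   # an arbitrary 8-element subset of {1,...,21}
assert comb(8, 3) == 56            # number of 3-element subsets of A
assert 60 - 6 + 1 == 55            # sums of 3-element subsets lie in {6,...,60}

# Pigeonhole: 56 > 55, so two distinct 3-element subsets must share a sum.
seen = {}
for S in combinations(A, 3):
    if sum(S) in seen:
        pair = (seen[sum(S)], S)
        break
    seen[sum(S)] = S

print(pair, sum(pair[0]))
```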

Grading rubric:

If they got (8 choose 3), that's 3 points. (Many said 2^8; I suspect incorrect memorization.)

If they got the MAXSUM and the MINSUM both right (arith errors: NO penalty) then 3 points.
(Many had MINSUM=0; I suspect incorrect memorization.)

If they knew to use the PHP then 3 points.

If they got all three right then 20 points

So they got 0,3,6, or 20.

Here is the final tally:

0 points: 85

3 points: 190

6 points: 71

20 points: 167

(For the entire exam: 39 100's, 55 90-99, 77 80-89, 77 70-79, 76 60-69, 70 50-59, 47 40-49, 24 30-39, 17 20-29, 3 10-19, 3 1-9)

This all raises the much harder question: how can we get students to UNDERSTAND rather than MEMORIZE?

Telling them: DO NOT MEMORIZE! TRY TO UNDERSTAND!- I did this. Oh well.

Allowing a cheat sheet (which I did) is both good and bad for this issue.

Giving them a much wider variety of problems of this type. Either they would understand OR they would memorize several different templates.

I WELCOME your thoughts on either my grading or on how to get them to try to UNDERSTAND rather than MEMORIZE.

Thursday, May 04, 2017

Summer Conferences

Ahh summer. No classes. Baseball. Opera festivals. Time to focus on research and starting a new book. But, of course, many computer scientists travel the world to various conferences. I went to too many last year and am trying to cut down, but many great options abound.

The STOC 2017 Theory Fest, June 19-23 in Montreal, five days of conference talks, tutorials, invited lectures and so much more. Sanjeev Arora has the details over at Windows on Theory.

The ACM celebrates 50 years of Turing Awards with a special conference June 23-24 in San Francisco. Tim Berners-Lee takes home this year's prize.

The Computational Complexity Conference, that meeting that shares its domain with this blog, holds its annual get together July 6-9 for the first time in Latvia. Latvia gave us Juris Hartmanis, one of the founders of the field. Travel grants available for students and "needy researchers", you don't have to be an author to apply.

Computability in Europe, June 12-16 in Turku, Finland. Economics and Computation, June 26-30 at MIT. Computational Geometry, July 4-7 in Brisbane. ICALP, July 10-14 in Warsaw. Random/Approx, August 16-18 in Berkeley.

If I missed your favorite events, well that's why we have comments.

Monday, May 01, 2017

A Celebration of Computer Science at Harvard in Honor of Harry Lewis's 70th Bday

My adviser Harry Lewis turned 70 recently. I blogged about how things have changed since I got my PhD in this post. I now post on

A celebration of Computer Science at Harvard in Honor of Harry Lewis's 70th Birthday

(for video of all talks in order see: here)

The title was accurate: most of the speakers (1) were Harvard ugrads, (2) went on to do great things, and (3) were inspired by Harry Lewis. The talks were mostly non-technical and fun! Margo Seltzer, now a prof at Harvard (whom I TAed many years ago in Aut Theory), organized the event, though she gave lots of credit to her helpers.

0) Marty Chavez was one of Harry Lewis's teaching assistants for a CS programming course and recalled Harry's harsh (but fair) grading policies on code, which he later saw the wisdom of.

1) Marty Chavez never thought he would use the fact that HALT is undecidable (I think I might have been his TA for that course). But he found himself telling a group of egghead economists who wanted to VERIFY all code to avoid future crashes that... it can't be done. Actually, while that is true, attempts to verify some of it might be a good idea.

2) James Gwertzman noted that:

in 1991 10% of all ugrads at Harvard  had email, and there was no web

in 1995 100% of all ugrads at Harvard had email, and there was web (though primitive).

He then pointed out that a company can do very well by using LOTS of packages that are already out there to use. He named Slack, Salesforce, Trello, jeaking, MailChimp, Greenhouse, Phabricator, Pingdom (it just deals with pings- really!), Datadog, Stripe, StatusPage, Zendesk.

The future will be serverless and codeless.

4) Guy Steele gave the most technical talk and it was, as the kids say, awesome (do adults still say `as the kids say' ?) Here is a version of the talk:

A Logical Concern

It's about how papers at POPL and some other conferences have been informally using a language to specify protocols, and by now it's all bent out of shape. There is also some nice history of math embedded in the talk, of which I'll say one thing: one way to group terms together is by placing a bar over them. The most common use of this now is the square root sign, which didn't always have that bar over the quantity.

Guy's talk even had some slides about his notebooks from Harvard, from a course Harry taught back in 1974 (the first course Harry taught at Harvard). Part of the course was on the sequent calculus, which relates to Guy's work and the current paper. Guy's notebook had both material relevant to the current paper and doodles of things like a picture of a church next to Church's thesis.

The paper was very labor intensive, since you can't just use a search program to look for some of the notations he was talking about, for example overbar and underbar. So he had to go through ALL of the POPL proceedings (and a few others) by hand. In 2017. Will that ever be easier?

He also had two quotes about proofs:

It's not enough to prove something. You must seduce people into believing it.

One man's truth is another man's cold broccoli 

I leave it to you to figure out who these quotes are credited to (different people).

5) Stuart Shieber's talk was WWHD (What Would Harry Do).


Promote character over knowledge (see Harry's book Excellence Without a Soul: How a Great University Forgot Education)

Pursue the right over the popular

A later talk by Rebecca Nessin told of some things Harry did as Dean that were RIGHT but NOT POPULAR:

Housing at Harvard used to work like this: you chose the house (dorm complex) you lived in. When I was there, Dunster was KNOWN to be the math house, and others had reputations that were somewhat accurate. Harry made housing RANDOMIZED (did he use a hardness result to derive a pseudorandom generator?). His goal was to increase diversity: people should get to know other kinds of people who are not like themselves.

He made policies to curb underage drinking.

He raised standards for when students get WARNINGS about their performance in classes.

These were all unpopular BUT the right thing to do.

6) There was a panel discussion on teaching.  I'll save this for a later blog post since my random thoughts on this may make this post longer than it should be. I WILL say it was excellent.

7) Rebecca Nessin is the head of online course development at Harvard. The courses (1) have open enrollment, (2) have no dedicated faculty- instructors are all borrowed from the usual faculty, and (3) are sometimes online. She developed a course where the students ARE avatars. This helps with shy students, and text-based conversation lets students get out coherent, complete thoughts. (CONTRAST: I find myself saying of my students' questions, ``that was a random sequence of math words''.)

That was the first part of her talk.

THEN she began talking about her journey through Harvard and Harry's place in it. Unlike her fellow students, she did not know what she wanted to do. She took random courses (Ancient Greek! Multivariable calculus!). After graduating she still did not know what she wanted to do, so she went to... Harvard Law School. While there she took a CS course (What! You can do that?) and soon after took Aut Theory with Harry. Her PhD was with Stuart Shieber, with Harry on the committee, and it has lots of grammars in it.

Then she told a great story: There was a discussion of raising the min age that someone can get an ugrad  degree at the Harvard extension school. Harry asked who this would affect. The answer was

A small number. Students who can't go to a residential full-time school for some reason. This includes competitive athletes, performance artists, deployed military personnel, youthful entrepreneurs, and people with disabilities.

to  which Harry replied:

These are the oddballs. Are we trying to say there is no room for oddballs at Harvard?

Rebecca ended her talk by pointing out that, with her crooked path to where she is now, she is an oddball, and that all of the oddballs should celebrate that they will also have a place at Harry Lewis's Harvard.

8) Cliff Young declared Moore's Law dead (some disagree- see here) and said the solution is to go back to special-purpose machines- which, contrary to popular belief, do NOT just do one thing and ARE programmable. He also talked about Amdahl's law, which is about the limits of parallelism, and about how parallelism research seems to fight the same battle over and over again (RISC vs. CISC, etc.).
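Amdahl's law has a one-line formula: if a fraction p of a program's work can be sped up by a factor s and the rest stays serial, the overall speedup is 1/((1-p) + p/s). A minimal sketch (the function name and sample numbers are my own illustration, not from the talk):

```python
def amdahl_speedup(p, s):
    """Amdahl's law: overall speedup when a fraction p of the work
    is sped up by a factor s and the remaining (1-p) stays serial."""
    return 1 / ((1 - p) + p / s)

# A program that is 90% parallelizable never beats 10x overall,
# no matter how many processors you throw at it.
print(amdahl_speedup(0.9, 4))        # about 3.08 on 4 processors
print(amdahl_speedup(0.9, 10**9))    # approaches the 10x ceiling
```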

9) Danielle Feinberg from Pixar had the following quote about animation:

Long hair is an unsolved problem

But they did solve it (for the movie The Incredibles). She also pointed out that they can sometimes spend lots of time, energy, and creativity on a scene that will take 3 seconds, or on something just in the background.

Much like Rebecca, Danielle also appreciated Harry's appreciation for oddballs.

10) Harry Lewis- He spoke some about his career but also about CS in general.

His career and where he is now are sort of an accident. He was originally going to get his PhD in systems, but theorists got out faster.

Computer Science has changed a lot in the last X years- but the change he remarked on the most is that CS is at

The twilight of the Amateur Era

I'll let you debate what that means.

11) Later at the reception, Bill Gates and Mark Zuckerberg sent their recorded greetings, though only Mark Z's is on the YouTube video- towards the end. It's short, so rather than summarize it I urge you to view it yourself.