Tuesday, October 11, 2016

Ideal courses for a comp sci dept/I'm glad we don't do it

I once heard it said:

In our data structures course we read Knuth and ignore the proofs.

In our algorithms course we read Knuth and ignore the code.

And indeed, there are many topics where the theory-part is in one course and the programming is in another.

With that in mind, here is a different way to offer courses (some depts already do some of this).

1) A course in Algorithms AND Data Structures. Actually I have seen this title on courses, but it's usually just a theory course. I mean you REALLY PROVE and CODE. Might need to be a year long.

2) Crypto AND Security. You learn crypto and security together and do both. If you only did the crypto relevant to security it might be a semester long, and in fact it might already be being done. But I am thinking of really combining these courses: code and prove. Might be a year long.

3) Formal Lang Theory AND Practice of Compilers. You do DFAs, NDFAs, CFGs, but also do compiler design. If you also want to do P, NP, and decidability (spell check thinks that decidability is not a word!) then it might not quite connect up with compilers; then again, it might, with theorems like: CFG equivalence is undecidable. Might be a year long.

4) Machine learning AND Prob/stat.

PROS: Theory and Practice would be more united.

CONS: Having year-long courses is hard for the students scheduling their courses. Would having just one semester of any of the above courses make sense?

CONS: Harder to find someone to teach these courses. I'll confess that I prefer to teach formal lang theory without any programming in it.

CAVEAT: I think theorists coming out now know more of the practical counterpart of their area than I did and perhaps than people of my generation did.

CAVEAT: A much less radical thing to do is to put more security into crypto, more about compilers into Formal Lang Theory, etc. But that's a bit odd now since we DO have a course in security, and a course in compilers. Even so, it seems like a good idea, and I know many schools are doing this.


Friday, October 07, 2016

Typecasting from Avi's 60th Celebration

Lance: Live from the Institute for Advanced Study in Princeton, New Jersey, Bill and I are doing a typecast from the Avi Wigderson 60th Birthday Celebration. Hi Bill.


Bill: Hi Lance. The last time I saw you was at Mike Sipser's 60th birthday celebration, and the next will be for Eric Allender and Michael Saks.


L: New Jersey again. The Garden State.


B: New Jersey rocks! After all Bruce Springsteen is from here.


L: So was I.


B: You both rock! You are better at math. But I don't want to hear you sing. I got in last night. How was the first day of the conference?


L: There are two kinds of talks at a celebration like this. Some people survey how Avi has shaped the field and their research in particular. Scott Aaronson gave a great talk titled "Avi's Permanent Impact on me" on how a survey talk on the permanent problem eventually led to Scott's work on boson sampling.




Most people, though, are giving talks on their latest and greatest research. At least they have some good Avi jokes to start their talks.


B: So what category did Oded Goldreich's "Canonical depth-three Boolean circuits for multi-linear functions, Multi-linear circuits with general gates, and matrix rigidity" talk fit in?


L: Oded did say the title was a joke and he did start with an Avi story. Actually it was a Silvio and Shafi story about when to leave for the airport.


B: I liked Alex Lubotzky's ruminations on pure math versus computer science. He went into pure math thinking computer scientists had to know how to use computers. Had he known that they didn't, he might have gone into computer science. He likes that his work on expanders has "applications" to computer science.


L: How did you like the rest of his talk?


B: I don’t know--it’s still going on.


L: Once I gave a talk at a logic meeting about generic oracles. People came up to me afterwards so excited that logic had such applications in computer science.


B: I liked Noga Alon's talk "Avi, Graphs and Communication". There is a fixed graph G on k nodes with k players, each with an n bit string. How many bits do they need to communicate over the edges to determine if all the strings are identical? Basically, if the graph is 2-connected you need half of the trivial kn bits.


[Intermission]


L: A day has passed and we find ourselves talking again on Friday at IAS. It’s a beautiful day and we are sitting on lawn chairs on the IAS grounds. This is how Einstein must have felt.


B: Although he wouldn't be skipping a talk on quantum physics. We are listening in through the magic of the Internet.


I liked Noam Nisan's talk on the complexity of pricing. Perhaps surprisingly, a seller can do better bundling two independent items than using individual prices. I plan to use that example in my fair division class. I may even do a short blog post.


L: I can't wait. Madhu gave a great talk on low-degree polynomials and how error-correcting codes have gone from obscurity in theoretical computer science in the early 90's to a standard tool just a few years later. And we have Madhu (inspired by Avi) to thank for that.


B: Eyal Wigderson, son of you know who, is a grad student studying neuroscience. He talked on "Brains are Better Computers than Computers", which I found flattering.


L: He was talking about rats, not your brain Bill.


B: Should I feel complimented or insulted?


L: Yes.


B: It's kind of nice to have a birthday person's child talk at a conference like this. In TCS there are several: Paul Valiant talked at Les Valiant's 60th, Tal Rabin talked at Michael Rabin's 80th, father and son Edmonds are here, and Molly Fortnow will surely talk at Lance Fortnow's 60th. I remember when she interviewed us.


L: Let’s leave it at that.


B: Silvio Micali "knighted" Avi and used Byzantine agreement to improve bitcoin-like chains. I was enlightened to learn bitcoin mining is so expensive that only five companies do it regularly.


L: When people claim to have solved P = NP I ask them to mine a few bitcoins and get back to me. I have yet to see any bitcoins.


B: I ask them to find the Ramsey number of 5. I'm still waiting.


L: On that note time to say goodbye until we meet again.


B: Allender/Saks: New Jersey again! Take us out, Lance.


L: In a complex world, best to keep it simple, and


B/L: HAPPY BIRTHDAY AVI!


[Photo: Avi mosaic]

Tuesday, October 04, 2016

A Second Order Statement true in (R,+) but not (Q,+)


In my last post I asked

Is there a first-order statement true in (R,+) but false in (Q,+)?

Is there a second-order statement true in (R,+) but false in (Q,+)?


First order: No.

One reader said it followed from the Löwenheim-Skolem Theorem. I can believe this but don't see how.

One reader said the theory of (Q,+) is complete. True, but I would like a reference.

I would prove it by a Duplicator-Spoiler argument.

Second order: Yes

Some readers thought I had < in my language. I do not, so I think that examples using < do not work, unless there is some way to express < in my language.

Some readers thought that second order meant you could quantify over functions. My intent was just to be able to quantify over sets.

Some had the right answers. The one I had in mind was:

There exist A and B such that A and B are subgroups, each with at least two elements, that intersect only at 0.

Here is a writeup.
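In a bit more detail, here is a sketch of one way to write the sentence (using, as in the original question, the convention that 0 abbreviates the x with (∀ y)[x+y=y]):

(∃ A)(∃ B)[ Sub(A) ∧ Sub(B) ∧ (∃ x)(x ∈ A ∧ x ≠ 0) ∧ (∃ x)(x ∈ B ∧ x ≠ 0) ∧ (∀ x)[(x ∈ A ∧ x ∈ B) → x = 0] ]

where Sub(A) abbreviates the first-order statement that A contains 0 and is closed under + and under additive inverses. This is true in (R,+): take A = Q and B = √2·Q. It is false in (Q,+): if a = p/q is a nonzero element of A and b = r/s is a nonzero element of B, then (qr)a = (ps)b = pr is a nonzero element of both A and B (integer multiples stay in a subgroup, since they are just repeated sums and inverses).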






Thursday, September 29, 2016

Give a second order statement true in (R,+) but false in (Q,+) or show there isn't one

Here is a logic question I will ask today and answer next week. Feel free to leave comments with the answer: you may come up with a different proof than mine, and that would be great!

Our language will have the usual logic symbols, quantification over the domain, quantification over subsets of the domain (so second order), the = sign, and the symbol +.

Examples of sentences:

(∀ x)(∀ y)[ x+y=y+x]

True over Q, R, N. False in S_n for n ≥ 3 (the group of permutations of n elements).

(∃ x)(∀ y)[ x+y=y]

True in Q, R by taking x = 0. Not true in {1,2,3,...}.

Let's assume it is true and call that x 0.

(∀ x)(∃ y)[x+y=0]

True in Q, R, Z, not true in N.
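As a warm-up for the questions below: some natural properties are first-order expressible but do not separate these structures. For example, divisibility by 2,

(∀ x)(∃ y)[ y+y=x ]

is true in both (Q,+) and (R,+) (though false in (Z,+)), so it is of no help here.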

QUESTION ONE: Is there any sentence in the first order theory that is TRUE over (Q,+) but
FALSE over (R,+)?

QUESTION TWO: Is there any sentence in the second order theory that is TRUE over (Q,+)
but false over (R,+)?


Tuesday, September 27, 2016

Who's Afraid

The playwright  Edward Albee passed away earlier this month at the age of 88. I had never actually seen any of his plays so I took the opportunity to watch the 1966 movie Who's Afraid of Virginia Woolf? based on the play.

The play has four characters: George, a history professor in his forties, and his wife Martha, the daughter of the university's president. Joining them for a late get-together are a young biology professor and his wife. The movie had an incredible cast: Richard Burton, Elizabeth Taylor, George Segal and Sandy Dennis, all nominated for Academy Awards. The women won.

The movie covers many themes, mostly revolving around the disintegrating relationship between George and Martha. George did not live up to his potential, as a drunk Martha did not hesitate to point out in front of everyone.
I actually fell for him. And the match seemed practical too. For a while Daddy thought George had the stuff to take over when he was ready to retire. Anyway, I married the S.O.B. I had it all planned out. First he'd take over the History Department, then the whole college. That's how it was supposed to be! That was how it was supposed to be. All very simple. Daddy thought it was a good idea too. For a while!
Until he started watching for a couple of years and started thinking it wasn't such a good idea. That maybe Georgie-boy didn't have the stuff! That he didn't have it in him! George didn't have much push. He wasn't aggressive. In fact, he was sort of a flop! A great big, fat flop!
So here I am, stuck with this flop, this bog in the History Department. Who's married to the president's daughter who's expected to be somebody. Not just a nobody! A bookworm who's so goddamn complacent he can't make anything out of himself.
The movie reflects an earlier time in academics, when all the professors were white males and success was measured by taking charge of a department.

Nevertheless, Albee captures a fear many academics have: that one day we may wake up and realize we've become that bog. Stuck in our jobs because we can't afford to give up tenure, but just going through the motions. It's a fear that motivates us to continually try to do new things and achieve new heights. But it's also a reminder that academics is a long career, and one that could bog down before we even notice.

Edward Albee wrote a play that is incredibly uncomfortable to watch yet impossible not to. It's so rare to see that in mainstream plays or movies today.

Thursday, September 22, 2016

Boris Trakhtenbrot (1921-2016)

I woke up this morning to two pieces of news. Subhash Khot has just been named a MacArthur Fellow, the "genius" award, for his work on unique games. But I also just learned about Monday's passing of the great Russian theorist Boris Trakhtenbrot at the age of 95.

Trakhtenbrot has a number of important results in automata theory, model theory and logic, to name just a few areas. In computational complexity we know him best for the Gap Theorem, which he and Allan Borodin proved independently of each other. Roughly, the Gap Theorem states that for any computable f there is a computable time bound t such that DTIME(t) = DTIME(f(t)): every problem solvable in time f(t) can also be solved in time t. For example, there is a time bound t such that DTIME(t) = DTIME(2^t). This doesn't violate the time hierarchy since t may not be time-constructible. There is nothing special about time here; it works for space or any abstract complexity measure.
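A sketch of one standard formulation, in the notation above: for every total computable g with g(n) ≥ n for all n, there is a total computable t such that

DTIME(t(n)) = DTIME(g(t(n))).

Taking g(n) = 2^n gives a t with DTIME(t(n)) = DTIME(2^(t(n))); the catch, as noted above, is that this t need not be time-constructible, so the time hierarchy theorem says nothing about the gap between t and 2^t.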

Borodin and Trakhtenbrot worked independently because they sat on different sides of the iron curtain during the cold war, with very little communication in between. Boris Trakhtenbrot wrote A Survey of Russian Approaches to Perebor (Brute-Force Searches) Algorithms (PDF) that traces this early history of Russian theory. He didn't hold back discussing some of the academic politics in Russia that stifled research into computational complexity, and gave us a window into the Russian scientific community in the mid-20th century.

Thankfully the iron curtain has been replaced by a global internet and we can do science together. Let's hope it stays that way.

Wednesday, September 14, 2016

Academic Rankings Foster Competition

US News and World Report released their undergraduate rankings earlier this week, a time for schools to brag. US News and World Report used to publish an actual weekly news magazine; now they mostly just focus on rankings. Besides various categories of undergrad institutions, USN&WR ranks engineering and business programs. There are many other ranking systems of varying quality, but in the US we take the USN&WR rankings the most seriously.

Computer Science does not get an undergraduate ranking. Computer engineering does, which is not the same. CS does get rankings as a PhD program, last time in 2014. I've posted on rankings in 2005, on the failed NRC rankings of 2010, on using metrics for rankings, and Bill had his own so-called non-controversial thoughts on rankings.

In the September CACM, Moshe Vardi wrote his editor's column entitled Academic Rankings Considered Harmful
Academic rankings, in general, provide highly misleading ways to inform academic decision making by individuals. An academic program or unit is a highly complex entity with numerous attributes. An academic decision is typically a multi-objective optimization problem, in which the objective function is highly personal. A unidimensional ranking provides a seductively easy objective function to optimize. Yet such decision making ignores the complex interplay between individual preferences and programs' unique patterns of strengths and weaknesses. Decision making by ranking is decision making by lazy minds, I believe.
No potential grad student should decide based solely on rankings, but neither can we expect them to solve a highly complex, multi-objective, highly personal optimization problem over all 266 PhD-granting CS departments. They will find ways to narrow down their list of schools somehow, and a reasonable independent ranking of CS departments can certainly help.

More importantly, rankings cause us to compete against each other. Every CS department wants to raise its ranking (or stay on top), uses that goal to work on strengthening the department, and uses rankings to make the case to upper administration and alumni for the resources needed to continue to grow. By the nature of rankings, not everyone can rise up, but we all get better in the process.

Saturday, September 10, 2016

Noam Nisan wins the Knuth Prize

Noam Nisan will receive the 2016 Donald E. Knuth Prize, the highest honor in theoretical computer science, for his work in computational complexity and algorithmic game theory. He'll receive the award at the upcoming FOCS conference.

I've known Noam since we were fellow grad students at Berkeley in 1985 and we have become good friends and colleagues. Noam Nisan started his career in computational complexity, and Luca has a post about several of his seminal works in derandomization, interactive proofs, communication and circuit complexity. I'll focus this post on Nisan's 1992 paper with Mario Szegedy, On the degree of boolean functions as real polynomials.

Let f be a Boolean function on n variables and p an n-variate polynomial over the reals, and suppose for every x in {0,1}^n, |f(x)-p(x)| ≤ 1/3. Nisan and Szegedy show that the decision tree complexity of f is bounded by a polynomial in the degree of p. The decision tree complexity of a function is the number of bits of x one has to query to determine whether f(x) is 0 or 1.
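In symbols (a sketch; the shorthand deg~(f) for approximate degree and D(f) for deterministic decision tree complexity is introduced here for convenience):

deg~(f) = min{ deg(p) : |f(x) - p(x)| ≤ 1/3 for all x in {0,1}^n }

and Nisan-Szegedy show D(f) ≤ poly(deg~(f)). In the other direction, a depth-D(f) decision tree yields an exact representing polynomial of degree at most D(f), so deg~(f) ≤ D(f), and the two measures are polynomially related.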

The theorem didn't have an immediate application but soon afterwards I told Noam we found a result that followed from his paper. His ears perked up until I told him the actual result (if P = PSPACE then P = AWPP relative to a Cohen generic oracle).

Later on, Nisan-Szegedy would have direct applications to quantum computing, showing that quantum, randomized and deterministic decision tree complexity for total functions are polynomially related. Just last STOC we saw two papers giving tighter bounds on these relationships.

In 1997 Noam Nisan walked away from complexity and soon thereafter became one of the founding players in algorithmic game theory, co-organizing the 2001 DIMACS workshop that would kickstart this field. He won the 2012 Gödel Prize for his early work on algorithmic mechanism design with Amir Ronen.

Nisan and Szegedy begat one of my most frustrating open questions on the relationship between rational functions and decision tree complexity. I have an application for it but I don't think you really want to know.