This week I'm at the CRA Snowbird conference, the biennial meeting of CS chairs and other leaders in the field. In 2012 much of the discussion focused on MOOCs. This year the challenge facing most CS chairs is booming enrollments in CS courses. A nice problem to have, but a problem nevertheless.

Last night we had a broad discussion about the burgeoning number of students. Ed Lazowska showed his NCWIT slides giving anecdotal evidence. It's too early for a complete survey of CS departments, but hardly anyone in the audience doubted that enrollments were going up by double (or triple) digit percentages. Not only is there an increase in majors, but also in the number of non-majors, minors and others taking upper-level courses.

At Georgia Tech it seems every engineering student wants to minor in CS.

We all have to deal with the increase in the short term. Depending on the institution, we have more classes, larger classes, enrollment caps, more instructors, PhD lecturers, teaching postdocs, automated grading, undergrad and MS TAs, and other departments taking up some of the load. All of this could hurt the undergrad experience, but with careful planning we can mitigate those effects.

What's driving the increase? Some suggested the now-ubiquitous role of computing and society's more positive opinion of geeks. More likely, though, the driver is jobs: the great need for computer scientists, and not just at computing companies, combined with the limited job prospects for those graduating with non-technical majors.

Is the big shift in enrollment a permanent change or just another part of the boom-and-bust cycle in CS? This is more than a theoretical question: a permanent change makes for a better argument with deans and provosts to increase faculty sizes in computer science. There are some signs of "this time is different," such as the increase in non-majors in upper-level courses, but the argument will be a hard sell unless the increase is sustained for several years. One good argument: if you fit a trend line through enrollments since the 70's, you get a consistent 10% yearly increase averaged over booms and busts.
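That trend-line argument is easy to sketch in code. The snippet below fits a least-squares line to log-enrollments and reads off the implied yearly growth rate; the enrollment figures are made up for illustration, not actual survey data.

```python
import math

# Hypothetical (year, undergrad CS enrollment) figures, chosen to be
# illustrative only; they are not actual CRA/Taulbee numbers.
data = [(1975, 10000), (1985, 30000), (1995, 60000),
        (2005, 140000), (2013, 320000)]

# Least-squares fit of log(enrollment) against year: a slope of b means
# enrollment grows by a factor of e^b, i.e. roughly (100*b)% per year.
xs = [year for year, _ in data]
ys = [math.log(e) for _, e in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
print(f"fitted yearly growth: {(math.exp(b) - 1) * 100:.1f}%")
```

For these invented numbers the fit comes out near 10% a year, which is the shape of the argument: the booms and busts average out to steady exponential growth.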

# Computational Complexity

Computational Complexity and other fun stuff in math and computer science from Lance Fortnow and Bill Gasarch

## Tuesday, July 22, 2014

## Thursday, July 17, 2014

### Elfdrive

New York Times, dateline June 11, 2019

With a near record-setting investment announced last week, the self-driving car service Elfdrive is the hottest, most valuable technology start-up on the planet. It is also one of the most controversial.

If you haven't guessed, all I did was make minor edits to a Farhad Manjoo piece on Uber. My point is that it all fits: there really isn't that much difference between a car driven by a magical AI elf and one driven by a real person, if it's all controlled by an app. You can start to see the effects of self-driving cars, the good and the bad, from what's going on now with ride-sharing services. The big difference with self-driving cars is that the complaints won't come just from the taxi drivers, but from the truckers and delivery people as well.

The company, which has been the target of protests across Europe this week, has been accused of a reckless attitude toward safety, of price-gouging its customers, of putting existing cabbies out of work and of evading regulation. And it has been called trivial. In The New Yorker last year, George Packer huffed that Elfdrive typified Silicon Valley’s newfound focus on “solving all the problems of being 20 years old, with cash on hand.”

It is impossible to say whether Elfdrive is worth the $117 billion its investors believe it to be; like any start-up, it could fail. But for all its flaws, Elfdrive is anything but trivial. It could well transform transportation the way Amazon has altered shopping — by using slick, user-friendly software and mountains of data to completely reshape an existing market, ultimately making many modes of urban transportation cheaper, more flexible and more widely accessible to people across the income spectrum.

Elfdrive could pull this off by accomplishing something that has long been seen as a pipe dream among transportation scholars: It has the potential to decrease private car ownership.

In its long-established markets, like San Francisco, using Elfdrive every day is already arguably cheaper than owning a private car. Elfdrive says that despite dust-ups about “elfpricing” at busy times, its cheapest service is usually 30 percent less expensive than taxis.

The competition between Elfdrive and its rivals is likely to result in more areas of the country in which self-driving becomes both cheaper and more convenient than owning a car, a shift that could profoundly alter how people navigate American cities.

Over the next few years, if services like Elfdrive do reduce the need for private vehicle ownership, they could help lower the cost of living in urban areas, reduce the environmental toll exacted by privately owned automobiles (like the emissions we spew while cruising for parking), and reallocate space now wasted on parking lots to more valuable uses, like housing.

Paradoxically, some experts say, the increased use of self-driving cars could also spawn renewed interest in and funding for public transportation, because people generally use taxis in conjunction with many other forms of transportation.

In other words, if Elfdrive and its competitors succeed, it wouldn’t be a stretch to see many small and midsize cities become transportation nirvanas on the order of Manhattan — places where forgoing car ownership isn’t just an outré lifestyle choice, but the preferred way to live.

## Sunday, July 13, 2014

### What to call the top and bottom part of (n choose k)

In my last post I asked for candidate names for the top and bottom parts of (n choose k). Here are the candidates, my comments on them, and ... the winner!


- Top part: Degree, Bottom part: Index.
- Top part: Bino, Bottom part: Mial
- Top part: Numerator, Bottom part: Denominator
- Top part: Outcomes, Bottom part: Possibilities
- Top part: Binomerator, Bottom part: I've got nothing
- Top part: *, Bottom part: *
- Top part: Biponendo, Bottom part: Bividendo
- Top part: Choosand, Bottom part: choosee
- Top part: Set size, Bottom part: Subset size.

While choices 4,8,9 are tempting along those lines, the winner is

Numerator/Denominator

Why? One of the people who suggested it gave a pointer, which went to calling the top and bottom parts of the Legendre symbol Numerator and Denominator. And that's just it: there are several other math things that have a top and bottom part. We could try to find a name for each one, OR just use Numerator and Denominator for all of them. That seems to work. SO, next time you write a paper and need to refer to the top part of a binomial coefficient or a Legendre symbol or something else, use the terms Numerator and Denominator.

CODA: 'The Denominator' could be an Arnold Schwarzenegger character: a math teacher by day, a crimefighter by night. Catchphrase: I'm going to put you under!

## Wednesday, July 09, 2014

### Is there a word for the top part of a binomial coefficient?

Consider the sequence

x/y, (x+1)/y, (x+2)/y, ..., (x+z)/y

One could say: *in this sequence of fractions the numerators go through z+1 consecutive numbers.*

Now consider the sequence

(x choose y), (x+1 choose y), (x+2 choose y), ..., (x+z choose y)

One could say: *in this sequence of binomial coefficients the top-part goes through z+1 consecutive numbers.*

Is there a better way to say this? That is, is there a term for the top-part of a binomial coefficient? Or for that matter the bottom part? I have not been able to find one on the web. Hence I propose a contest:

- Leave as a comment a proposed name for the top-part and for the bottom-part of a binomial coefficient.
- If you find a website that has some official name, leave that as a comment.
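For concreteness, the kind of sequence in question is easy to generate; the function name below is my own invention, and Python's `math.comb` stands in for the binomial coefficient.

```python
from math import comb

def top_part_sequence(x, y, z):
    """Binomial coefficients whose top part runs x, x+1, ..., x+z while
    the bottom part stays fixed at y. (The name is made up.)"""
    return [comb(x + i, y) for i in range(z + 1)]

print(top_part_sequence(5, 2, 3))  # [10, 15, 21, 28]
```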

## Monday, July 07, 2014

### Favorite Theorems: Compressing the Local Lemma

Not only did Moser give a near optimal construction, but he did so in my favorite STOC talk of the decade.

A Constructive Proof of the Lovász Local Lemma by Robin Moser

The Lovász local lemma informally states that if you have a large set of events with limited dependence and they individually have a reasonable chance of happening, then there is a positive probability that they all happen. Moser focused on the special case of Boolean formula satisfiability: if we have a k-CNF formula φ where each clause shares a variable with at most r other clauses, with r < 2^k/8, then φ is satisfiable. Note the result does not depend on the number of variables or clauses.

Moser gave this simple algorithm and a proof using entropy that I recognized as a Kolmogorov complexity argument; it so excited me that I immediately wrote up the Kolmogorov proof as a blog post.

Moser's paper kickstarted a whole research agenda, with tight bounds using even simpler algorithms (though more complex proofs) and various generalizations. A whole tutorial day of STOC 2014 focused on the local lemma, describing research that directly or indirectly came from Moser's most beautiful construction.
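The algorithm really is simple. Here is a minimal sketch of the resampling idea, in the later Moser-Tardos form: while some clause is violated, resample its variables uniformly at random. The clause encoding and function name below are my own choices, not from the paper.

```python
import random

def resample_sat(clauses, num_vars, seed=0):
    """clauses: list of tuples of literals; literal v > 0 means variable v
    must be true, v < 0 means variable |v| must be false."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, num_vars + 1)}

    def violated(clause):
        # A clause is violated when every one of its literals is false.
        return all(assign[abs(l)] != (l > 0) for l in clause)

    while True:
        bad = [c for c in clauses if violated(c)]
        if not bad:
            return assign
        # Resample every variable of one violated clause at random.
        for lit in bad[0]:
            assign[abs(lit)] = rng.random() < 0.5

# A small 3-CNF where each clause overlaps only a few others.
phi = [(1, 2, 3), (-1, 4, 5), (-2, -4, 6), (3, -5, -6)]
a = resample_sat(phi, 6)
print(all(any(a[abs(l)] == (l > 0) for l in c) for c in phi))  # True
```

The surprise in Moser's proof is that under the local lemma's condition this naive loop terminates quickly with high probability; the entropy/Kolmogorov argument shows a long run would compress the random bits consumed.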

## Wednesday, July 02, 2014

### Four Centuries of Logarithms

I just returned from visiting my former student Rahul Santhanam in Edinburgh. The National Museum of Scotland has an exhibit on logarithms, first published in a book by John Napier in 1614.

Napier invented logarithms to make calculations like multiplication, division and exponentiation easier, using identities like log(ab)=log(a)+log(b). Logarithmic tables and slide rules came soon thereafter. Slide rules became the standard way to do mathematical computations for your typical scientist/engineer until reasonably priced calculators appeared in the late 70's. My chemistry teacher in 1980 taught us how to use a slide rule, even though we all had calculators, and I even (foolishly) tried using my father's slide rule on a chemistry test.

The exhibit struggled to find current uses for logarithms, mentioning only the Richter scale. In theoretical computer science we use logarithms all the time. Here's an incomplete list off the top of my head.


- As part of a running time usually as a result of divide and conquer. Sorting, for example, takes Θ(n log n) comparisons.
- The size of a number, pointer or counter. To count up to n requires log n bits of storage.
- The representation of a small amount of space as in the complexity classes L and NL.
- To capture entropy, coupon collector and other topics in probability and information.
- Roughly capturing the sum of 1/i, or exactly capturing the integral of 1/x.
- The inverse of exponential growth.
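Two items from the list can be checked numerically; the value of n below is chosen arbitrarily for illustration.

```python
import math

n = 1_000_000

# Bits needed to write n in binary: ceil(log2(n + 1)).
bits = n.bit_length()

# The harmonic sum 1 + 1/2 + ... + 1/n tracks the integral of 1/x,
# which is ln n (the gap converges to Euler's constant, about 0.577).
harmonic = sum(1 / i for i in range(1, n + 1))
print(bits, round(harmonic, 3), round(math.log(n), 3))  # 20 14.393 13.816
```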

Thanks, John Napier, for the logarithm and for making our lives just a little less linear.

## Sunday, June 29, 2014

### 3000 lunches

Last week fortune smiled on two of my friends:


- Ken Regan made the cover of Chess Life for his work on catching chess cheaters. (See here.)
- Jacob Lurie, whom I mentored when he was a high school student from 1993 to 1995, won $3,000,000 for doing pure math. I am not kidding. See here, here, and here. This is a relatively new award called the Breakthrough Prize.

$3,000,000 is the largest amount of money for prizes of this sort (the Nobel is a paltry $1,200,000). Even so, I had not heard of the award until there was a math one. I wonder if it will become a household name like the Nobel has.

A few weeks ago I told my student Tucker, who is going to DigiPen, a four-year college with a BIG emphasis on programming computer games, that he is more likely to become a millionaire than my student Sam, who is going to CMU to study pure math. I could be wrong.

I taught Jacob lots of recursion theory, and he won the Westinghouse Award with a paper on Recursive Surreal Numbers; however, the work was all his own and didn't use much of what I taught him. After that I made a policy that for every $1000 a high school student of mine wins, I get a free lunch. I emailed Jacob, and while he's not sure he'll give me 3000 lunches, he will treat me to lunch next time I am in Boston.

To quote the article, he won it for

Foundations of higher category theory and derived algebraic geometry for the classification of fully extended topological quantum field theories; and for providing a moduli-theoretic interpretation of elliptic co-homology.

I know what some of those words mean.

## Thursday, June 26, 2014

### Computer Dating

My brother went through his old stuff cleaning out his apartment in New York and stumbled upon the 1981 computer dating questionnaire I wrote in high school. Forgive the dated and adolescent nature of the questions.

The computer science teacher showed us an example of a computer dating form created by some company, a new idea at the time. I decided to run computer dating at the high school, so I created a questionnaire and passed it around. I did this for a few years; the 1981 form must have been the last, as that's the year I graduated. In those ancient-history days, my fellow students filled out the forms on paper and I would type the answers into the big teletype machines in the computer lab. I wrote a program to create a matching; I have no idea what algorithm I used, but I'm sure it was quite simplistic. I always got matched to the same person, but the pickup line "my computer dating program matched us up" didn't go very far. I did hear of one couple having one (and only one) date because of the program. There's a reason I became a theorist.
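The original program is lost, but a simplistic matcher along those lines might have looked like this: score every cross-group pair by how many questionnaire answers agree, then pair off greedily from the best score down. Everything below (names, answers, the scoring rule) is invented for illustration.

```python
from itertools import product

def greedy_match(group_a, group_b):
    """group_a, group_b: dicts mapping name -> tuple of answers.
    Returns (name_a, name_b, agreement_score) pairs."""
    # Score every cross-group pair by the number of matching answers.
    scores = sorted(
        ((sum(x == y for x, y in zip(a_ans, b_ans)), a, b)
         for (a, a_ans), (b, b_ans) in product(group_a.items(),
                                               group_b.items())),
        reverse=True)
    matched, pairs = set(), []
    for score, a, b in scores:       # greedily take the best unmatched pair
        if a not in matched and b not in matched:
            matched.update([a, b])
            pairs.append((a, b, score))
    return pairs

boys = {"al": (1, 2, 1), "bo": (2, 2, 2)}
girls = {"cy": (1, 2, 2), "di": (2, 1, 2)}
print(greedy_match(boys, girls))
```

Greedy matching on agreement counts is about as simplistic as it gets, which fits the "there's a reason I became a theorist" punchline: a stable-marriage or maximum-weight matching algorithm would do strictly better.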

My daughters tell me the whole scene has changed with traditional dating quite rare these days. "First you become boyfriend/girlfriend and then you date" which seems so backwards to me.

Now we have websites that use sophisticated machine learning algorithms to connect people. I met my wife the old fashioned way--at a wine and cheese party sponsored by a Boston-area Jewish singles group.
