Thursday, April 26, 2007

Idiot

CSP stands for COMPUTER SAVVY PERSON. 25 years ago the following happened:

BILL: I can't get my computer to work.
CSP: Just push this button you idiot.
BILL: Thanks! That works!

15 years ago the following happened:

BILL: My awk program did not compile and in trying to find the error I found an example from the awk manual which did not compile.
CSP: Just use gawk instead of awk you idiot.
BILL: Thanks! That works! I don't think it's idiotic not to know to use gawk.

5 years ago the following happened often:

BILL: My FILL-IN-SOFTWARE is not working, what's the problem?
CSP: We can find a work-around, but, by Rice's theorem, we can't find out what the problem really is.
BILL: Thanks!

Recently the following happened.

BILL: When I play a song on YouTube I'm not getting sound.
CSP: That will take a few hours to fix. We need to INSERT TECHNO BABBLE.

When they were done sound worked, but many other things did not. And there were some things which I would call odd, except that computers doing odd things is not odd. I fired up FIREFOX, and a short time later OPEN OFFICE opened mysteriously. There was a debate on this.

CSP1: I think FIREFOX is somehow linked to OPEN OFFICE.
CSP2: I think some weird combination of keys caused it.
BILL: Did I do something idiotic like accidentally clicking on some icon? (This turned out not to be the case.)

In the good old days computers were simpler. The staff would tell me I was an idiot and fix the problem! The staff now is 10 times better than they were then, and 100 times more polite, but computers have gotten VDW(4,2) times more complicated.

I miss being an idiot.

23 comments:

  1. I grew up more or less at the same rate as the personal computer grew up. I played with bizarre, incompatible things in the early eighties, and you had a good chance of understanding what the bits and pieces of your computer were.

    To children born today the gap is much larger, and computers can do much, much more than black-and-white Pong. For most people, computers are moving into the realm of Clarke's law:
    "Any sufficiently advanced technology is indistinguishable from magic."

  2. See also Communicating Sequential Processes, Constraint Satisfaction Problem.

  3. For the "Ramsey-theory idiots" out there, the technical translation of VDW(4,2) is: "I don't know exactly how much more complicated computers have gotten, but it's a whole hell of a lot!" :-)

  4. The gap is much larger for kids in absolute terms, but the immersion in computers enables them to navigate at least the software side with ease.

    I have trouble acquiring the coordination to make video-game characters do much more than jump over obstacles and occasionally whirl and punch-or-fire, whereas I've seen my kids (9 and 12) pick up the next video game quickly given their experience with previous ones. They're also good at getting around Windows, Mac OS X, and MS Word/art add-ons in general.

    I observe and theorize that today's teens and college (and now grad-school) population have higher "Gestalt", graphical, and navigational abilities, but on pain of a notable dropoff in "linear" reasoning ability (proofs, discrete math...). They do "E-Search not Research", and tend to read in tree form, not in book form. Darkly I wonder if this effect of growing up with computers and the Net may be more pernicious than TV was for "Why can't Johnny read?", because it is masked by other enhanced abilities and affects rigor and reasoning style directly.

  5. > I observe and theorize that today's teens and college (and now grad-school) population have higher "Gestalt", graphical, and navigational abilities, but on pain of a notable dropoff in "linear" reasoning ability (proofs, discrete math...). They do "E-Search not Research", and tend to read in tree form, not in book form. Darkly I wonder if this effect of growing up with computers and the Net may be more pernicious than TV was for "Why can't Johnny read?", because it is masked by other enhanced abilities and affects rigor and reasoning style directly.

    A dropoff in some abilities is not so bad provided that there is a substantial gain along some other dimensions and that gain is useful in some way.

    Additionally, a dropoff for some abilities may not be so bad provided that computers can be used as a replacement for those abilities.

  6. Thag here. Thag's day, everyone live cave, make fire good. These days, kid say "Twigs wet. Hard make fire. Kid go house, play X-Box." Make Thag sad.

    Thag go now, try thump stupid kid.

  7. Did anyone already register for FCRC? It's quite costly.

    Since Lance claimed you can attend any session on any day for which you are registered, the best strategy seems to be to register only for EC, which would also cover you completely for STOC and COMPLEXITY.

    A remarkable 3 for 1 deal!!!

  8. > A dropoff in some abilities is not so bad provided that there is a substantial gain along some other dimensions and that gain is useful in some way.

    > Additionally, a dropoff for some abilities may not be so bad provided that computers can be used as a replacement for those abilities.

    Sure. But this is not the case here.

  9. Anon 7: is that true? It sure isn't clear from the FCRC webpage...

  10. > Sure. But this is not the case here.

    Computers are probably better at proofs than the vast majority of people could ever be.

    It is easy for theoreticians to forget what the intellect of the average person is like.

  11. Anonymous says: Computers are probably better at proofs than the vast majority of people could ever be. It is easy for theoreticians to forget what the intellect of the average person is like.

    Which is to say, the "average person's intellect" is pretty much exactly like the intellect of the average theoretician!

    The book A=B (available on-line) has a wonderful introduction by Donald Knuth on this topic.

    The book A=B itself has many examples of results that would be thought wonderful ... if they had been derived by a human theoretician!

  12. > Which is to say, the "average person's intellect" is pretty much exactly like the intellect of the average theoretician!

    Not if you substitute "average successful theoretician".

    Just what is the average IQ of people who publish papers in STOC/FOCS?

  13. Ask the all-knowing Google: it provides you with an answer and does not call you names!

  14. thank you very very nice thank you very very much...

  15. The problem is that computers used to not work very well, so no one was surprised when they failed. It usually wasn't that hard to get them to kinda work again, so we felt stupid when we saw how easy it was. Now we tend to think that computers actually work, and are surprised when they don't. And when they don't, it's not so easy to get them working again.

    -- from one who can attest that Bill actually wasn't an idiot in the old days :)

  16. > I miss being an idiot.

    Perhaps you still are an idiot.

  17. We, the readers of Computational Complexity, are eagerly awaiting the next Podcast. When is it going to happen?

  18. > Perhaps you still are an idiot.

    Just want to mention that such stupid comments are stupid!!

  19. Complexity for idiots:

    As an outsider with some interest in complexity, I've got a few very basic/idiotic questions. I couldn't find answers in standard textbooks/papers (though I didn't try too hard):

    1. What is the best known running time for an algorithm which solves an NP-complete problem? I guess something like 'enumerate all possibilities' gives you ~2^n. Is it possible to do better, e.g. \alpha^n for \alpha < 2? Or 2^{n/2}, or even 2^{\sqrt{n}}?

    2. What is the best lower bound known for an algorithm which solves an NP-complete problem? I know there are good lower bounds for restricted models of computation (e.g. all kinds of circuits), but what is the state of the art for algorithms? Is it n^2? Some other power of n?

    3. Could the following situation be true: NP = P, but for every natural number k there exists a problem such that the running time of any algorithm solving it is at least n^k? Or, alternatively, does NP = P imply that there is some universal k (e.g. k = 17) such that every problem in P has an algorithm with running time O(n^k)?

    My hunch is that other mathematicians/scientists may have more questions at this level. (I no longer feel embarrassed about my ignorance after talking to a FEW physicists who still think that NP stands for 'Not Polynomial'.) Could this blog be a good place to discuss them? (If there's some other place suitable for this I'd be happy to know.)

    Thanx,
    Complexity layman

  20. In reverse order:

    3. The time hierarchy theorem (which should be in any book on complexity or computability theory) tells us in particular that for every k there are problems solvable in time n^k but not time n^{k-1}. This is one of the few things we know without any assumptions!

    2. I don't know the "best" answer to this question, though my favorite is the proof that clique requires super-polynomial-size *monotone* circuits. You need to look at individual problems here, for the reasons discussed below.

    1. I think you have to ask on a problem-by-problem basis. To see this, note that by padding an NP-complete problem with polynomially many 0s you can always construct an NP-complete problem solvable in time 2^{O(n^\epsilon)} for any positive \epsilon: brute force is exponential only in the original, much shorter, instance.
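
    To make question 1 concrete: for 3-SAT specifically, simple branching already beats the 2^n of brute-force enumeration. Below is a minimal Python sketch of the classic branching idea (an illustration only, nowhere near the best published bounds; the function name and the clause encoding are made up for this example). Branching on the first live clause fixes 1, 2, or 3 variables in its three branches, giving the recurrence T(n) <= T(n-1) + T(n-2) + T(n-3), i.e. roughly O(1.84^n).

    # Branching 3-SAT sketch. A clause is a list of nonzero ints,
    # DIMACS-style: literal 3 means x3, literal -3 means NOT x3.
    def solve_3sat(clauses, assignment=None):
        """Return a satisfying assignment (dict var -> bool) or None."""
        if assignment is None:
            assignment = {}
        # Simplify under the current partial assignment:
        # drop satisfied clauses, delete falsified literals.
        live = []
        for clause in clauses:
            remaining, satisfied = [], False
            for lit in clause:
                var = abs(lit)
                if var in assignment:
                    if assignment[var] == (lit > 0):
                        satisfied = True
                        break
                else:
                    remaining.append(lit)
            if satisfied:
                continue
            if not remaining:
                return None      # some clause is falsified: dead branch
            live.append(remaining)
        if not live:
            return assignment    # every clause is satisfied
        # Branch on the first live clause: make its i-th literal true
        # and all earlier literals false, so the (at most three)
        # branches fix 1, 2, and 3 new variables respectively.
        clause = live[0]
        for i, lit in enumerate(clause):
            trial = dict(assignment)
            for earlier in clause[:i]:
                trial[abs(earlier)] = (earlier < 0)
            trial[abs(lit)] = (lit > 0)
            result = solve_3sat(live, trial)
            if result is not None:
                return result
        return None

    For example, solve_3sat([[1, 2, 3], [-1, -2], [-3]]) returns {1: True, 2: False, 3: False}, while solve_3sat([[1], [-1]]) returns None.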

  21. Regarding point #2: There are lots of results that prove superpolynomial lower bounds for particular approaches to solving NP-complete problems.

    For example, there is a subfield called "proof complexity" which shows that for specific proof systems, proofs that certain CNFs are unsatisfiable require superpolynomial size. While this does not say anything directly about all possible satisfiability algorithms, it does give lower bounds on many approaches that are used in practice.

    See the survey "Lengths of Propositional Proofs" by Alasdair Urquhart.

  22. Complexity layman wrote:

    > 2. What is the best lower bound known for an algorithm which solves an NP-complete problem? I know there are good lower bounds for restricted models of computation (e.g. all kinds of circuits), but what is the state of the art for algorithms? Is it n^2? Some other power of n?


    To add to what responder 20 wrote about the Time Hierarchy theorem, one can also construct a language L that is in NTIME[n^k] but not in NTIME[o(n^k)] (for multitape TMs we may need to fuzz this up by a log n factor or so). Then L + SAT, i.e., {0x : x \in L} U {1x : x \in SAT}, is NP-complete and has a lower bound of time n^k for NTIME, hence also for DTIME.

    But most of the classic NP-complete problems actually live in nondeterministic linear time (NLIN). Yet of those, only barely-superlinear lower bounds (coming from NLIN != DLIN for multitape TMs) are known, and only for a few such problems!

    This and other concrete lower bound issues are treated in my Oct. 1993 SIGACT News survey article, which is still up to date to a depressing degree :-0!
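
    In display form, the construction above reads (with L taken from the nondeterministic time hierarchy theorem, so L \in NTIME[n^k] \setminus NTIME[o(n^k)]):

      L' = \{\, 0x : x \in L \,\} \cup \{\, 1x : x \in \mathrm{SAT} \,\}

    The SAT half makes L' NP-hard (map y to 1y); the L half forces the n^k nondeterministic lower bound, since x \mapsto 0x would otherwise place L itself in NTIME[o(n^k)]; and the deterministic lower bound follows because DTIME[t] \subseteq NTIME[t].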

  23. Thanks dudes,

    If I got it correctly, then by the time-hierarchy theorem it is guaranteed that there are problems in P which require, say, n^1000 time to solve, yet the proof is not constructive, and there are no 'natural' problems for which we know that they take more than, say, n^2 (when any algorithm is allowed, not just a restricted circuit model)?

    What then is the best gap known between finding a solution and verifying it? Can one prove that there exists a problem for which finding the solution takes n^1000, but verifying it takes n^2, or something like this?

    Complexity layman
