tag:blogger.com,1999:blog-37222332015-03-04T21:35:34.205-06:00Computational ComplexityComputational Complexity and other fun stuff in math and computer science from Lance Fortnow and Bill GasarchLance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.comBlogger2255125tag:blogger.com,1999:blog-3722233.post-11236355999729444012015-03-02T14:46:00.000-06:002015-03-02T14:46:29.827-06:00Leonard Nimoy (1931-2015)<div class="separator" style="clear: both; text-align: center;">
<a href="http://2.bp.blogspot.com/-vAZdHhfPsY4/VPTL0FZ2-MI/AAAAAAAA0UM/-I_nORUSvXU/s1600/spock.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://2.bp.blogspot.com/-vAZdHhfPsY4/VPTL0FZ2-MI/AAAAAAAA0UM/-I_nORUSvXU/s1600/spock.jpg" height="116" width="400" /></a></div>
<br />
Bill and I rarely write joint blog posts but with the loss of a great cultural icon we both had to have our say.<br />
<br />
<b>Bill:</b> Leonard Nimoy (Spock) died last week at the age of 83. DeForest Kelley (McCoy) passed away in 1999. William Shatner (Kirk) is still alive, though I note that he is four days older than Nimoy.<br />
<br />
Spock tried to always be logical. I wonder if an unemotional scientist would be a better or worse scientist.<br />
Does emotion drive our desire to learn things? Our choice of problems to work on? Our creativity?<br />
<br />
Did Star Trek (or its successors) inspire many to go into science? Hard to tell but I suspect yes. Did it inspire you?<br />
<br />
Their depiction of technology ranged from predictive (communicators are cell phones!) to awful (the episode 'The Ultimate Computer' wanted to show that humans are better than computers. It instead showed that humans are better than a malfunctioning killer computer. I think we knew that.) I think TV shows now hire science consultants to get things right (The Big Bang Theory seems to get lots of science right, though its view of academia is off), but in those days there was less concern for that.<br />
<br />
<b>Lance:</b> I'm too young to remember the original Star Trek series when it first aired but I did watch the series religiously during the 70's when a local TV station aired an episode every day, seeing every episode multiple times. The original Star Trek was a product of its time, using the future to reflect the current societal issues of the 60's. Later Star Trek movies and series seemed to have lost that premise.<br />
<br />
Every nerdy teenager, myself included, could relate to Spock with his logical exterior and his half-human emotional interior that he could usually suppress. Perhaps my favorite Spock episode was the penultimate "All Our Yesterdays" where Spock having been sent back in time takes on an earlier emotional state of the old Vulcans and falls in love.<br />
<br />
I did see Leonard Nimoy in person once, during a lecture at MIT in the 80's. He clearly relished being Spock and we all relished him.<br />
<br />
Goodbye Leonard. You have lived long and prospered and gone well beyond where any man has gone before.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com0tag:blogger.com,1999:blog-3722233.post-41517567124879906032015-02-26T16:22:00.000-06:002015-02-26T16:23:08.129-06:00Selecting the Correct OracleAfter my <a href="http://blog.computationalcomplexity.org/2015/02/and-winners-are.html">post last week</a> on the Complexity accepts, a friend of Shuichi Hirahara sent Shuichi an email saying that I was interested in his paper. Shuichi contacted me, sent me his paper, and we had a few good emails back and forth. He posted his paper <a href="http://arxiv.org/abs/1502.07258">Identifying an Honest EXP<sup>NP</sup> Oracle Among Many</a> on the arXiv yesterday.<br />
<div>
<br /></div>
<div>
Shuichi asks the following question: Given two oracles both claiming to compute a language L, figure out which oracle is correct. For which languages does there exist such a selector?</div>
<div>
<br /></div>
<div>
For deterministic polynomial-time selectors, every such L must sit in PSPACE and all PSPACE-complete languages have selectors. The question gets much more interesting if you allow probabilistic computation.</div>
<div>
<br /></div>
<div>
Shuichi shows that every language that has a probabilistically poly-time selector sits in S<sub>2</sub><sup>EXP</sup>, the exponential analogue of <a href="http://blog.computationalcomplexity.org/2002/08/complexity-class-of-week-s2p.html">S<sub>2</sub><sup>P</sup></a>. His main theorem shows that EXP<sup>NP</sup>-complete sets have this property. His proof is quite clever, using the EXP<sup>NP</sup>-complete problem of finding the lexicographically least witness of a succinctly-described exponential-size 3-SAT question. He uses PCP techniques to have each oracle produce a witness and then he has a clever way of doing binary search to find the least bit where these witnesses differ. I haven't checked all the details carefully but the proof ideas look good.</div>
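The PCP machinery is the heart of the proof, but the binary-search step is easy to illustrate on its own. A toy sketch in Python (not the paper's construction: direct prefix comparisons stand in for the oracle queries, and all names are mine):

```python
def first_difference(w1, w2):
    """Binary search for the least index where two equal-length
    claimed witnesses differ, using only prefix-equality tests
    (a toy stand-in for the oracle comparisons in the paper)."""
    assert len(w1) == len(w2) and w1 != w2
    lo, hi = 0, len(w1)  # invariant: prefixes of length lo agree, of length hi differ
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if w1[:mid] == w2[:mid]:
            lo = mid
        else:
            hi = mid
    return hi - 1  # the first differing position

print(first_difference("1011010", "1011110"))  # -> 4
```

This locates the first disagreement with O(log n) comparisons; the clever part of the actual proof is making such comparisons meaningful against untrusted oracles.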
<div>
<br /></div>
<div>
This still leaves an interesting gap between EXP<sup>NP</sup> and S<sub>2</sub><sup>EXP</sup>. Is there a selector for Promise-S<sub>2</sub><sup>EXP</sup>-complete languages?</div>
Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com1tag:blogger.com,1999:blog-3722233.post-6312709626999915252015-02-23T11:11:00.001-06:002015-02-23T11:11:29.875-06:00Eliminate the Postal ServiceIt's gotten very difficult to mail a letter these days. There are no mailboxes along my ten mile commute. Georgia Tech has contracted with an outside company to handle outgoing mail. To send a piece of mail requires filling out a form with an account number and many other universities have similar practices. Mail into or out of the university can tack on several days. I sent a piece of mail from Georgia Tech in Atlanta to the University of Pennsylvania in Philadelphia--two weeks from sender to recipient.<br />
<br />
Why do I have to send mail in this world with email, texts and instant messages? Some places require "original receipts". Some government agencies require forms sent by mail or fax, and I've given up trying to find a reliable fax machine with someone who knows how to work it. It's still not always easy to transfer money to another person or company with a physical check. I stopped using the Netflix DVD service because it lost its value when I had to make a special trip to mail the DVD back. It's easier to find a Redbox than a mailbox.<br />
<br />
Meanwhile most of the mail I receive is junk, or magazines, which look better on the iPad, or official letters that I have to scan to keep an electronic copy since they didn't email them to me. I do get the occasional birthday card or hand-written thank-you note, a nice Southern tradition, but we can live without it. USPS also does package delivery but that is often handled better by private providers such as UPS and FedEx.<br />
<br />
So what if we just eliminated the US Postal System, say with a three-year warning? There is nothing that can't be replaced by electronic means and a planned closing would force the various governments and businesses to make that final push. We'll reminisce about mail as we do about the telegram. But why keep an inferior technology alive? It's time to move on.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com8tag:blogger.com,1999:blog-3722233.post-19154488270828506202015-02-19T06:06:00.000-06:002015-02-20T09:27:05.206-06:00And the Winners Are...[Shortly after this post went up, <a href="http://acm-stoc.org/stoc2015/">STOC</a> announced their <a href="http://acm-stoc.org/stoc2015/acceptedpapers.html">accepted papers</a> as well]<br />
<br />
I always like that the announcement of <a href="http://www.computationalcomplexity.org/Archive/2015/accept.html">accepted papers</a> for the <a href="http://www.computationalcomplexity.org/">Computational Complexity Conference</a> happens around the time of the Academy Awards. These acceptances are the Oscars for our community that shares its name with this conference and the blog.<br />
<br />
The title that interests me is <i>Identifying an honest EXP^NP oracle among many </i>by Shuichi Hirahara since it seems closely related to some of my own research. Not only can I not find the paper online, I can't even find the author's email. Shuichi, if you are reading this, please send me a copy of your paper.<br />
<br />
Luckily not all papers are so elusive. Murray and Williams <a href="http://eccc.hpi-web.de/report/2014/164/">show</a> that proving the NP-hardness of computing the circuit complexity would require proving real complexity class separation results. Oliveira and Santhanam <a href="http://eccc.hpi-web.de/report/2014/173/">give</a> tight lower bounds on how much you can compress majority so that you can compute it with constant-depth circuits. A different Oliveira has two papers in the conference, a solely authored paper <a href="http://www.cs.princeton.edu/~rmo/papers/small-depth-factors.pdf">showing</a> that polynomials of low individual degree with small low-depth arithmetic circuits have factors similarly computable, and a <a href="http://arxiv.org/abs/1411.7492">paper</a> with Shpilka and Volk on hitting sets for bounded-depth multilinear formulas. A hitting set is a small, easily and deterministically computable set that contains, for every such arithmetic circuit, an input with a non-zero output.<br />
<br />
There are many more <a href="http://www.computationalcomplexity.org/Archive/2015/accept.html">interesting papers</a> and you can see them all at the conference in Portland, this year part of the <a href="http://fcrc.acm.org/">Federated Computing Research Conference</a>, which includes STOC, SPAA and EC, which now stands for Economics and Computation. My tip: <a href="http://fcrc.acm.org/travel.cfm">book your hotels</a> now; they fill up fast.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com5tag:blogger.com,1999:blog-3722233.post-73410875245873282762015-02-17T09:43:00.003-06:002015-02-17T09:44:44.698-06:00Stephen Colbert, Jon Stewart, Andrew Sullivan, William GasarchStephen Colbert is leaving the Colbert Report<br />
<br />
Jon Stewart is leaving the Daily Show<br />
<br />
Andrew Sullivan is no longer blogging<br />
<br />
Bill Gasarch has resigned as SIGACT News Book Review Editor<br />
<br />
Where will Gasarch get his news from?<br />
<br />
Where will Colbert-Stewart-Sullivan get their reviews-of-theory-books from?<br />
<br />
<br />
Why am I stepping down? I've been SIGACT News Book Review Editor for 17 years (just as long as Jon Stewart has been doing The Daily Show). That's enough (more than enough?) time. I want to spend more time with my books and my family. I will be on sabbatical next year, so I am generally cutting down on my obligations.<br />
<br />
I have enjoyed it, gotten to know some publishers, and gotten more free books than I know what to do with. I haven't paid for a math or CS book in... probably 17 years.<br />
<br />
While writing reviews is great, figuring out who should review other books, getting them the books, getting the reviews from them, and editing it all into a column four times a year can get to be routine. Though I DO like reading the reviews.<br />
<br />
Who will take over? I asked Lance who would be good and he said `someone old who still reads books'- so I asked Fred Green who agreed to take the job. I then had to get my files (of reviews, of who-owes-me-reviews, of which-books-do-I-want-reviewed, etc) in order to email to him. The usual- I wish I had cleaned up my files years ago so I could benefit from it.<br />
<br />
The main PLUS of the job was that I got to read lots of books and learn about some fields. As someone who would rather read a good book rather than produce a bad paper, the job suited me.<br />
<br />
The main NEGATIVE of the job was seeing so many books that I WANTED to review but had to let leave my office for someone else to review, either because I didn't have time (and I usually KNEW that) or because I found myself unable to (gee, that book is harder than I thought!).<br />
<br />
The biggest change that Fred will encounter is e-books. Will publishers want to send out free e-books instead of hardcopy? Will reviewers want hardcopy? This is of course a very tiny part of a more profound conversation about what will happen to the book market once e-books are more common.<br />
<br />
<br />GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com4tag:blogger.com,1999:blog-3722233.post-4851228262441711002015-02-11T06:23:00.000-06:002015-02-11T06:26:46.264-06:00Richard Hamming Centenary<div class="separator" style="clear: both; text-align: center;">
</div>
<a href="http://www-history.mcs.st-andrews.ac.uk/Biographies/Hamming.html">Richard Hamming</a> was born a hundred years ago today in Chicago. He worked on the Manhattan Project during World War II, moved to Bell Labs after the war and started working with Claude Shannon and John Tukey. It was there that he wrote his seminal 1950 paper <a href="http://dx.doi.org/10.1002/j.1538-7305.1950.tb00463.x">Error detecting and error correcting codes</a>.<br />
<div>
<br /></div>
<div>
Suppose you send a string of bits where a bit might have been flipped during the transmission. You can add an extra parity bit at the end that can be used to detect errors. What if you wanted to correct the error? Richard Hamming developed an <a href="http://en.wikipedia.org/wiki/Hamming(7,4)">error-correcting code</a> (now called the Hamming code) that encodes 4 bits into 16 codewords of 7 bits each such that every two codewords differ in at least three bits (which we now call the Hamming distance). </div>
<div>
<br /></div>
<pre>0000000 1101001 0101010 1000011
1001100 0100101 1100110 0001111
1110000 0011001 1011010 0110011
0111100 1010101 0010110 1111111
</pre>
<div>
If there is a single error then there is a unique codeword within one bit of the damaged string. By having an error-correcting code you can continue a process instead of just halting when you detect a bad codeword.</div>
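A minimal sketch of encoding and single-error correction for the Hamming(7,4) code (this uses the textbook parity-bit layout, so the codewords may be a reordering of bit positions relative to the table above):

```python
def encode(d):
    """Encode 4 data bits into a Hamming(7,4) codeword.
    Bit layout (1-indexed): p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def correct(c):
    """Return a copy of c with any single flipped bit corrected.
    The syndrome, read as a binary number, is the 1-indexed
    position of the flipped bit (0 means no error)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3
    if pos:
        c[pos - 1] ^= 1
    return c

word = encode([1, 0, 1, 1])
garbled = list(word)
garbled[5] ^= 1                  # flip one bit in transit
assert correct(garbled) == word  # the error is found and fixed
```

Any single-bit error produces a nonzero syndrome pointing directly at the flipped position, which is exactly the "unique codeword within one bit" property above.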
<div>
<br /></div>
<div>
The Hamming code is a linear code: the bitwise sum mod 2 of any two codewords is another codeword. This linear idea led to many more sophisticated codes which have had many applications in computer science, practical and theoretical.</div>
<div>
<br /></div>
<div>
Hamming received several awards notably the <a href="http://amturing.acm.org/award_winners/hamming_1000652.cfm">1968 ACM Turing Award</a> and the inaugural <a href="http://www.ieee.org/about/awards/medals/hamming.html">IEEE Richard W. Hamming Medal</a> in 1988 given for exceptional contributions to information sciences, systems, and technology. Hamming <a href="http://www.nytimes.com/1998/01/11/business/richard-hamming-82-dies-pioneer-in-digital-technology.html">passed away</a> in 1998. </div>
Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com2tag:blogger.com,1999:blog-3722233.post-79399640390881011062015-02-08T21:24:00.000-06:002015-02-08T21:24:28.426-06:00Pros and Cons of students being wikiheadsA <i>wikihead</i> is someone who learns things from the web (not necessarily Wikipedia) but either learns things that are not correct or misinterprets them. I've also heard the term <i>webhead</i> but that's ambiguous since it also refers to fans of Spider-Man.<br />
<br />
I like to end the first lecture of Discrete Math by talking about SAT and asking the students if they think it can be solved <i>much faster than the obvious 2^n algorithm.</i> This semester in honors DM I got the usual heuristics (look for a contradiction!) that may well help but certainly won't get down to much better than 2^n in all cases. This leads to nice discussions of worst-case vs avg-case and formal vs what-works-in-practice.<br />
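For concreteness, the obvious 2^n algorithm the students are trying to beat fits in a few lines (a sketch; clauses are lists of signed integers in the usual DIMACS style):

```python
from itertools import product

def brute_force_sat(clauses, n):
    """Try all 2^n assignments; a clause is a list of nonzero ints,
    where k means variable k and -k its negation."""
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return bits
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(brute_force_sat([[1, 2], [-1, 3], [-2, -3]], 3))  # -> (False, True, False)
```

Heuristics like unit propagation prune this search dramatically in practice, but no known algorithm beats 2^n by more than a constant in the exponent in the worst case.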
<br />
I also got the following answers:<br />
<br />
SAT cannot be done better than 2^n since P ≠ NP.<br />
<br />
and<br />
<br />
SAT can be done in O(n) time with a Quantum Computer.<br />
<br />
They both made their assertions boldly! I gently corrected them. They had both <i>read it on the web.</i><br />
<br />
I suspect that the P ≠ NP person read something that was correct (perhaps a survey that said 80% of all theorists THINK P ≠ NP) and misconstrued it, while the second person read something that was just wrong (perhaps one of those <i>by the many worlds quantum theory a quantum computer can look at all possibilities at the same time</i> people).<br />
<br />
SO- they went and looked up stuff on their own (YEAH) but didn't quite understand it (BOO)<br />
or read incorrect things (BOO). But I will correct them (YEAH). But there are other people who will never get corrected (BOO). But there are others who will get interested in these things because of the false things they read (YEAH?). The quantum person might either NOT go into quantum computing since he thinks it's all bogus now, or GO INTO it since he is now curious about what is really going on.<br />
<br />
SO the real question is: if people get excited about math or science for the wrong reasons, is that good? Bad? Do you know of examples where incorrect but exciting science writing led to someone doing real science?<br />
<br />
<br />
<br />
<br />
<br />GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com4tag:blogger.com,1999:blog-3722233.post-43999669637588223712015-02-05T06:43:00.000-06:002015-02-05T06:43:04.226-06:00Computers Turn Good and EvilEntertainment Weekly proclaimed 2015 the year that <a href="http://www.ew.com/article/2014/12/22/aventers-ultron-terminator-genisys-artificial-intelligence">artificial intelligence will rule the (movie) world</a> with talk of the <i>Terminator</i> reboot, the new Avengers movie <i><a href="http://www.imdb.com/title/tt2395427/">Age of Ultron</a></i>, where Ultron is an attempt at a good AI robot turned evil, and <i><a href="http://www.sonypictures.com/movies/chappie/">Chappie</a></i> who saves the world. And then there is <a href="http://www.imdb.com/title/tt0470752/">Ex Machina</a>, where Domhnall Gleeson "must conduct a Turing test, the complex analysis measuring a machine’s ability to demonstrate intelligent human behavior, while wrestling with his own conflicted romantic longing for the humanoid."<br />
<br />
Let's not forget the return of the <i>Star Wars</i> droids and the hacker movie <i>Blackhat</i> that has already come and gone. On TV we have new computer-based procedurals, one for adults (<i><a href="http://www.cbs.com/shows/csi-cyber/">CSI:Cyber</a></i>) and one for kids (<i><a href="http://www.amazon.com/Buddy-Tech-Detective-HD/dp/B00RSGIAEW">Buddy: Tech Detective</a></i>).<br />
<br />
With Wired proclaiming <a href="http://www.wired.com/2015/01/ai-arrived-really-worries-worlds-brightest-minds/">AI Has Arrived, and That Really Worries the World’s Brightest Minds</a> and Uber <a href="http://bits.blogs.nytimes.com/2015/02/02/uber-to-open-center-for-research-on-self-driving-cars/">investing in technology</a> to eliminate their drivers, what is the average person to think? Let me quote the famous philosopher Aaron Rodgers and say "Relax". We still control the technology, don't we?Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com0tag:blogger.com,1999:blog-3722233.post-32716008087948391932015-02-02T10:35:00.001-06:002015-02-02T10:35:15.493-06:00Less elegant proofs that (2n)!/n!n! is an integer(A conversation between Bill (the writer of this post) and Olivia (14-year-old daughter of a friend).) All of the math involved is <a href="http://www.cs.umd.edu/~gasarch/BLOGPAPERS/nchoosek.pdf">here</a>.<br />
<br />
Bill: Do you know what 10! is?<br />
<br />
Olivia: Yes, when I turned 10 I yelled out ``I AM 10!''<br />
<br />
Bill: I mean it in a different way. In math 10! is 10 x 9 x 8 x 7 x 6 x 5 x 4 x 3 x 2.<br />
<br />
Olivia: Okay. So what?<br />
<br />
Bill: Do you think that (10)!/5!5! is an integer?<br />
<br />
Olivia: No but I'll try it. (She does) Well pierce my ears and call me drafty, it is!<br />
<br />
Bill: Do you think that (100)!/50!50! is an integer?<br />
<br />
Olivia: Fool me once shame on you, Fool me twice... uh, uh, We don't get fooled again was a great song by the Who!<br />
<br />
Bill: Who? Never mind, I'll save a who's-on-first-type blog for another day.<br />
<br />
Olivia: What?<br />
<br />
Bill: He's on second, but never mind that. It turns out that (1) (100!)/50!50! is an integer, and (2) I can prove it without actually calculating it. (Bill then goes through combinatorics and shows that n!/k!(n-k)! solves a problem in combinatorics that must have an integer solution.)<br />
<br />
Olivia: You call that a proof? That's INSANE! You can't just solve a problem that must have an integer solution and turn that into a proof that the answer is an integer. It's unnatural. It is counter to the laws of God and Man!<br />
<br />
Inspired by Olivia I came up with a LESS elegant proof that (2n)!/n!n! is always an integer. Inspired by that I also came up with a LESS elegant proof that the Catalan numbers are integers. See the link above for all proofs.<br />
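In the same spirit, here is a mechanical way to see the integrality: build C(n,k) by Pascal's rule C(n,k) = C(n-1,k-1) + C(n-1,k), which uses only integer additions, so (2n)!/n!n! = C(2n,n) is an integer by construction (a sketch; the function name is mine):

```python
from math import factorial

def binomial(n, k):
    """C(n, k) via Pascal's rule: each entry is a sum of two
    integers, so integrality is built in."""
    row = [1]
    for _ in range(n):
        row = [1] + [row[i] + row[i + 1] for i in range(len(row) - 1)] + [1]
    return row[k]

for n in range(1, 8):
    # matches the factorial formula...
    assert binomial(2 * n, n) == factorial(2 * n) // (factorial(n) ** 2)
    # ...and the Catalan number C(2n,n)/(n+1) divides evenly too
    assert binomial(2 * n, n) % (n + 1) == 0

print(binomial(10, 5))  # -> 252
```

Of course this only checks finitely many cases; the induction behind Pascal's rule is what turns the check into a (less elegant) proof.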
<br />
But realize- WE think it is OBVIOUS that (2n)!/n!n! is an integer (and for that matter n!/k!(n-k)! and the Catalan numbers) and we are right about that--- but there was a time when we would have reacted like Olivia.<br />
<br />
I've seen 5th graders who were sure there must be a fraction whose square is 2 since (1) I wouldn't have asked them if there wasn't, and (2) the concept of a problem not having a solution was alien to them. In an earlier grade the concept of having a problem whose answer was negative or a fraction was alien to them.<br />
<br />
When teaching students and daughters of friends we should be aware that what we call obvious comes from years of exposure that they have not had.<br />
<br />
When Olivia turns 15 will she say `I AM 15!' More accurate would be `I am 15!.'GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com4tag:blogger.com,1999:blog-3722233.post-5638457133720775832015-01-29T06:53:00.001-06:002015-01-29T06:53:35.291-06:00Reusing Data from PrivacyVitaly Feldman gave a talk at Georgia Tech earlier this week on his recent paper <a href="http://arxiv.org/abs/1411.2664">Preserving Statistical Validity in Adaptive Data Analysis</a> with Cynthia Dwork, Moritz Hardt, Toniann Pitassi, Omer Reingold and Aaron Roth. This work looks at the problem of reuse of the cross-validation data in statistical inference/machine learning using tools from differential privacy.<br />
<br />
Many machine learning algorithms have a parameter that specifies the generality of the model, for example the number of clusters in a clustering algorithm. If the model is too simple it cannot capture the full complexity of what it is learning. If the model is too general it may overfit, fitting the vagaries of this particular data too closely.<br />
<br />
One way to tune the parameters is by cross-validation, running the algorithm on fresh data to see how well it performs. However if you always cross-validate with the same data you may end up overfitting the cross-validation data.<br />
<br />
Feldman's paper shows how to reuse the cross-validation data safely. They show how to get an exponential (in the dimension of the data) number of adaptive uses of the same data without significant degradation. Unfortunately their algorithm takes exponential time but sometimes time is much cheaper than data. They also have an efficient algorithm that allows a quadratic amount of reuse.<br />
<br />
The intuition and proof ideas come from differential privacy where one wants to make it hard to infer individual information from multiple database queries. A standard approach is to add some noise in the responses and the same idea is used by the authors in this paper.<br />
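To make "add some noise" concrete, here is a sketch of the standard Laplace mechanism for a counting query, the basic differential-privacy primitive (illustrative only, not the algorithm of the paper; names and parameter choices are mine):

```python
import random

def private_count(true_count, sensitivity=1.0, epsilon=0.5):
    """Answer a counting query with Laplace(sensitivity/epsilon) noise;
    a difference of two iid exponentials is a Laplace sample."""
    scale = sensitivity / epsilon
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_count + noise

random.seed(0)
answers = [private_count(42) for _ in range(1000)]
print(sum(answers) / len(answers))  # close to 42 on average
```

Each individual answer is noisy enough to hide any single person's data, yet averages remain accurate, which is the same tension the paper exploits for safe data reuse.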
<br />
All of the above is pretty simplified and you should <a href="http://arxiv.org/abs/1411.2664">read the paper</a> for details. This is one of my favorite kinds of paper where ideas developed for one domain (differential privacy) have surprising applications in a seemingly different one (cross-validation).<br />
<br />
<br />Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com1tag:blogger.com,1999:blog-3722233.post-10221749790728996102015-01-26T14:43:00.002-06:002015-01-26T14:43:42.125-06:00A nice problem from a Romanian Math Problem Book<br />(All of the math for this problem is <a href="http://www.cs.umd.edu/~gasarch/BLOGPAPERS/recipsq.pdf">here</a>)<br />My Discrete Math Honors TA Ioana showed me a Romanian Math Problem book<br />(She is Romanian) and told me the following problem:<br />
<br />
<br />
(All a<sub>i</sub> in this post are assumed to be natural numbers)<br />
<br />Show that for all n ≥ 6 there exists (a<sub>1</sub>,...,a<sub>n</sub>) such that 1/a<sub>1</sub><sup>2</sup> + ... + 1/a<sub>n</sub><sup>2</sup> = 1.<br />
<br />
(sum of reciprocals squared)<br />
<br />Normally my curiosity exceeds my ego and I would look up the answer.<br />But it was in Romanian! Normally I would ask her to read the answer to me.<br />But I was going out of town! Normally I would look up the answer on the<br />web. But this is not the kind of thing the web is good at!<br />
<br />So I did the obvious thing- worked on it while watching Homeland Season 2,<br />the first four episodes. And I solved it! Either try to solve it yourself<br />OR go to the link.<br /><br />Some possibly open questions come out of this:<br /><br />1) I also prove that, for all k there is an n<sub>0</sub>=n<sub>0</sub>(k) such that<br /><br />for all n ≥ n<sub>0</sub> there exists (a<sub>1</sub>,...,a<sub>n</sub>) such that 1/a<sub>1</sub><sup>k</sup> + ... + 1/a<sub>n</sub><sup>k</sup> = 1.<br />
<br />
<br />(sum of reciprocal kth powers)<br />
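A small exhaustive search confirms the base case by finding a witness for n=6, k=2 (a sketch, using exact rational arithmetic; the function name and search bounds are mine):

```python
from fractions import Fraction

def find_solution(n, target=Fraction(1), min_a=2, max_a=12):
    """Search for a nondecreasing tuple (a_1,...,a_n) of naturals
    with 1/a_1^2 + ... + 1/a_n^2 == target."""
    if n == 0:
        return [] if target == 0 else None
    for a in range(min_a, max_a + 1):
        term = Fraction(1, a * a)
        if n * term < target:        # even n copies of 1/a^2 fall short
            break
        if term <= target:
            rest = find_solution(n - 1, target - term, a, max_a)
            if rest is not None:
                return [a] + rest
    return None

print(find_solution(6))  # -> [2, 2, 2, 3, 3, 6]
```

Indeed 3/4 + 2/9 + 1/36 = 27/36 + 8/36 + 1/36 = 1. The same search with floats would risk rounding errors, hence the exact fractions.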
<br />We showed above that n<sub>0</sub>(2) ≤ 6, it's easy to show n<sub>0</sub>(2) ≥ 6, so n<sub>0</sub>(2)=6.<br /><br />Obtain upper and lower bounds on n<sub>0</sub>(k).<br /><br />2) What is the complexity of the following problem:<br />
<br />Given k and n, find out if there exists (a<sub>1</sub>,...,a<sub>n</sub>) such that 1/a<sub>1</sub><sup>k</sup> + ... + 1/a<sub>n</sub><sup>k</sup> = 1.<br /><br />If so then find the values (a<sub>1</sub>,...,a<sub>n</sub>).<br /><br />(We know of an example where the Greedy method does not work.)<br /><br />3) What is the complexity of the following problem: Just as above<br />
but now we want to know HOW MANY solutions.<br />
<br />4) Meta question: How hard are these questions? The original one was<br />on the level of a high school or college math competition. The rest<br />might be easy or hard. I suspect that getting an exact formula for<br />n<sub>0</sub>(k) is hard. I also suspect that proving that this is hard<br />
will be hard.<br /><br /><br />GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com5tag:blogger.com,1999:blog-3722233.post-67419481688014865252015-01-22T07:03:00.002-06:002015-01-22T07:03:55.369-06:00There Should be an AlgorithmMy high school daughter Molly was reading her Kindle and said "You know how you can choose a word and the Kindle will give you a definition. There should be an algorithm that chooses the right definition to display depending on the context". She was reading a book that took place in the 60's that referred to a meter man. This was not, as the first definition of "meter" would indicate, a 39-inch-tall male. A meter man is the person who records the electric or gas meter at your house. Today we would use a gender-neutral term like "meter reader" if technology hadn't made them obsolete.<br />
<br />
Molly hit upon a very difficult natural language processing challenge known as <a href="http://en.wikipedia.org/wiki/Word-sense_disambiguation">word-sense disambiguation</a>, with the most successful approaches using supervised machine learning techniques. If anyone from Amazon is reading this post, the Kindle dictionary would make an excellent application of word-sense disambiguation. You don't need perfection; anything better than choosing the first definition would be welcome. Small tweaks to the user interface where the reader can indicate the appropriate definition would give more labelled data to produce better algorithms.<br />
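A classic baseline for what Molly wants is the simplified Lesk algorithm: pick the sense whose dictionary gloss shares the most words with the sentence around the chosen word. A sketch (the glosses for "meter" below are made up for illustration):

```python
def lesk(context_sentence, senses):
    """Simplified Lesk: choose the sense whose gloss overlaps most
    with the context. `senses` maps a sense label to its gloss."""
    context = set(context_sentence.lower().split())

    def overlap(gloss):
        return len(context & set(gloss.lower().split()))

    return max(senses, key=lambda s: overlap(senses[s]))

senses = {  # toy glosses, invented for this example
    "unit":   "a metric unit of length equal to about 39 inches",
    "device": "a device that records how much gas or electricity a house uses",
}
print(lesk("the meter man came to the house to record the gas usage", senses))
# prints "device"
```

Real systems use stemming, stop-word removal and supervised training on labelled corpora, but even this word-overlap heuristic usually beats "always pick the first definition."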
<br />
And to Molly: Keep saying "There should be an algorithm". Someday there might be. Someday you might be the one to discover it.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com7tag:blogger.com,1999:blog-3722233.post-6345130350309092172015-01-19T08:33:00.003-06:002015-01-19T08:34:39.871-06:00The two definitions of Chomsky Normal FormI have eight textbooks on Formal Lang Theory on my desk. Six of them define a CFG to be in <i>Chomsky Normal Form</i> if every production is of the form either A-->BC or A--> σ (σ a single letter). With that definition one can show that every e-free grammar can be put in Chomsky Normal Form, and using that, show that CFL ⊆ P. There is a very minor issue of what to do if the CFL has e in it.<br />
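The reason Chomsky Normal Form gives CFL ⊆ P is the CYK dynamic program: for a CNF grammar, membership of a string of length n is decided with O(n^3) table updates. A sketch using the first definition above (note S appears on a right-hand side), with a toy grammar for { a^n b^n : n ≥ 1 }:

```python
def cyk(grammar, word):
    """CYK membership test for a CNF grammar.
    grammar: dict mapping a nonterminal to a list of bodies, each
    either a terminal like 'a' or a pair ('B', 'C') of nonterminals."""
    n = len(word)
    # table[i][j] = set of nonterminals deriving word[i:i+j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        for head, bodies in grammar.items():
            if ch in bodies:
                table[i][0].add(head)
    for length in range(2, n + 1):            # substring length
        for i in range(n - length + 1):       # start position
            for split in range(1, length):    # split point
                left = table[i][split - 1]
                right = table[i + split][length - split - 1]
                for head, bodies in grammar.items():
                    for body in bodies:
                        if isinstance(body, tuple) and body[0] in left and body[1] in right:
                            table[i][length - 1].add(head)
    return 'S' in table[0][n - 1]

# CNF grammar for { a^n b^n : n >= 1 }:
# S -> AB | AX ; X -> SB ; A -> a ; B -> b
g = {'S': [('A', 'B'), ('A', 'X')], 'X': [('S', 'B')], 'A': ['a'], 'B': ['b']}
print(cyk(g, 'aabb'), cyk(g, 'abab'))  # -> True False
```

The two-nonterminal restriction of CNF is exactly what makes the "try every split point" step work.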
<br />
Two of the books (Sipser and Floyd-Beigel) define a CFG to be in <i>Chomsky Normal Form </i>if every rule is A-->BC or A--> σ OR S-->e and also that S cannot appear as one of the two nonterminals on the right-hand side of a production of the form A-->BC. With this definition you can get that every CFL (even those with e in them) has a grammar in Chomsky Normal Form.<br />
<br />
The definitions are NOT equivalent mathematically, but they are equivalent in spirit and both aim towards the same thing: getting all CFL's in P. (That's what I use them for. What did Chomsky use them for?)<br />
<br />
The first definition is correct historically- it's what Chomsky used (I am assuming this since it's in the earlier books). The second one could be argued to be better since when you are done you don't have to deal with the e. I still like the first one, but it's six of one, half a dozen of the other. One person's floor function is another person's ceiling function.<br />
<br />
I don't have a strong opinion about any of this, but I will say that if you use the second definition then you should at least note that there is another definition that is used for the same purpose. Perhaps make a homework out of this.<br />
<br />
There are many other cases where definitions change and the new one leads to more elegant theorems and a better viewpoint than the old one. There are even cases where the new definition IS equivalent to the old one but is better. IMHO the (∃) definition of NP is better than the definition that uses nondeterministic Poly Time TM's since this leads to the definition of the poly hierarchy. One drawback- if you want to define NL then I think you DO need nondeterministic TM's.<br />
<br />
The only problem I see with changing definitions is if you are reading an old paper and don't quite know which definition they are using. <br />
<br />
What examples of a definition changing (or an equivalent one being more used) do you approve of? Disapprove of?<br />
<br />
<br />GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com3tag:blogger.com,1999:blog-3722233.post-30245621173057663562015-01-15T12:34:00.000-06:002015-01-16T20:39:56.811-06:00The Impact Factor DiseaseThe Institute for Scientific Information (ISI) was founded in 1960 to help index the ever-growing collection of scientific journals. The founder of ISI, Eugene Garfield, developed a simple impact factor to give a rough estimate of quality and help highlight the more important journals. Roughly the <a href="http://en.wikipedia.org/wiki/Impact_factor">impact factor</a> of a journal in year x is the average number of citations each article from years x-1 and x-2 receives in year x.<br />
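As a formula the impact factor is just a ratio; a sketch with hypothetical numbers:

```python
def impact_factor(citations_in_x, articles_x1, articles_x2):
    """Impact factor for year x: citations received in year x by
    articles published in years x-1 and x-2, divided by the number
    of such articles."""
    return citations_in_x / (articles_x1 + articles_x2)

# hypothetical journal: 80 articles over the two prior years,
# 200 citations to them this year
print(impact_factor(200, 38, 42))  # -> 2.5
```

The simplicity is part of the problem: a single number over a two-year window says little about the long-term influence of any individual paper.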
<br />
Thomson Scientific bought out ISI in 1992 and turned the data collection into a business. Impact factors are not only being used to measure the quality of journals but of authors (i.e. researchers) and institutions as well. In many parts of the world a person's research quality is being measured strongly or sometimes solely by their ability to publish in high impact factor journals.<br />
<br />
This is bad news for computer science since conference proceedings in our field have historically had more prestige than journals. We mitigate the ISI factors pretty well in the US, but in many other countries this puts computer science at a disadvantage. The need for impact-factor publications is one of the reasons conferences are experimenting with a hybrid model.<br />
<br />
In a new trend I get many announcements of journals highlighting their ISI impact factor, most of them very general journals previously unknown to me. Our old friends WSEAS say "The ISI Journals (with Impact Factor from Thomson Reuters) that publish the accepted papers from our Conferences are now 32" in the subject line of their email.<br />
<br />
It's the vanity journal with a twist: publish with us and you'll raise your official numerical prestige. So we get a set of journals whose purpose is to raise the value of researchers who should have their value lowered by publishing in these journals.<br />
<br />
Raise your research standing the old-fashioned way: Do great research that gets published in the best venues. The numbers will take care of themselves.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com6tag:blogger.com,1999:blog-3722233.post-65731557766089818882015-01-13T07:23:00.002-06:002015-01-13T07:23:31.664-06:00Barriers for Matrix Mult Lower Bounds<br />
Matrix Mult:<br />
<br />
The usual algorithm is O(n^3).<br />
<br />
Strassen surprised people by showing an O(n^{2.81}) algorithm. (Now it's so well known that it's not surprising. I try to teach it to people who don't already know it's true so they'll be surprised.)<br />
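For the curious, here is a minimal pure-Python sketch of Strassen's recursion (assuming n is a power of 2; a real implementation would fall back to the ordinary algorithm below some cutoff):

```python
def add(X, Y): return [[a + b for a, b in zip(r, s)] for r, s in zip(X, Y)]
def sub(X, Y): return [[a - b for a, b in zip(r, s)] for r, s in zip(X, Y)]

def split(M):
    """Split an n x n matrix (n even) into four n/2 x n/2 quadrants."""
    h = len(M) // 2
    return ([r[:h] for r in M[:h]], [r[h:] for r in M[:h]],
            [r[:h] for r in M[h:]], [r[h:] for r in M[h:]])

def strassen(A, B):
    """Multiply n x n matrices (n a power of 2) using 7 recursive
    multiplications instead of 8, giving O(n^{log2 7}) ~ O(n^{2.81})."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    A11, A12, A21, A22 = split(A)
    B11, B12, B21, B22 = split(B)
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))
    C11 = add(sub(add(M1, M4), M5), M7)   # M1 + M4 - M5 + M7
    C12 = add(M3, M5)                     # M3 + M5
    C21 = add(M2, M4)                     # M2 + M4
    C22 = add(sub(add(M1, M3), M2), M6)   # M1 - M2 + M3 + M6
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```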
<br />
Over the years the exponent came down, though there was a long gap between Coppersmith and Winograd's exponent of 2.3755 and the recent improvements.<br />
<br />
Are these improvements going to lead to an exponent of 2+epsilon?<br />
<br />
Alas, the answer is probably no :-(<br />
<br />
In <a href="http://eccc.hpi-web.de/report/2014/154/">Fast Matrix Mult: Limitations of the Laser Method</a> Ambainis, Filmus, and Le Gall show that the methods that lead to the algorithms above will NOT lead to an exp of 2+epsilon.<br />
<br />
How to react to news like this? Barriers should not make us give up; however, they should make us look for new techniques, perhaps guided by the barrier. I would like to think that if you know where NOT to look then it helps you know where TO look.<br />
<br />
There have been barriers that were broken (IP=PSPACE didn't relativize, the recent 2-prover PIR used techniques NOT covered by the barrier result, and the Erdos distance problem was proven after it was known that the current techniques `wouldn't work'). I am sure there are other examples; feel free to leave some in the comments.<br />
<br />
Will this result help guide researchers in the right direction? Let's hope so!<br />
<br />
Of course, it's possible that 2+epsilon is false and the barrier result is a first step in that direction.<br />
<br />
Which one am I rooting for? Neither- I am rooting for finding out!<br />
<br />
bill g.GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com8tag:blogger.com,1999:blog-3722233.post-8500876548183166032015-01-08T07:28:00.000-06:002015-01-08T07:28:08.648-06:00The History of the History of the History of Computer ScienceIn 2007, the science historian Martin Campbell-Kelly wrote an article <a href="http://www.thecorememory.com/THTHS.pdf">The History of the History of Software</a>, where he writes about how he initially wrote histories of the technical aspects of computer software back in the 70's but now he's evolved into writing more about the applications and implications of software technologies. He argues that the whole field of the history of software has moved in the same directions.<br />
<div>
<br /></div>
<div>
Donald Knuth made an emotional argument against this trend last May in his Stanford Kailath lecture <a href="http://kailathlecture.stanford.edu/2014KailathLecture.html">Let's Not Dumb Down the History of Computer Science</a>. If you can find an hour, this is a video well worth watching.<br />
<br /></div>
<!--
<center>
<iframe allowfullscreen="" frameborder="0" height="315" src="//www.youtube.com/embed/gAXdDEQveKw?rel=0" width="420"></iframe></center>
-->
In the January CACM Thomas Haigh gave his thoughts in <a href="http://cacm.acm.org/magazines/2015/1/181633-the-tears-of-donald-knuth/fulltext">The Tears of Donald Knuth</a>. Haigh argues that Knuth conflates the History of Computer Science with the History of Computing. Haigh says that historians focus on the latter and the History of Computer Science doesn't get enough emphasis.<br />
<br />
Let me mention two recent examples in that History of Computing category. <a href="http://www.imdb.com/title/tt2084970/">The Imitation Game</a> gives a great, though slightly fictionalized, portrait of the computing and computer science pioneer Alan Turing, focusing on his time at Bletchley Park breaking the Enigma code. Walter Isaacson, whose histories of Franklin, Einstein and Jobs I thoroughly enjoyed, writes <a href="http://www.amazon.com/gp/product/147670869X/ref=as_li_tl?ie=UTF8&camp=1789&creative=390957&creativeASIN=147670869X&linkCode=as2&tag=computation09-20&linkId=VVJ6OETDXBNQXEKO">The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution</a>, which tells the stories of computers from Ada Lovelace to Google (oddly stopping before social networks).<br />
<br />
But what can we do about the History of Computer Science, particularly for theoretical computer science? We live in a relatively young field where most of the great early researchers still roam among us. We should take this opportunity to learn and record how our field developed. I've dabbled a bit myself, talking to several of the pioneers, writing (with Steve Homer) a <a href="http://people.cs.uchicago.edu/~fortnow/papers/history.pdf">short history</a> of computational complexity in the 20<sup>th</sup> Century and a history chapter in The Golden Ticket.<br />
<br />
But I'm not a historian. How do we collect the stories and memories of the founders of the field and tell their tales while we still have a chance?Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com5tag:blogger.com,1999:blog-3722233.post-10824654589019138582015-01-05T14:38:00.000-06:002015-01-06T10:21:53.686-06:00Why do students do this?Before my midterm in ugrad Theory of Computation I gave the students a sheet of practice problems to do that I would go over before the midterm.<br />
<br />
One of them was: Let L be in DTIME(T(n)). Give an algorithm for L*. Try to make it efficient. What is the time complexity of your algorithm? (I had shown in class earlier in the term that if L is in P then L^* is in P.)<br />
<br />
My intention was that they do the Dynamic Programming solution. Since it wasn't being collected I didn't have to worry about what would happen if they did it by brute force. When I went over it in class I did the Dynamic Programming Solution, which is roughly T(n)^3 time.<br />
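For concreteness, the dynamic programming solution looks roughly like this (a sketch; the decider for L is treated as a black box and the names are mine):

```python
def in_star(x, in_L):
    """Decide whether x is in L*, given a membership test in_L for L.
    dp[i] is True iff the prefix x[:i] is a concatenation of strings in L.
    With |x| = n this makes O(n^2) calls to in_L, so if in_L runs in time
    T(n) the total is O(n^2 * T(n)) -- at most roughly T(n)^3 once T(n) >= n."""
    n = len(x)
    dp = [False] * (n + 1)
    dp[0] = True  # the empty string is the empty concatenation
    for i in range(1, n + 1):
        dp[i] = any(dp[j] and in_L(x[j:i]) for j in range(i))
    return dp[n]
```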
<br />
I allow my students to bring in a sheet of notes that they make up themselves.<br />
<br />
On the exam was the problem: Let L_1 \in DTIME(T_1(n)) and L_2\in DTIME(T_2(n)).<br />
Give an algorithm for L_1L_2. What is the time complexity of your algorithm?<br />
<br />
Of my 20 students, 5 gave me, word for word, the dynamic programming solution to the L, L* problem.<br />
<br />
Why would they do this? Speculations:<br />
<ol>
<li>They just copied it off of their cheat sheet with no understanding.</li>
<li>They wanted pity points (they didn't get any and I told the class that if a similar thing happens on the final I will give them LESS THAN zero on the problem).</li>
<li>They so hoped that the L, L* problem would be on the exam (possibly because it was on their cheat sheet) that they misread the problem.</li>
<li>They thought `Dr. Gasarch wouldn't have put it on the practice exam unless it was on the real exam' (or something like that), so they misread it. </li>
</ol>
The five students were not very good (they did poorly on other problems as well, and on the HW), so it was not a matter of good students being confused or getting nervous.<br />
<br />
But I ask-- (1) is this kind of thing common? For my Sophomore Discrete Math yes, but I was very slightly surprised to see it in my senior course. (2) Do you have examples? I am sure that you do, but my point is NOT to do student-bashing, it's to ask WHY they do this. GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com6tag:blogger.com,1999:blog-3722233.post-72807164348226903302014-12-29T12:41:00.000-06:002014-12-29T12:41:38.008-06:002014 Complexity Year in ReviewTheorem of the year goes to <a href="http://arxiv.org/abs/1407.6692">2-Server PIR with Sub-polynomial Communication</a> by Zeev Dvir and Sivakanth Gopi. In Private Information Retrieval you want to access information copied in multiple databases in such a way that no database knows what question you asked. In 1995 Chor, Kushilevitz, Goldreich and Sudan <a href="http://dx.doi.org/10.1145/293347.293350">showed</a> how to use n<sup>1/3</sup> bits of communication with two databases. Razborov and Yekhanin <a href="http://dx.doi.org/10.1109/FOCS.2006.10">proved</a> that using current techniques the bound could not be broken. Dvir and Gopi<br />
developed new techniques to break that barrier using n<sup>O(√(log log n/log n))</sup> bits with two databases, less than n<sup>δ</sup> for any δ. Bill <a href="http://blog.computationalcomplexity.org/2014/08/the-n13-barrier-for-2-server-pirs.html">posted</a> more on this result back in August.<br />
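To see what the model asks for, here is the classic baseline 2-server scheme with linear communication — NOT the Dvir-Gopi protocol, just the trivial scheme their result improves on. Each server alone sees a uniformly random subset of indices, so neither learns anything about which bit the client wants (names are mine):

```python
import random

def xor_bits(db, S):
    """A server's answer: the XOR of the database bits indexed by S."""
    ans = 0
    for j in S:
        ans ^= db[j]
    return ans

def pir_query(db, i):
    """Recover bit db[i] from two non-colluding copies of db.
    Server 1 gets a random subset S1, server 2 gets S1 with index i's
    membership flipped; XORing the two answers leaves exactly db[i].
    Communication here is O(n) bits -- the whole game in PIR research
    is driving that down."""
    n = len(db)
    S1 = {j for j in range(n) if random.random() < 0.5}
    S2 = S1 ^ {i}  # symmetric difference: flip membership of i
    return xor_bits(db, S1) ^ xor_bits(db, S2)
```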
<br />
And of course lots of other great work on extended formulations, circuits, algorithms, communication complexity and many other topics. We also had another round of <a href="http://blog.computationalcomplexity.org/2014/12/favorite-theorems-recap.html">favorite theorems</a> for the past decade.<br />
<br />
2014 will go down as the year computer science exploded. With a big convergence of machine learning/big data, the connectedness of everything, the sharing economy, automation and the move to mobile, we have a great demand for computer scientists and a great demand from students to become computer scientists or have enough computing education to succeed in whatever job they get. Enrollments are booming, CS departments are hiring and demand far outstrips the supply. A great time for computer science and a challenging one as well.<br />
<br />
We say goodbye to <a href="http://dmatheorynet.blogspot.com/2014/04/gm-adelson-velsky-passed-away.html">G.M. Adelson-Velsky</a>, <a href="http://bertoni.di.unimi.it/InMemoriam.html">Alberto Bertoni</a>, <a href="http://news.usc.edu/#!/article/59206/in-memoriam-edward-blum-90/">Ed Blum</a>, <a href="http://en.wikipedia.org/wiki/Ashok_K._Chandra">Ashok Chandra</a>, <a href="http://www.clrc.rhul.ac.uk/people/chervonenkis/">Alexey Chervonenkis</a>, <a href="http://www.news.cornell.edu/stories/2014/11/mathematician-eugene-dynkin-dies-90">Eugene Dynkin</a>, <a href="http://cs.illinois.edu/news/memory-clarence-ellis-1943-2014">Clarence "Skip" Ellis</a>, <a href="http://www.nytimes.com/2014/11/16/world/europe/alexander-grothendieck-math-enigma-dies-at-86.html">Alexander Grothendieck</a>, <a href="http://ferranhurtado.blogspot.com.es/">Ferran Hurtado</a>, <a href="http://www.cc.gatech.edu/news/georgia-tech-remembers-mike-stilman">Mike Stilman</a>, <a href="http://www.ae-info.org/ae/User/Stojmenovic_Ivan/OtherInformation">Ivan Stojmenovic</a>, <a href="https://agtb.wordpress.com/2014/06/30/berthold-vocking-1967-2014/">Berthold Vöcking</a>, <a href="http://rjlipton.wordpress.com/2014/07/01/remembering-ann-yasuhara/">Ann Yasuhara</a>, <a href="http://windowsontheory.org/2014/09/19/farewell-microsoft-research-silicon-valley-lab/">Microsoft Research-Silicon Valley</a> and <a href="http://en.chessbase.com/post/nyt-chess-column-dies-but-could-be-resurrected">The New York Times Chess Column</a>.<br />
<div>
<br /></div>
<div>
Thanks to our contributors <a href="http://blog.computationalcomplexity.org/2014/12/joint-center-for-quantum-information.html">Andrew Childs</a>, <a href="http://blog.computationalcomplexity.org/2014/11/guest-post-about-barbie-i-can-be.html">Brittany Terese Fasy</a> and <a href="http://blog.computationalcomplexity.org/2014/10/guest-post-by-dr-hajiaghayi-new-way-to.html">MohammadTaghi Hajiaghayi</a>.</div>
<div>
<br /></div>
<div>
Looking ahead 2015 brings the centenary of the man we know for balls and distance and the fiftieth anniversary of the paper that brought us the title of this blog. Have a great New Years and remember, in a complex world best to keep it simple.</div>
Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com7tag:blogger.com,1999:blog-3722233.post-67384087722859873642014-12-22T16:33:00.001-06:002014-12-22T16:33:19.828-06:00Undergraduate ResearchI just received the Cornell Math Matters, dedicated to the memory of <a href="http://en.wikipedia.org/wiki/Eugene_Dynkin">Eugene Dynkin</a> who passed away on November 14 at the age of 90. In my freshman year at Cornell, Dynkin recruited me into his undergraduate research seminar building on the success he had with a similar seminar he ran when in Russia. I didn't last long, making the conscious choice not to work on research as an undergrad but to focus on enjoying the college experience. I missed out on a great opportunity but I don't regret that decision.<br />
<br />
Reluctantly I wouldn't give that advice to today's undergrads. Getting into a good graduate program has become much more competitive and even a small amount of research experience may make a large difference in your application. I encourage any undergrad who may consider a PhD in their future to talk to some professors and get started in a research program. But don't let it run your life, make sure you enjoy your time at college. You'll have plenty of time to spend every waking moment on research once you start graduate school.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com4tag:blogger.com,1999:blog-3722233.post-63755423551645835252014-12-18T06:44:00.000-06:002014-12-18T13:14:52.554-06:00The NIPS ExperimentThe <a href="http://nips.cc/Conferences/2014/">NIPS</a> (machine learning) conference ran an interesting experiment this year. They had two separate and disjoint program committees with the submissions split between them. 10% (166) of the submissions were given to both committees. If either committee accepted one of those papers it was accepted to NIPS.<br />
<br />
According to an <a href="http://mrtz.org/blog/the-nips-experiment/">analysis</a> by Eric Price, of those 166, about 16 (about 10%) were accepted by both committees, 43 (26%) by exactly one of the committees and 107 (64%) rejected by both committees. Price notes that of the accepted papers, over half (57%) of them would not have been accepted with a different PC. On the flip side 83% of the rejected papers would still be rejected. More details of the experiment <a href="http://inverseprobability.com/2014/12/16/the-nips-experiment/">here</a>.<br />
<br />
No one who has ever served on a program committee should be surprised by these results. Nor is there anything really wrong or bad going on here. A PC will almost always accept the great papers and almost always reject the mediocre ones, but the papers in the middle are at a similar quality level and personal tastes come into play. There is no objective perfect ordering of the papers and that's why we task a program committee to make those tough choices. The only completely fair committees would either accept all the papers or reject all the papers.<br />
<br />
These results can lead to a false sense of self-worth. If your paper is accepted you might think you had a great submission; more likely you had a good submission and got lucky. If your paper was rejected, you might think you had a good submission and were unlucky; more likely you had a mediocre paper that would never get in.<br />
<br />
In the few days since NIPS announced these results, I've already seen people try to use them not only to trash program committees but also many other forms of subjective decision making. In the end we have to make choices on who to hire, who to promote and who to give grants. We need to make subjective decisions and those done by our peers aren't always consistent but they work much better than the alternatives. Even the machine learning conference doesn't use machine learning to choose which papers to accept.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com12tag:blogger.com,1999:blog-3722233.post-68731554247891076012014-12-15T11:42:00.002-06:002014-12-15T12:11:36.006-06:00Joint Center for Quantum Information and Computer Science(Guest post by Andrew Childs who is now at the Univ of MD at College Park)<br />
<br />
<br />
<br />
We have recently <a href="http://cmns.umd.edu/news-events/features/2563">launched</a> a new <a href="http://quics.umd.edu">Joint Center for Quantum Information and Computer Science</a> (QuICS) at the University of Maryland. This center is a partnership<br />
with the National Institute of Standards and Technology, with the support and participation of the Research Directorate of the National Security Agency/Central Security Service. QuICS will foster research on quantum information and computation.<br />
<br />
We are pleased to announce opportunities for <a href="http://quics.umd.edu/postdocs">Hartree Postdoctoral Fellowships</a> <br />
(deadline: December 30, 2014) and <a href="https://quics.umd.edu/graduate-students">Lanczos Graduate Fellowships</a>. Outstanding postdoctoral and graduate researchers with an interest in quantum information processing are encouraged to apply.<br />
<br />
QuICS complements a strong program in quantum physics research at the <a href="http://jqi.umd.edu">Joint Quantum Institute</a>. Maryland is also home to a new <a href="http://umdrightnow.umd.edu/news/lockheed-martin-umd-partner-quantum-computing">Quantum Engineering Center</a>. <br />
It's an exciting time for quantum information here.<br />
GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com0tag:blogger.com,1999:blog-3722233.post-1220725433551791112014-12-11T07:22:00.000-06:002014-12-11T07:22:58.162-06:00The Ethics of Saving LanguagesThe linguist John McWhorter wrote an NYT opinion piece entitled <a href="http://www.nytimes.com/2014/12/07/opinion/sunday/why-save-a-language.html">Why Save a Language?</a> where he argues why we should care about saving dying languages, basically that language gives us a window into culture. As a computer scientist I appreciate the scientific value of studying languages but perhaps the question is not whether we should care but is it ethical to save languages?<br />
<br />
Languages developed on geographical and geopolitical boundaries. Even as new methods of communication came along such as postal mail, the printing press, the telephone and television there never was a strong reason to learn multiple languages save for some small European nations and some professions such as academics and politics.<br />
<br />
Then came the Internet and location mattered less but language barriers still persist. I've certainly noticed a marked increase in the number of young people around the world who know basic conversational English, much from the content they consume online. There's also a sizable amount of content in all the major languages.<br />
<br />
But if you speak a small language where all the other native speakers are geographically very close to you, you lose this networked connection to the rest of humanity. Your only hope is to learn a second language and that second language might become a first language and so many of these small languages start to disappear.<br />
<br />
I understand the desire of linguists and social scientists to want to keep these languages active, but to do so may make it harder for them to take advantage of our networked society. Linguists should study languages but they shouldn't interfere with the natural progression. Every time a language dies, the world gets more connected and that's not a bad thing.Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com11tag:blogger.com,1999:blog-3722233.post-20981636860699471672014-12-09T10:38:00.000-06:002014-12-09T10:38:11.338-06:00Godel and Knuth Prize Nominations due soon. Which would you rather win? Or ...(Alg Decision Theory conference in Kentucky: <a href="http://cs.uky.edu/~srsa224/ADT2015/">here.</a>)<br />
<br />
Knuth Prize Nominations are due Jan 20, 2015.<br />
For info on the prize see <a href="http://www.sigact.org/Prizes/Knuth/">here</a>, if you want to nominate someone<br />
go <a href="http://www.sigact.org/Prizes/Knuth/knuth15.pdf">here.</a><br />
<br />
Godel Prize Nominations are due Jan 31, 2015.<br />
For info on the prize see <a href="http://www.sigact.org/Prizes/Godel/">here</a>, if you want to nominate someone<br />
go <a href="http://www.sigact.org/Prizes/Godel/goedel_call15.pdf">here</a><br />
<br />
Would you rather:<br />
<br />
<ol>
<li>Win a Godel Prize</li>
<li>Win a Knuth Prize</li>
<li>Have a prize named after you when you are dead</li>
<li>Have a prize named after you when you are alive</li>
</ol>
I pick 4; however, I doubt I'll have any of 1,2,3,4 happen to me.<br />
<br />
How about you?<br />
<br />
GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com4tag:blogger.com,1999:blog-3722233.post-74090557078947701702014-12-04T07:23:00.000-06:002014-12-04T08:10:42.499-06:00Favorite Theorems RecapWe've now completed five decades of favorite theorems.<br />
<ul>
<li><a href="http://blog.computationalcomplexity.org/2005/12/favorite-theorems-first-decade-recap.html">1965-1974</a></li>
<li><a href="http://blog.computationalcomplexity.org/2006/12/favorite-theorems-second-decade-recap.html">1975-1984</a></li>
<li><a href="http://dx.doi.org/10.1007/3-540-58715-2_130">1985-1994</a> (<a href="http://www.cs.uchicago.edu/~fortnow/papers/topten.pdf">PDF</a>)</li>
<li><a href="http://blog.computationalcomplexity.org/2004/12/favorite-theorems-recap.html">1995-2004</a></li>
</ul>
<div>
And to recap the ten we chose this year from 2005-2014</div>
<div>
<ul>
<li><a href="http://blog.computationalcomplexity.org/2014/02/favorite-theorems-connecting-in-log.html">Undirected Connectivity in Log Space</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/03/favorite-theorems-unique-games.html">Optimal Inapproximability for Max-Cut</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/04/favorite-theorems-extended-formulations.html">Limitations of linear programming</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/05/favorite-theorems-equilibrium.html">Complexity of Nash Equilibrium</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/06/favorite-theorem-pcp-simplified.html">Combinatorial Proof of PCP Theorem</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/07/favorite-theorems-compressing-local.html">Constructive Proof of the Lovász Local Lemma</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/08/favorite-theorems-limited-independence.html">Polylogarithmic independence fools AC<sup>0</sup></a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/09/favorite-theorems-quantum-interactive.html">QIP = PSPACE</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/10/favorite-theorems-multilinear-circuits.html">Lower Bounds on Multilinear Formulas</a></li>
<li><a href="http://blog.computationalcomplexity.org/2014/11/favorite-theorems-circuit-lower-bounds.html">NEXP not in ACC<sup>0</sup></a></li>
</ul>
<div>
Ten more in ten years. </div>
</div>
Lance Fortnowhttps://plus.google.com/101693130490639305932noreply@blogger.com3tag:blogger.com,1999:blog-3722233.post-85710673825897044442014-12-01T13:33:00.001-06:002014-12-01T13:33:27.276-06:00Cliques are nasty but Cliques are nastierBILL: Today we will show that finding large cliques is likely a nasty problem<br />
<br />
STUDENT: Yes! In my High School the Cliques were usually at most six people and they gossiped about everyone else. They were very nasty.<br />
<br />
BILL: Um, yes. Picture your school as a graph where everyone is a node and two nodes are connected if they are friends. In your school a clique of size 6 would be 6 people who all like each other.<br />
<br />
STUDENT: Actually, the people in a clique secretly hated each other and sometimes belonged to other cliques that would gossip about people in the first clique.<br />
<br />
BILL: We might need the Hypergraph version to model your school.<br />
<br />
<br />
Computer Scientists and Graph Theorists call a set of nodes that are all connected to each other a CLIQUE - pronounced CLEEK<br />
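As a toy illustration of why finding cliques is nasty: checking a given set of people is easy, but the obvious search tries exponentially many candidate sets, and that this seems unavoidable is exactly the NP-completeness of CLIQUE (function names are mine):

```python
from itertools import combinations

def has_clique(edges, nodes, k):
    """Brute-force check for a clique of size k in the friendship graph:
    try every one of the C(n, k) candidate sets and verify all pairs are
    connected.  Verifying one candidate is fast; the search is not."""
    E = {frozenset(e) for e in edges}
    return any(all(frozenset(p) in E for p in combinations(group, 2))
               for group in combinations(nodes, k))
```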
<br />
High School Kids call a group of people who hang out together a CLIQUE- pronounced CLICK.<br />
<br />
Which term came first? Why are they pronounced differently when they are quite related to each other? Do the members of a high school clique really hate each other?GASARCHhttp://www.blogger.com/profile/06134382469361359081noreply@blogger.com12