(Note from Bill G: This is one of several posts Nicole will be making from the Business meeting.)
Program committee report: Another congrats is in order. No matter what you think about the distribution of topics (see below), you can't deny that the PC did a great job this year. I was especially impressed by the extensive feedback, both internal and external, on submissions; the comments were thorough and explicitly stated the PC's reasons for each decision. As for the statistics, here is the list presented at the business meeting. Number of submissions: 302. Number of accepts: 66. Below is each topic papers were submitted on, followed by how many were submitted and how many were accepted.
- Algebraic/Numerical Computation (14, 1)
- Algorithmic Game Theory (27, 5)
- Approximation (35, 9)
- Geometric (8, 2)
- Graph (17, 1)
- Randomized (14, 4)
- Streaming (6, 2)
- Misc. (15, 0)
- Combinatorics (2, 0)
- Computational Biology (2, 0)
- Computational Complexity (30, 14)
- Cryptography (20, 7)
- Data Structures (6, 2)
- Geometry (13, 3)
- Learning (7, 0)
- Logic/Proof Complexity (11, 2)
- Parallel/Distributed Computation (15, 1)
- Property Testing (10, 5)
- Quantum Computing/Crypto (25, 6)
- Random Structures (7, 2)
- Misc. (11, 0)
- Out of Scope (6, 0)
- P = NP (1, 0)
An NP=P paper without an algorithm! Surely that's worth celebrating in itself...
P=NP is an important result; I'm surprised that the PC chose not to accept it, especially with the good job they did on everything else.
The paper actually gave a randomized algorithm for an NP-complete problem. While RP=NP would still be an impressive result, we were put off by the sell job (claiming the submission was about P vs. NP).
What would it take for an NP=P paper to be reviewed seriously? Is it often the case that papers are not reviewed seriously due to lack of credentials, especially for famous unsolved problems?
Who submitted the P=NP paper? That Sergey Gubin guy?
Very informative. But I think you meant 66 (instead of 86) papers.
Oooh, first interpretation! I know, I know: "FOCS is a complexity theory conference with near-50% acceptance rate where some other suckers have been fooled into submitting their non-core-complexity theory papers, thereby artificially inflating the 'quality' rating of acceptance percentage", right?
"FOCS is a complexity theory conference with near-50% acceptance rate where some other suckers have been fooled into submitting their non-core-complexity theory papers, thereby artificially inflating the 'quality' rating of acceptance percentage"
Funnily enough, that was my interpretation of things before I even saw the figures.
This shows that the STOC/FOCS mafia is biased against the miscellaneous and out-of-scope communities.
It is time for a new conference!
Authors should submit their best miscellaneous and out-of-scope papers to MISCO 2008.
I do like how P=NP is its own category. Does that mean there was an NP=P category, but no one submitted this year?
It seems *too* focused for a category; why not just have a catch-all "crackpot" category?
It is rather striking that not a single learning theory paper made it to FOCS.
Anonymous #4 and #3 seem to imply that the paper on P=NP was rejected without serious review, based solely on title and authorship.
If this is the case, we have a serious problem with the integrity of the review system.
P=NP is right! This is because an assignment for satisfiability has a lot of redundancy.
To satisfy X + Y + Z we only need X=1 or Y=1 or Z=1; we do not need X=1 and Y=1...
Therefore, reasoning with propositional rules can be simplified, e.g., distribution over the disjunctive rule.
The main error of previous methods is trying to converge to a solution within a huge hyperspace.
Any combination with backtracking or backjumping is a worse computation. A tree search is essentially a combination!
The algorithmic complexity is 2^k T^3, where k is the maximum clause size and T is the number of clauses.
So the algorithm can be polynomial but the description of the problem can be exponential. However, who can understand a language with an exponential description?
The evidence can be verified from the transformation of SAT into 3SAT. The funny thing is that the problem's complexity is not changed even though its description is!
Whether it is 2SAT or 3SAT or SAT, and no matter how many variables, the description is still just a description!!!
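(For readers unfamiliar with the SAT-to-3SAT transformation the commenter invokes: the textbook reduction splits each long clause into a chain of 3-literal clauses linked by fresh auxiliary variables, and the result is satisfiable iff the original is. A minimal sketch in Python — the integer literal encoding and function name are illustrative, not taken from the comment:)

```python
def sat_to_3sat(clauses, n_vars):
    """Clause-splitting reduction from SAT to 3SAT.

    Literal encoding (illustrative): positive int i is variable i,
    -i is its negation. A clause (l1 v ... v lk) with k > 3 becomes
    (l1 v l2 v y1), (-y1 v l3 v y2), ..., (-y_{k-3} v l_{k-1} v lk)
    using fresh auxiliary variables; the new formula is satisfiable
    iff the original one is, though its description grows.
    """
    out = []
    next_var = n_vars + 1  # first unused variable index
    for clause in clauses:
        lits = list(clause)
        if len(lits) <= 3:
            out.append(lits)
            continue
        y = next_var
        next_var += 1
        out.append([lits[0], lits[1], y])
        for lit in lits[2:-2]:  # middle literals, one chain link each
            y_next = next_var
            next_var += 1
            out.append([-y, lit, y_next])
            y = y_next
        out.append([-y, lits[-2], lits[-1]])
    return out, next_var - 1  # new clause list, total variable count

# A 5-literal clause becomes three 3-literal clauses over 7 variables.
print(sat_to_3sat([[1, -2, 3, 4, -5]], 5))
# -> ([[1, -2, 6], [-6, 3, 7], [-7, 4, -5]], 7)
```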
Sergey's method is simple and naive. We can get a clear improvement by slightly modifying the propositional rules to reason without truth values.
The solution consists of two main points, and is not new according to my observations.
I hope my observations and results can verify P=NP along with Sergey's.
Sorry, a bug. It should be f(k)T^3.
All geometric computation can be linearithmic (log-linear).
My recent research finds that we can build a d-dimensional Voronoi diagram in n log n time.
This is really interesting! I suddenly find that, until now, people could only handle Voronoi diagrams efficiently in the plane; in d dimensions the complexity is n^⌈d/2⌉.
Finally, I realize that the VD problem and other NP problems belong to the same group of hard problems (not exactly).
I also realize that people tend to give an optimal solution by combination. But that is not an arithmetic description of the solution, and combination is bad computation in practical applications.
All the techniques are merely traditional; the problem should instead be described geometrically.
I write this here as a record, so that future results can verify what I say.
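(As background on the Voronoi diagram being discussed: each site's cell consists of the points closer to it than to any other site. A brute-force nearest-site query, linear in the number of sites per point, illustrates the definition; it is purely a sketch and has nothing to do with the claimed n log n d-dimensional construction:)

```python
import math

def nearest_site(point, sites):
    """Return the index of the Voronoi site closest to `point`.

    Brute-force linear scan over the sites; works in any dimension.
    This only illustrates what a Voronoi cell is -- it is not the
    commenter's claimed n log n construction.
    """
    return min(range(len(sites)), key=lambda i: math.dist(point, sites[i]))

sites = [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0)]
print(nearest_site((1.0, 0.0), sites))  # -> 0 (closest to the origin)
print(nearest_site((3.5, 0.1), sites))  # -> 1
```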