Paper of the year goes to A Discrete and Bounded Envy-Free Cake Cutting Protocol for Any Number of Agents by Haris Aziz and Simon Mackenzie. It might not seem like a traditional complexity result, but cake cutting is a computational process with a desired set of properties, and this paper settles a long-standing open question. Bill posted about the paper back in June.
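To see what "envy-free" means computationally, here is a minimal sketch of the classic two-agent cut-and-choose protocol. This is not the Aziz-Mackenzie protocol (their contribution is a discrete, bounded protocol for any number of agents); it just illustrates the property, assuming divisible, additive valuations on the unit interval.

```python
def cut_and_choose(cutter_value, chooser_value):
    """Divide the cake [0, 1] between two agents.

    cutter_value, chooser_value: functions mapping an interval (a, b)
    to that agent's value for it, normalized so value(0, 1) == 1.
    Returns (cutter_piece, chooser_piece) as (a, b) intervals.
    """
    # The cutter finds a point x where both pieces are worth 1/2 to her,
    # here via binary search on a continuous, additive valuation.
    lo, hi = 0.0, 1.0
    for _ in range(50):
        mid = (lo + hi) / 2
        if cutter_value(0.0, mid) < 0.5:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    left, right = (0.0, x), (x, 1.0)
    # The chooser takes whichever piece he values more. Neither agent
    # envies the other: the chooser picked his favorite piece, and the
    # cutter values both pieces equally.
    if chooser_value(*left) >= chooser_value(*right):
        return right, left
    return left, right

# Example valuations (hypothetical): the cutter values the cake
# uniformly; the chooser cares only about the right half.
uniform = lambda a, b: b - a
right_half = lambda a, b: 2 * max(0.0, min(b, 1.0) - max(a, 0.5))

cutter_piece, chooser_piece = cut_and_choose(uniform, right_half)
```

With these valuations the cutter ends up with roughly (0, 0.5) and the chooser with (0.5, 1), and each values her own piece at least as much as the other's. The hard part, which Aziz and Mackenzie solved, is guaranteeing this for n agents with a bounded number of queries.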
Some other nice papers from 2016: Hadamard not so rigid after all and not usable for Valiant's lower-bound program, 2-source extractors that get near-perfect randomness from two weak low-entropy sources, query complexity separations (deterministic, randomized, and quantum) in two wonderful constructions beautifully presented at STOC, space-efficient subset-sum, and learning algorithms from natural proofs.
2016 will go down as a watershed year for machine learning: AlphaGo, huge advances in translation, self-driving cars, and, with tools like TensorFlow out in the open, we'll truly see learning everywhere. Prediction didn't have such a great year, with bad calls on Trump's nomination, Brexit and Trump's election. Machines learning humans still has a way to go.
We thank our guest posters Molly Fortnow, MohammadTaghi HajiAghayi, Samir Khuller and Avi Wigderson.
We remember Erich Bloch, Hartmut Ehrig, Carrie Fisher, Rūsiņš Freivalds, John Glenn, David Johnson, Marvin Minsky, Peter Naur, Seymour Papert, Hilary Putnam, Lloyd Shapley and Boris Trakhtenbrot.
As the surprises of 2016 lead us into an uncertain 2017, let me leave you with my usual advice: When in a complex world, best to keep it simple.