A two-day symposium in honor of Richard J. Lipton's 60th birthday was held April 27--28 in the brilliant new Klaus Advanced Computing Center at Georgia Tech. The wide variety of talks influenced by Dick Lipton's ideas attested to his ability to say something deep about many subjects. Deep, and still keeping to a dictum of Hilbert featured on one speaker's slides: the simple attracts. An example referenced in many talks was his ("one-paragraph") proof of the self-reducibility of the permanent (see the Wikipedia entry). Another referenced his co-authorship of a March 2008 report to the Georgia Secretary of State with recommendations and advisories on electronic voting.
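For readers who want the gist of that proof, here is my own paraphrase of the self-reducibility (more precisely, random-self-reducibility) argument, a sketch rather than Lipton's exact formulation. Work over a field with more than n+1 elements, pick a uniformly random n-by-n matrix R, and consider

```latex
p(t) \;=\; \operatorname{perm}(A + tR), \qquad \deg_t p(t) \le n .
```

For each fixed t ≠ 0 the matrix A + tR is itself uniformly random, so an oracle that is correct on most matrices yields correct values of p at n+1 distinct nonzero points with good probability; interpolation then recovers p(0) = perm(A).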

The first talk, by Richard Karp, carried the message that problems of the Hitting-Set kind are most often easier in practice than their worst-case NP-complete pedigree leads one to expect. This was supplemented by Neal Young's second-day talk on Lipton's question, "Is it hard to generate random hard instances of NP-complete problems?" Ravi Kannan spoke on the practicality of finding good approximate equilibria for non-zero-sum games and market situations. Dan Boneh showed the extent to which even certified-sound cryptosystems become vulnerable when a key k is used to encrypt data that overtly contains k, or when there are closed cycles of encodings of multiple keys. He also explained how Lipton's kung-fu wielding of the Chinese Remainder Theorem in attacks on RSA and its kin slowed web servers by 8%. Anita Jones surveyed the urban landscape of computer security, and proposed the sequence of system calls made by a program as a signature by which to identify malware.
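For context on why the CRT keeps coming up with RSA: decryption can be split into two half-size exponentiations mod p and mod q and then recombined, which is the speedup (and the attack surface) at issue. A toy sketch with made-up, insecure parameters:

```python
# Toy illustration of RSA decryption with and without the Chinese
# Remainder Theorem. Parameters are tiny and NOT secure.

p, q = 61, 53            # toy primes
n = p * q                # public modulus
e = 17                   # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)      # private exponent (Python 3.8+ modular inverse)

def decrypt_plain(c):
    # One full-size modular exponentiation.
    return pow(c, d, n)

def decrypt_crt(c):
    # Two half-size exponentiations, with exponents reduced via Fermat,
    # recombined by Garner's form of the CRT.
    dp, dq = d % (p - 1), d % (q - 1)
    mp = pow(c % p, dp, p)
    mq = pow(c % q, dq, q)
    h = (pow(q, -1, p) * (mp - mq)) % p
    return mq + h * q

m = 1234
c = pow(m, e, n)
assert decrypt_plain(c) == decrypt_crt(c) == m
```

With realistically sized primes the two half-size exponentiations run roughly four times faster than the single full-size one, which is why servers use the CRT form despite the fault-attack worries Boneh discussed.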

Michael Rabin described a practical implementation of zero-knowledge protocols for high-stakes auctions, one point being the efficiency gained from coding at the level of the arithmetic rather than going all the way down to Yao's oblivious ZK circuit verification. Nisheeth Vishnoi presented a polynomial-time algorithm that distinguishes the "Yes" case of the "Unique Games Conjecture" from a tighter "No" case that also asserts the underlying graph is an expander (paper). This evinces a surprising difference between "Unique Games" and non-unique Constraint Satisfaction Problems, and suggests that if the UGC holds at all, any proof must differ widely from the traditional proofs establishing hardness of CSPs.

Erik Winfree's multimedia talk "Is DNA Computing?" demonstrated that whatever one feels about the feasibility of large-scale DNA computers (on which Lipton followed Adleman's first paper with a neater formulation), DNA activity is undeniably computational.

Giovanni Di Crescenzo talked on Lipton's idea of storing keys not as small files but parceled among huge hunks of stored data, so that intrusion attempts to recover them leave huge footprints, and applied hashing and extractors to implement it. I surveyed the frontiers of super-linear lower bounds, including the relevance of the Lipton-Tarjan separator theorem and Dick's part in time-space tradeoffs for SAT, and used Allender and Koucky's observation as a segue to super-polynomial lower bounds. I outlined my position that "Very High Degree" methods are capable of surmounting known barriers, and may be necessary.
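A minimal sketch of the key-parceling idea as I understood it (the function names and the use of SHA-256 are my own illustration, not the actual scheme): the key is derived by hashing bytes scattered through a huge block of data, so recovering it forces an intruder to touch, and leave traces across, much of the store.

```python
# Illustrative sketch only: derive a key from bytes parceled through a
# large data store, rather than keeping it in one small file.
import hashlib
import random

def make_store(size, seed=0):
    """Stand-in for a huge hunk of stored data."""
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(size))

def derive_key(store, positions):
    """Hash the bytes found at the chosen positions.

    The position list plays the role of the re-creatable secret index;
    a real scheme would use extractors rather than a bare hash.
    """
    h = hashlib.sha256()
    for i in positions:
        h.update(store[i:i + 1])
    return h.hexdigest()

store = make_store(1 << 16)                       # 64 KB toy "huge" store
positions = random.Random(42).sample(range(len(store)), 1000)
key = derive_key(store, positions)
assert key == derive_key(store, positions)        # deterministic
```

The point of scattering the positions is that any attempt to read out the key-relevant bytes touches data spread across the whole store, which is exactly the large footprint the talk described.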

Dick's longtime friend and associate Richard DeMillo surveyed Lipton's contributions to fundamental problems of software correctness. Wenke Lee focused on how 'bots have behavior and intent opposite to those of viruses, and how Damballa, the company he founded with Merrick Furst, David Dagon, and Lipton, combats botnets.

Parikshit Gopalan presented new ideas on the lower-bound frontier of ACC[m] for composite m, and continued a running gag on the power of Chinese remaindering. But Avi Wigderson planted a Monty Python foot on further progress by demonstrating that almost all known complexity-class results preserve themselves under an arithmetical form of relativization, while P vs. NP and most other frontier relations do not. Memo to new graduate students honing their technical abilities by building oracles A separating classes C from D: the ante just got upped to building both A and a low-degree extension A' such that C^A is not contained in D^{A'}, so that proving C contained in D would require "non-algebrizing techniques". There are various vice-versas, all of which tighten the rules of the equation-solving needed to build A. One sentence of hope remained on Avi's slides, actually due to Scott Aaronson: methods that go beyond treating (feasibly-constructed) multilinear extensions as black boxes can possibly evade the new barrier. Jin-Yi Cai closed by presenting a "Holographic" algorithm toolkit of subversive power, whose steep learning curve is eased by its affinity with quantum computation.

On the learning-curve subject, I came away with the impression that although it is often steeper for cryptographic protocols than for algorithms, more of the former actually get implemented. Of course, Karp has always stood for implementing algorithms, and Neal Young reported on how his beats simplex in its target domain, but I'm just reporting my positive impressions of the security applications from the two days. In all it was a rich meeting, with "not too many embarrassing stories" for Dick and lots of energy.

I did not really get the impression that the learning curve is higher for crypto protocols as opposed to other algorithms. But it does make sense that the former can have more immediate practical impact, since security vulnerabilities are of more concern in practice than pathological worst-case examples for an algorithm's running time.

The crypto protocols themselves may not be more complicated, but defining/understanding the guarantees they provide is certainly more complicated than stating the guarantees for algorithms.
