I'm sure many of you long-time readers are asking, "Why all this big focus on machine learning in your posts and tweets? You are the 'Computational Complexity' blog! You've barely said a word about meta-complexity."
So what is meta-complexity? From what I can tell the term goes back a few years but only came into wide use in computational complexity in the past year. The Computational Complexity Conference featured an invited talk on meta-complexity by Rahul Santhanam, and the Simons Institute is hosting a research program on the topic this spring.
As the name suggests, meta-complexity studies the complexity of computing the complexity of various problems. It's a term that encompasses recent research into the Minimum Circuit Size Problem (given the truth-table of a Boolean function, find the size of the smallest circuit that computes it) and the complexity of time-bounded Kolmogorov complexity.
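To make the Minimum Circuit Size Problem concrete, here is a minimal brute-force sketch (my own illustration, not from any paper or from the Simons program): given the full truth table of a function on n bits, it searches for the smallest circuit over the NAND basis that computes it. The function name `min_nand_circuit_size` and the restriction to fan-in-2 NAND gates are my choices for simplicity; MCSP is usually stated over richer circuit classes, and the exponential blowup of this search is exactly what makes the problem's complexity interesting.

```python
def min_nand_circuit_size(truth_table, n, max_size=6):
    """Brute-force MCSP sketch: return the smallest number of
    fan-in-2 NAND gates computing the given truth table, which is
    a tuple of 2**n output bits, one per input assignment.
    Illustrative only -- the search is exponential in circuit size."""
    # Value vector of each input variable over all 2**n assignments.
    inputs = [tuple((a >> i) & 1 for a in range(2 ** n)) for i in range(n)]
    target = tuple(truth_table)
    if target in inputs:                 # the function is a bare wire
        return 0
    for size in range(1, max_size + 1):  # iterative deepening on size
        if _search(list(inputs), target, size):
            return size
    return None                          # not found within max_size gates

def _search(wires, target, gates_left):
    """Try every way to add gates_left NAND gates on top of the
    existing wires; the circuit's output must be the last gate added
    (any minimal circuit can be topologically ordered this way)."""
    if gates_left == 0:
        return wires[-1] == target
    m = len(wires)
    for i in range(m):
        for j in range(i, m):            # NAND is symmetric; i == j gives NOT
            gate = tuple(1 - (a & b) for a, b in zip(wires[i], wires[j]))
            wires.append(gate)
            found = _search(wires, target, gates_left - 1)
            wires.pop()
            if found:
                return True
    return False
```

For example, on two variables the NAND function itself has size 1, AND takes 2 gates (NAND followed by a NOT built from NAND), and XOR takes 4 — already a hint of how quickly "what is the smallest circuit?" becomes nontrivial as the truth table grows.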
To quote from the Simons page:
Meta-complexity provides a unifying framework for a variety of important tasks in several important areas of computer science, including computational complexity, proof complexity, cryptography, and learning theory. These areas are all intimately linked, but only recently are these links being made explicit and studied more closely. For example, learning can be interpreted as solving search versions of the Minimum Circuit Size Problem and related problems. Basing primitives such as one-way functions and indistinguishability obfuscation on standard complexity assumptions is one of the main objectives in theoretical cryptography. Important recent directions involving meta-complexity within proof complexity, such as lifting and automatability, strengthen analogies and connections between proof complexity and circuit complexity. In addition, independence results such as the natural proofs framework have intuitive interpretations in terms of meta-complexity. These connections have led to several recent breakthroughs, including quasi-polynomial time PAC-learning algorithms for constant-depth circuits with parity gates, new worst-case to average-case reductions for NP problems, a new complexity-theoretic characterization of one-way functions, and the NP-hardness of automating resolution.
When you get down to it, meta-complexity is all about learning: determining the complexity of finding programs. You cannot escape it.
To dive deeper into meta-complexity, check out the videos from the Simons meta-complexity boot camp.