"Complexity theorists studied worst-case complexity until the mid-80's and have focused on average-case complexity since." I asked him why he thought that, and he said he had given an earlier talk claiming that complexity theorists focused only on worst-case complexity, and was told that this hadn't been true since the mid-80's.
What did happen in the mid-80's? Levin gave his definition of average-case complexity classes. Modern cryptography started to really grow around then, with its emphasis on hardness-on-average assumptions. Results like Nisan-Wigderson connected average-case hardness to derandomization.
But most of the great work in complexity continued to focus on worst-case complexity: Circuit lower bounds, nondeterministic space closed under complement, Toda's theorem, interactive proofs, PCPs, hardness of approximation results, Primes in P, SL=L and much more. Even derandomization results are now based on worst-case hardness.
Why the focus on worst case instead of average case? We can't know the underlying distributions for sure, and worst-case complexity handles any distribution.
In crypto they do know the underlying distribution, since their protocols create it. But even in that community you now see a slight move to base protocols on worst-case assumptions, using worst-case to average-case reductions for lattice-based problems. I worry, though, that these reductions imply less that the average-case complexity of these problems is harder than we expected and more that the worst case is easier than we thought.