R, the set of Kolmogorov-random strings, is a central notion in algorithmic information theory, and in recent years R has increasingly been studied in relation to computational complexity theory. This talk takes as its starting point three strange inclusions that have been proved since 2002:

1. NEXP is contained in the class of problems NP-Turing-reducible to R.
2. PSPACE is contained in the class of problems poly-time Turing-reducible to R.
3. BPP is contained in the class of problems poly-time truth-table-reducible to R.

(These inclusions hold for both of the most widely studied variants of Kolmogorov complexity: the plain complexity C(x) and the prefix complexity K(x). They also hold no matter which "universal" Turing machine is used in the definitions of the functions C and K.)
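In the standard notation for relativized classes and polynomial-time truth-table reductions (a notational sketch only; here R stands for either R_C or R_K, with respect to any fixed universal machine), the three inclusions read:

```latex
\[
\mathrm{NEXP} \subseteq \mathrm{NP}^{R}, \qquad
\mathrm{PSPACE} \subseteq \mathrm{P}^{R}, \qquad
\mathrm{BPP} \subseteq \{A : A \le_{tt}^{p} R\}.
\]
```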
These inclusions are "strange" because R is not even computable! Thus it is not at all clear that these are meaningful upper bounds on the complexity of BPP, PSPACE, and NEXP, or indeed that it is very interesting to consider efficient reductions to noncomputable sets such as R in the first place.
In this talk, I will try to convince you that the class of problems efficiently reducible to R is, indeed, a complexity class. The main theorems are that, if we restrict attention to prefix complexity K and the corresponding set of random strings R_K, then the class of decidable problems that are in NP relative to R_K (no matter which universal machine is used to define K) lies in EXPSPACE, and the class of decidable problems that are poly-time truth-table-reducible to R_K (again, for every choice of universal machine) lies in PSPACE.
Thus PSPACE is "sandwiched" between the class of decidable problems truth-table-reducible to R_K and the class of problems Turing-reducible to R_K, while the class of decidable problems that are in NP relative to R_K lies between NEXP and EXPSPACE. The corresponding questions for plain Kolmogorov complexity C are wide open; no upper bounds at all are known for the class of decidable problems efficiently reducible to R_C.
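The "sandwich" just described can be summarized as follows (a sketch in standard notation; the intersection over universal machines U reflects the requirement that the reduction work no matter which U is used to define K, and DEC denotes the decidable sets):

```latex
\[
\mathrm{DEC} \cap \bigcap_{U} \{A : A \le_{tt}^{p} R_{K_U}\}
\;\subseteq\; \mathrm{PSPACE}
\;\subseteq\; \mathrm{P}^{R_K},
\]
\[
\mathrm{NEXP}
\;\subseteq\; \mathrm{DEC} \cap \bigcap_{U} \mathrm{NP}^{R_{K_U}}
\;\subseteq\; \mathrm{EXPSPACE}.
\]
```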
These results also provide the first quantitative limits on the applicability of uniform derandomization techniques.
This is joint work with Luke Friedman and William Gasarch.
Eric Allender is a well-known researcher in the field of computational complexity, and has given numerous plenary addresses at international symposia on theoretical computer science. He received a B.A. from the University of Iowa in 1979, majoring in Computer Science and Theatre, and a Ph.D. from Georgia Tech in 1985. He has been at Rutgers University since then, serving as department chair from 2006 to 2009. He is a Fellow of the ACM, and serves on the editorial boards of ACM Transactions on Computation Theory, Computational Complexity, and The Chicago Journal of Theoretical Computer Science. He has chaired the Conference Committee for the annual IEEE Conference on Computational Complexity, and he serves on the Scientific Board for the Electronic Colloquium on Computational Complexity (ECCC).