The goal of compressed sensing is to make use of image structure to estimate an image from a small number of linear measurements. The structure is typically represented by sparsity in a well-chosen basis. We show how to achieve guarantees similar to standard compressed sensing without employing sparsity at all -- instead, we suppose that vectors lie near the range of a generative model G: R^k -> R^n.
Our main theorem is that, if G is L-Lipschitz, then roughly O(k log L) random Gaussian measurements suffice for an L_2/L_2 recovery guarantee; this is O(k d log n) for typical d-layer networks. We demonstrate our results using generative models from published variational autoencoders and generative adversarial networks. Our method can use 5-10x fewer measurements than Lasso for the same accuracy. This is joint work with Ashish Bora, Ajil Jalal, and Alex Dimakis, and will appear at ICML 2017.
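The recovery approach behind these guarantees can be sketched in a few lines: given measurements y = A x for a signal x near the range of G, search for a latent code z minimizing ||A G(z) - y||^2 and output G(z). The following is a minimal illustrative sketch, where a small random ReLU network stands in for a trained VAE or GAN decoder (an assumption for self-containedness; the talk's experiments use trained models), and plain gradient descent with random restarts stands in for whatever optimizer one prefers:

```python
import numpy as np

rng = np.random.default_rng(0)
k, h, n, m = 3, 10, 50, 20   # latent, hidden, signal, and measurement dims

# Toy stand-in for a trained generator: a fixed random 2-layer ReLU
# network G: R^k -> R^n (in practice G would be a VAE/GAN decoder).
W1 = rng.normal(size=(h, k)) / np.sqrt(k)
W2 = rng.normal(size=(n, h)) / np.sqrt(h)

def G(z):
    return W2 @ np.maximum(W1 @ z, 0.0)

def loss_and_grad(z, A, y):
    """f(z) = ||A G(z) - y||^2 and its gradient (chain rule through ReLU)."""
    a1 = W1 @ z
    r = A @ G(z) - y
    g = 2.0 * W1.T @ ((W2.T @ (A.T @ r)) * (a1 > 0))
    return r @ r, g

# A signal in the range of G, observed through m << n Gaussian measurements.
x_true = G(rng.normal(size=k))
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true

# Recover z by gradient descent; the objective is nonconvex, so keep
# the best of several random restarts.
best = (np.inf, None)
for _ in range(5):
    z = rng.normal(size=k)
    for _ in range(3000):
        f, g = loss_and_grad(z, A, y)
        z -= 2e-4 * g
    f, _ = loss_and_grad(z, A, y)
    if f < best[0]:
        best = (f, z)

x_hat = G(best[1])
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Dimensions, step size, and iteration counts here are illustrative; the point is only the shape of the method, i.e. that recovery reduces to optimizing over the k-dimensional latent space rather than over all n signal coordinates.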
Eric Price is an assistant professor in the Department of Computer Science at UT Austin. He received a Ph.D. in computer science from MIT in 2013 under the supervision of Piotr Indyk. Eric's research was featured in Technology Review's TR10 list of 10 breakthrough technologies of 2012, and his thesis received a George M. Sprowls award for best doctoral thesis in computer science at MIT. His research is on sublinear algorithms, where one solves problems without storing the entire input.