Abstract: Relative to the large literature on upper bounds for the complexity of convex optimization, less attention has been paid to the fundamental hardness of these problems. Given the extensive use of convex optimization in machine learning and statistics, gaining an understanding of these complexity-theoretic issues is important. In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation. We introduce a new notion of discrepancy between functions, and use it to reduce problems of stochastic convex optimization to statistical parameter estimation, which can be lower bounded using information-theoretic methods. Using this approach, we improve upon known results and obtain tight minimax complexity estimates for various function classes.
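For orientation, the central object in oracle-complexity lower bounds of this kind is the minimax error at a fixed query budget. A rough sketch, in notation of our own choosing (function class \mathcal{F}, feasible set \mathcal{S}, stochastic oracle \phi, budget of T queries) rather than the paper's exact definitions, is:

\[
  \epsilon^*(\mathcal{F}, \mathcal{S}; \phi, T)
    \;=\; \inf_{\mathsf{M} \in \mathsf{M}_T}\;
          \sup_{f \in \mathcal{F}}\;
          \mathbb{E}_{\phi}\!\left[\, f(x_{\mathsf{M}}) - \min_{x \in \mathcal{S}} f(x) \,\right],
\]

where \mathsf{M}_T ranges over methods issuing at most T oracle queries and x_{\mathsf{M}} is the point returned by method \mathsf{M}. A lower bound on \epsilon^* at budget T translates into a lower bound on the number of queries needed to reach a given accuracy; see the paper for the precise setup.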
Download: pdf
Citation
- Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization (pdf, software)
A. Agarwal, P. Bartlett, P. Ravikumar, M. Wainwright.
IEEE Transactions on Information Theory 58(5), pp. 3235-3249, 2012.
Bibtex:
  @article{agarwal2012information,
    author  = {Agarwal, Alekh and Bartlett, Peter L. and Ravikumar, Pradeep and Wainwright, Martin J.},
    title   = {Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization},
    journal = {IEEE Transactions on Information Theory},
    volume  = {58},
    number  = {5},
    pages   = {3235--3249},
    year    = {2012}
  }