Over the past decades, Linear Programming (LP) has been widely used across many areas and is considered one of the mature technologies in numerical optimization. However, the complexity offered by state-of-the-art algorithms (i.e., interior-point and primal/dual simplex methods) is still unsatisfactory for machine-learning problems with huge numbers of variables and constraints. In this paper, we investigate a general LP algorithm based on a combination of Augmented Lagrangian and Coordinate Descent (AL-CD), giving an iteration complexity of O((log(1/ε))²) with O(nnz(A)) cost per iteration, where nnz(A) is the number of non-zeros in the m×n constraint matrix A. In practice, one can further reduce the cost per iteration to the order of non-zeros in the columns (rows) corresponding to the active primal (dual) variables through an active-set strategy. The algorithm thus yields a tractable alternative to standard LP methods for large-scale problems with sparse solutions and nnz(A) ≪ mn.
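To make the scheme concrete, here is a minimal dense-NumPy sketch of the augmented-Lagrangian plus coordinate-descent idea the abstract describes, for an LP in standard form (min cᵀx s.t. Ax = b, x ≥ 0). It omits the paper's active-set strategy and sparse-matrix handling; the function name, default parameters, and stopping rule are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def al_cd_lp(c, A, b, rho=1.0, outer=200, inner=50, tol=1e-8):
    """Illustrative AL-CD sketch: solve min c^T x s.t. A x = b, x >= 0.

    Outer loop: augmented-Lagrangian (multiplier) updates.
    Inner loop: coordinate descent on the smooth subproblem
        min_{x >= 0} c^T x + y^T (A x - b) + (rho/2) ||A x - b||^2.
    """
    m, n = A.shape
    x = np.zeros(n)
    y = np.zeros(m)                     # dual estimates for A x = b
    col_sq = (A ** 2).sum(axis=0)       # ||A_j||^2: curvature per coordinate
    for _ in range(outer):
        r = A @ x - b                   # primal residual, updated incrementally
        for _ in range(inner):          # coordinate descent sweeps
            for j in range(n):
                if col_sq[j] == 0.0:
                    continue
                g = c[j] + A[:, j] @ (y + rho * r)        # partial derivative
                new_xj = max(0.0, x[j] - g / (rho * col_sq[j]))
                delta = new_xj - x[j]
                if delta != 0.0:
                    r += delta * A[:, j]                  # keep residual current
                    x[j] = new_xj
        y += rho * r                    # multiplier (dual) update
        if np.linalg.norm(r) < tol:     # feasible enough: stop
            break
    return x, y
```

For example, on min x₁ + 2x₂ subject to x₁ + x₂ = 1, x ≥ 0, the sketch recovers the vertex solution x = (1, 0). Maintaining the residual `r` incrementally is what keeps each coordinate step at the cost of one column of A, which is the per-iteration cost structure the abstract refers to.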

*Topics:*

- Big Data
- Coordinate Descent

### Download: pdf, code

### Citation

- Sparse Linear Programming via Primal and Dual Augmented Coordinate Descent

I. Yen, K. Zhong, C. Hsieh, P. Ravikumar, I. Dhillon.

In *Neural Information Processing Systems (NIPS)*, December 2015.