Abstract: Nonnegative matrix approximation (NNMA) is a popular matrix decomposition technique that has proven to be
useful across a diverse range of fields, with applications from document analysis and image processing to bioinformatics
and signal processing. Over the years, several algorithms for NNMA have been proposed, e.g. Lee and Seung’s multiplicative
updates, alternating least squares (ALS), and gradient descent-based procedures. However, most of these procedures suffer from
either slow convergence, numerical instability, or at worst, serious theoretical drawbacks. In this paper, we develop a new
and improved algorithmic framework for the least-squares NNMA problem, which is not only theoretically well-founded, but
also overcomes many deficiencies of other methods. Our framework readily admits powerful optimization techniques and as
concrete realizations we present implementations based on the Newton, BFGS and conjugate gradient methods. Our algorithms
provide numerical results superior to both Lee and Seung's method and the alternating least squares heuristic, which
has been reported to work well in some situations despite lacking theoretical guarantees [1]. Our approach extends naturally to include
regularization and box-constraints without sacrificing convergence guarantees. We present experimental results on both synthetic
and real-world datasets that demonstrate the superiority of our methods, in terms of both approximation quality and
computational efficiency.
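For context, the Lee and Seung multiplicative updates that serve as a baseline in the paper can be sketched as follows. This is a minimal NumPy illustration of the classical updates for the least-squares objective min ||V - WH||_F with W, H >= 0, not the paper's projection-based method; the iteration count, the small `eps` safeguard, and the random initialization are illustrative choices.

```python
import numpy as np

def nnma_multiplicative(V, r, iters=200, eps=1e-9, seed=0):
    """Lee-Seung multiplicative updates for min ||V - W H||_F, W, H >= 0.

    V : (m, n) nonnegative matrix to approximate
    r : target rank of the factorization
    Returns nonnegative factors W (m, r) and H (r, n).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    # Random positive initialization keeps all entries strictly positive,
    # so the multiplicative updates preserve nonnegativity.
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(iters):
        # H <- H * (W^T V) / (W^T W H); eps guards against division by zero
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W <- W * (V H^T) / (W H H^T)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H
```

Because each update multiplies by a nonnegative ratio, the factors stay elementwise nonnegative throughout, but the per-iteration decrease in the objective can be very small, which is the slow-convergence issue the abstract refers to.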
Download: pdf
Citation
- Fast Projection-Based Methods for the Least Squares Nonnegative Matrix Approximation Problem (pdf, software)
D. Kim, S. Sra, I. Dhillon.
Statistical Analysis and Data Mining 1(1), pp. 38-51, February 2008.