Abstract: Nonnegative matrix approximation is an effective matrix decomposition technique that has proven useful for a wide variety of applications, ranging from document analysis and image processing to bioinformatics. There exist quite a few algorithms for nonnegative matrix approximation (NNMA), for example, Lee & Seung's multiplicative updates, alternating least squares, and gradient descent based procedures. However, most of these procedures suffer from slow convergence, numerical instability, or, at worst, serious theoretical drawbacks. In this paper we present new and improved algorithms for the least-squares NNMA problem, which are theoretically well-founded and overcome many of the deficiencies of other methods. In particular, we use non-diagonal gradient scaling to obtain Newton-type methods with rapid convergence. Our methods provide numerical results superior both to Lee & Seung's method and to the alternating least squares (ALS) heuristic, which is known to work well in some situations but has no theoretical guarantees (Berry et al. 2006). Our approach extends naturally to include regularization and box-constraints without sacrificing convergence guarantees. We present experimental results on both synthetic and real-world datasets that demonstrate the superiority of our methods, in terms of better approximations as well as efficiency.
- Fast Newton-type Methods for the Least Squares Nonnegative Matrix Approximation Problem
D. Kim, S. Sra, I. Dhillon.
In SIAM International Conference on Data Mining (SDM), pp. 343-354, April 2007.
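For readers unfamiliar with the setup, the least-squares NNMA problem is: given a nonnegative matrix A, find nonnegative factors W and H minimizing ||A - WH||_F^2. The sketch below implements Lee & Seung's multiplicative updates, the baseline the paper improves upon; it is not the paper's Newton-type algorithm, and the function name and parameters are illustrative only.

```python
import numpy as np

def nnma_multiplicative(A, rank, iters=200, seed=0, eps=1e-9):
    """Lee & Seung multiplicative updates for min ||A - W H||_F^2, W, H >= 0.

    Baseline sketch only: the paper's Newton-type methods use non-diagonal
    gradient scaling to converge faster than these updates.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Random nonnegative initialization; updates then preserve nonnegativity.
    W = rng.random((m, rank))
    H = rng.random((rank, n))
    for _ in range(iters):
        # Multiplicative updates; eps guards against division by zero.
        H *= (W.T @ A) / (W.T @ W @ H + eps)
        W *= (A @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage on a small random nonnegative matrix.
A = np.abs(np.random.default_rng(1).random((20, 15)))
W, H = nnma_multiplicative(A, rank=4)
err = np.linalg.norm(A - W @ H)
```

These updates are known to be non-increasing in the objective, which is what the test below checks; the slow convergence this abstract mentions is visible in practice as many iterations being needed for a tight fit.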