Abstract: We consider the problem of online prediction with a linear model. In contrast to existing work in online regression, which regularizes based on squared loss or KL-divergence, we regularize using divergences arising from the Burg entropy. We demonstrate regret bounds for our resulting online gradient-descent algorithm; to our knowledge, these are the first online bounds involving Burg entropy. We extend this analysis to the matrix case, where our algorithm employs LogDet-based regularization, and discuss an application to online metric learning. We demonstrate empirically that using Burg entropy for regularization is useful in the presence of noisy data.
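For context, a sketch of the quantities involved, using standard definitions rather than the report's own notation: the Burg entropy over the positive orthant generates the Itakura–Saito divergence as its Bregman divergence, its matrix analogue generates the LogDet divergence, and the generic Bregman-regularized gradient step below is one way such an online update can be written (the report's exact algorithm may differ).

```latex
% Burg entropy on the positive orthant
\varphi(x) = -\sum_{i=1}^{n} \log x_i, \qquad x \in \mathbb{R}^{n}_{++}

% Its Bregman divergence: the Itakura--Saito divergence
D_{\varphi}(x, y) = \sum_{i=1}^{n} \left( \frac{x_i}{y_i} - \log \frac{x_i}{y_i} - 1 \right)

% Matrix analogue on positive-definite matrices
\Phi(X) = -\log \det X, \qquad X \succ 0

% Its Bregman divergence: the LogDet divergence
D_{\Phi}(X, Y) = \operatorname{tr}\!\left(X Y^{-1}\right) - \log \det\!\left(X Y^{-1}\right) - n

% Generic Bregman-regularized online gradient step, with g_t = \nabla \ell_t(w_t):
w_{t+1} = \arg\min_{w} \; \eta \, \langle g_t, w \rangle + D_{\varphi}(w, w_t)
\;\;\Longrightarrow\;\;
w_{t+1,i} = \frac{w_{t,i}}{1 + \eta \, g_{t,i} \, w_{t,i}}
% (valid while 1 + \eta g_{t,i} w_{t,i} > 0, keeping w in the positive orthant)
```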
- Topics:
- Online Learning
Download: pdf
Citation
- Online Linear Regression using Burg Entropy (pdf, software)
P. Jain, B. Kulis, I. Dhillon.
University of Texas Computer Science (UTCS) Technical Report TR-07-08, February 2007.
Bibtex:
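A plausible entry reconstructed from the citation above; the entry key is an assumption, not taken from the original page.

```bibtex
@techreport{jain2007online,
  author      = {P. Jain and B. Kulis and I. Dhillon},
  title       = {Online Linear Regression using {B}urg Entropy},
  institution = {University of Texas Computer Science (UTCS)},
  number      = {TR-07-08},
  month       = feb,
  year        = {2007}
}
```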