- Speaker: Katya Scheinberg, Lehigh University
- Time: April 17, 2014, 3:30pm-5:00pm
- Location: POB 6.304
- Host: Inderjit S. Dhillon
Several methods have recently been proposed for sparse optimization that make careful use of second-order information to improve local convergence rates. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation with a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent.
Here we propose a general framework that includes slightly modified versions of existing algorithms as well as a new algorithm based on limited-memory BFGS Hessian approximations, and we provide a global convergence rate analysis in the spirit of proximal gradient methods, including an analysis of the method based on coordinate descent.
We also discuss an efficient and general implementation based on this framework and show encouraging computational results on problems from the machine learning domain.
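The pattern the abstract describes, building a composite quadratic model from Hessian information, minimizing it by coordinate descent, and then backtracking on the true objective, can be sketched for an L1-regularized least-squares problem. This is a generic illustration of a proximal Newton-type step, not the speaker's exact algorithm; the function names, exact-Hessian choice, and sweep counts are assumptions made for the sketch.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * |.|
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def prox_newton_lasso(A, b, lam, iters=50, cd_sweeps=10, sigma=1e-4):
    """Illustrative proximal Newton method for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Uses the exact Hessian of the smooth part for simplicity; a
    limited-memory BFGS approximation could be substituted."""
    m, n = A.shape
    x = np.zeros(n)
    H = A.T @ A                       # Hessian of the smooth part
    for _ in range(iters):
        r = A @ x - b
        g = A.T @ r                   # gradient of the smooth part
        # Solve the composite quadratic model
        #   min_d g'd + 0.5 d'Hd + lam*||x + d||_1
        # by cyclic coordinate descent over z = x + d.
        z = x.copy()
        for _ in range(cd_sweeps):
            for i in range(n):
                gi = g[i] + H[i] @ (z - x)   # model gradient in coordinate i
                z[i] = soft_threshold(z[i] - gi / H[i, i], lam / H[i, i])
        d = z - x
        # Backtracking line search on the true composite objective,
        # using the model decrease as the sufficient-descent measure.
        F = 0.5 * (r @ r) + lam * np.abs(x).sum()
        delta = g @ d + lam * (np.abs(z).sum() - np.abs(x).sum())
        t = 1.0
        while True:
            xt = x + t * d
            rt = A @ xt - b
            Ft = 0.5 * (rt @ rt) + lam * np.abs(xt).sum()
            if Ft <= F + sigma * t * delta or t < 1e-10:
                break
            t *= 0.5
        x = xt
    return x
```

When `A` is the identity the model is exact, so a single step recovers the closed-form soft-thresholding solution; on general problems the coordinate-descent inner solve and the line search do the work the abstract attributes to them.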
Dr. Scheinberg is an associate professor in the Industrial and Systems Engineering Department at Lehigh University. Her main research interests lie in developing practical algorithms, and their theoretical analysis, for problems in continuous optimization, including convex optimization, derivative free optimization, machine learning, and quadratic programming. In 2008 she published the book Introduction to Derivative Free Optimization, co-authored with Andrew R. Conn and Luis N. Vicente.
She was born in Moscow, Russia. She earned her undergraduate degree in operations research from Lomonosov Moscow State University in 1992 and her Ph.D. in operations research from Columbia University in 1997. She then spent over a decade as a Research Staff Member at the IBM T.J. Watson Research Center, where she worked on various applied and theoretical problems in optimization.
She is an associate editor of SIOPT and served as the editor of Optima, the MOS newsletter, until January 2014, when she stepped down to take on the role of Editor-in-Chief of the SIAM-MOS Series on Optimization.