
A Coordinate Gradient Descent Method for Structured Nonsmooth Optimization

Sangwoon Yun
Nonsmooth optimization problems are generally considered more difficult than smooth problems. Yet there is an important class of nonsmooth problems that lies in between. In this book, we consider the problem of minimizing the sum of a smooth function and a (block-separable) convex function, with or without linear constraints. This problem includes as special cases bound-constrained optimization, smooth optimization with L_1-regularization, and linearly constrained smooth optimization such as the large-scale quadratic programs arising in the training of support vector machines. We propose a block coordinate gradient descent method for solving this class of structured nonsmooth problems. The method is simple, highly parallelizable, and suited for large-scale applications in signal/image denoising, regression, and data mining/classification. We establish global convergence and, under a local Lipschitzian error bound assumption, a local linear rate of convergence for this method. Our numerical experience suggests that the method is effective in practice. This book will be helpful to readers interested in solving large-scale optimization problems.
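To make the problem class concrete, the following is a minimal Python sketch of a block coordinate gradient descent step for the L_1-regularized least-squares special case mentioned above, min_x 0.5*||Ax - b||^2 + lam*||x||_1. The function names, the fixed cyclic block partition, and the 1/L_J step-size rule are illustrative assumptions, not the book's exact algorithm; each block update is a gradient step on the smooth part followed by soft-thresholding.

```python
# Illustrative sketch only (assumed names and step rule), not the book's method:
# proximal block coordinate gradient descent for
#   min_x 0.5*||A x - b||^2 + lam*||x||_1.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def block_cgd_lasso(A, b, lam, block_size=10, max_iter=200):
    m, n = A.shape
    x = np.zeros(n)
    r = A @ x - b                                   # residual A x - b
    blocks = [np.arange(i, min(i + block_size, n))  # fixed cyclic block partition
              for i in range(0, n, block_size)]
    for _ in range(max_iter):
        for J in blocks:
            AJ = A[:, J]
            gJ = AJ.T @ r                           # gradient of the smooth part on block J
            LJ = max(np.linalg.norm(AJ, 2) ** 2, 1e-12)  # block Lipschitz bound
            xJ_new = soft_threshold(x[J] - gJ / LJ, lam / LJ)
            r += AJ @ (xJ_new - x[J])               # incremental residual update
            x[J] = xJ_new
    return x
```

Because each block touches only the columns A[:, J] and the shared residual, the per-block work is small and the updates across well-separated blocks can be parallelized; the sketch uses a fixed 1/L_J step for simplicity rather than a line search.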
Author: Yun, Sangwoon
EAN: 9783836478601
Language: English
Pages: 112
Format: Paperback
Publisher: VDM Verlag Dr. Müller e.K.
Publication date: 14 November 2012
Subtitle: Theory and Applications
Dimensions: 8 × 150 × 220 mm
Weight: 185 g