Abstract of Proximal methods for structured group features and correlation matrix nearness

Carlos María Alaiz Gudín

Optimization is ubiquitous in real life, as many of the strategies followed both by nature and by humans aim to minimize a certain cost or to maximize a certain benefit. In engineering in particular, numerous strategies are designed around a minimization problem, although the optimization problems tackled are usually convex with a differentiable objective function, since such problems have no local minima and can be solved with gradient-based techniques. Nevertheless, many interesting problems are not differentiable, for instance projection problems or problems based on non-smooth norms. An approach to deal with them can be found in the field of Proximal Methods (PMs), which are based on iterative local minimizations using the Proximity Operator (ProxOp) of the terms that compose the objective function. In this thesis, the state of the art in PMs is thoroughly reviewed, after which a first illustration of the use of these PMs to build sparse linear models for wind energy forecasting is given. Two optimization tasks are then addressed from a proximal perspective.
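For reference, the ProxOp mentioned above has a standard definition (not specific to this thesis): in LaTeX notation, for a parameter λ > 0,

    \operatorname{prox}_{\lambda f}(v) = \arg\min_{x} \Big( f(x) + \tfrac{1}{2\lambda}\,\|x - v\|_2^2 \Big),

and a basic proximal gradient iteration for an objective F + R, with F smooth and R non-smooth, reads

    x^{k+1} = \operatorname{prox}_{\lambda R}\big( x^k - \lambda \nabla F(x^k) \big).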

The first problem comes from the world of Machine Learning, in particular from the field of supervised regression, where regularized models play a prominent role. These models are trained by minimizing an error term plus a regularization term, so this paradigm fits nicely in the domain of PMs: the structure of the problem can be exploited by alternately minimizing the different expressions that compose the objective function, for example using the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). Following this philosophy, a new regularizer is proposed, the Group Total Variation, a group extension of the classical Total Variation regularizer. In order to handle it, an approach to compute its ProxOp is derived. Moreover, it is shown that this regularizer can be used on its own to denoise multidimensional signals (such as colour images) or within a linear model (named the Group Fused Lasso) to solve regression problems while imposing structure on the coefficients.
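As an illustration of this alternating scheme, the following minimal NumPy sketch runs FISTA on the classical Lasso objective; the soft-thresholding ProxOp of the l1 norm stands in for the Group Total Variation ProxOp derived in the thesis, which has no such simple closed form. The function names (fista, soft_threshold) are illustrative, not from the thesis.

    import numpy as np

    def soft_threshold(v, t):
        # ProxOp of t * ||.||_1, in closed form: sign(v) * max(|v| - t, 0).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def fista(A, b, lam, n_iter=200):
        # FISTA sketch for min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
        L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
        x = x_prev = np.zeros(A.shape[1])
        y, t = x.copy(), 1.0
        for _ in range(n_iter):
            grad = A.T @ (A @ y - b)                   # gradient of the smooth error term
            x = soft_threshold(y - grad / L, lam / L)  # proximal (shrinkage) step
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x + ((t - 1.0) / t_next) * (x - x_prev)  # momentum extrapolation
            x_prev, t = x, t_next
        return x

Swapping soft_threshold for the Group Total Variation ProxOp would turn this sketch into a solver for a Group Fused Lasso-type model; the momentum extrapolation is what distinguishes FISTA from plain proximal gradient descent.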

The second task is based on the Nearest Correlation Matrix problem, which consists of finding the correlation matrix nearest to a given empirical matrix. Some variants of this problem introduce weights to adapt the confidence given to each entry of the matrix; from a more general perspective, in this thesis the problem is explored considering uncertainty in the observations, formalized as a set of intervals within which the measured matrices lie. Two variants are defined under this framework: a robust approach, the Robust Nearest Correlation Matrix (which minimizes the worst-case scenario), and an exploratory approach, the Exploratory Nearest Correlation Matrix (which focuses on the best-case scenario). Both optimization problems can be solved with the Douglas-Rachford PM using a suitable splitting of the objective functions.
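To make the splitting idea concrete, here is a minimal sketch of Douglas-Rachford applied to the plain (interval-free) Nearest Correlation Matrix problem, min_X 0.5 * ||X - G||_F^2 subject to X being positive semidefinite with unit diagonal. The split shown, fit term plus unit-diagonal constraint on one side and the PSD cone on the other, is an illustrative choice, not necessarily the splitting used in the thesis, and the names (nearest_correlation_dr, prox_fit, proj_psd) are hypothetical.

    import numpy as np

    def proj_psd(V):
        # ProxOp of the PSD-cone indicator: clip negative eigenvalues.
        w, Q = np.linalg.eigh((V + V.T) / 2.0)
        return Q @ np.diag(np.maximum(w, 0.0)) @ Q.T

    def prox_fit(V, G, t):
        # ProxOp of t * (0.5 * ||X - G||_F^2 + indicator{diag(X) = 1}):
        # averages V towards G off the diagonal and pins the diagonal to 1.
        X = (V + t * G) / (1.0 + t)
        np.fill_diagonal(X, 1.0)
        return X

    def nearest_correlation_dr(G, t=1.0, n_iter=500):
        # Douglas-Rachford iteration; the prox_fit iterates converge
        # to the nearest correlation matrix.
        Z = G.copy()
        for _ in range(n_iter):
            X = prox_fit(Z, G, t)
            Y = proj_psd(2.0 * X - Z)  # prox of the second term at the reflection
            Z = Z + Y - X              # Douglas-Rachford update
        # A final PSD projection guards against small residual
        # infeasibility at a finite iteration count.
        return proj_psd(prox_fit(Z, G, t))

The robust and exploratory variants described above would presumably modify the fit term (and hence prox_fit) to account for the interval uncertainty, while keeping the same two-block proximal structure.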

