# Summary

We are organizing a one-day workshop on “Nonlocal Methods for Data Processing and Analysis: Theory, Optimisation, and Applications” on June 4th, 2018 at the Politecnico di Milano. The purpose of this meeting is to give an overview of state-of-the-art nonlocal models for big data and image analysis. Attention will also be given to the optimisation aspects and the numerical issues involved in the actual realisation of these models, which becomes quite challenging in particular for large data. Several applications of nonlocal methods will also be presented.

# Registration

Registration for this workshop is free. Lunch and beverages during the coffee breaks will be provided for all participants. Please fill out this short registration form if you are interested in participating.

# Accommodation

Participants who indicated that they require accommodation have a room reserved at the Hotel Lombardia, which is conveniently located within a 10-minute walk of the Politecnico.

# Schedule

The event will be held at the Politecnico di Milano (Italy). Although the schedule is not yet finalized, the workshop will take place on June 4th, 2018, approximately between 9.30am and 5pm, so that participants can travel to Bologna in time for the opening of the SIAM Conference on Imaging Science (IS18) on June 5th. The preliminary schedule is as follows:

| Time | Session |
| --- | --- |
| 09:00 – 09:15 | Welcome address and opening |
| 09:15 – 10:15 | Nonlinear spectral processing – from image descriptors to solving soliton equations (Guy Gilboa) |
| 10:15 – 10:45 | Coffee break |
| 10:45 – 11:45 | Learning regularisers: from shallow to deep regularisers for inverse problems (Carola Schönlieb) |
| 11:45 – 12:45 | Choose your path wisely: gradient descent in a Bregman distance framework (Martin Benning) |
| 12:45 – 14:00 | Lunch break |
| 14:00 – 15:00 | “Cut-Pursuit” Algorithm for Graph-Structured Regularization (Loïc Landrieu) |
| 15:00 – 15:20 | Coffee break |
| 15:20 – 16:20 | Nonlocal Data Comparison and Seven Processing and Analysis Applications (Coloma Ballester) |
| 16:20 – 17:20 | Forward-backward algorithm for inverse problems and machine learning (Silvia Villa) |

The abstracts of the talks are listed below:

**Coloma Ballester** (Universitat Pompeu Fabra): "Nonlocal Data Comparison and Seven Processing and Analysis Applications"

*Data comparison is a fundamental problem in imaging. It is, for instance, a main ingredient in many applications of medical image analysis, such as registration of multimodal data, population analysis, or segmentation. In image processing and computer vision, image or video comparison is at the basis of problems such as (static or moving) object recognition, optical and scene flow, or video analysis and understanding, to mention just a few. The methods that approach these problems use, either explicitly or implicitly, a comparison distance (or similarity measure) among the points in the respective image or video domains. The purpose of this talk will be, first, to give an overview of recent techniques that allow defining a multiscale comparison of images defined on Riemannian manifolds. The image comparison problem will be formulated as the problem of comparing two appropriate local neighborhoods belonging to each image defined on a Riemannian manifold, which in turn can be defined by the data domain with a suitable metric depending on the data. Then, we will show seven applications, including segmentation, inpainting, video analysis, and dynamic shape disocclusion.*

**Martin Benning** (University of Cambridge): "Choose your path wisely: gradient descent in a Bregman distance framework"

*We propose an extension of a special form of gradient descent — in the literature known as linearised Bregman iteration — to a larger class of non-convex functionals. We replace the classical (squared) two norm metric in the gradient descent setting with a generalised Bregman distance, based on a proper, convex and lower semi-continuous functional. The proposed algorithm is a generalisation of numerous well-known optimisation methods. Its global convergence is proven for functions that satisfy the Kurdyka-Łojasiewicz property. Examples illustrate that for suitable choices of Bregman distances this method — in contrast to traditional gradient descent — allows iterating along regular solution-paths. The effectiveness of the linearised Bregman iteration in combination with early stopping is illustrated for the application of machine learning in image classification with non-local operators.*

**Guy Gilboa** (Technion Haifa): "Nonlinear spectral processing – from image descriptors to solving soliton equations"

*In this talk we will discuss some new directions in processing and analyzing signals based on nonlinear eigenvalue problems. First, we will show how spectral total variation can be used to provide excellent pixel descriptors for various applications. We will then discuss more abstract nonlinear eigenvalue problems, which appear also in physics, such as models for solitons. A flow which can numerically solve a broad scope of such problems will be introduced.*

**Loïc Landrieu** (French National Geographical Institute): "Cut-Pursuit Algorithm for Graph-Structured Regularization"

*We present the recent cut-pursuit working-set methods for optimizing functions regularized with graph-structured, regularity-inducing penalties. These approaches are able to computationally exploit the piecewise-constant structure of the solutions to efficiently produce accurate solutions through iterative graph cuts. These algorithms are suited to both convex and non-convex settings and can handle non-differentiable penalizers with discontinuities and infinite values. This allows us to use our approach for various applications, beating state-of-the-art methods for large-scale, ill-posed, ill-conditioned optimization problems by several orders of magnitude.*

**Carola Schönlieb** (University of Cambridge): "Learning regularisers: from shallow to deep regularisers for inverse problems"

*In this talk we discuss the idea of data-driven regularisers, investigating two parametrisations: total variation type regularisers and deep neural networks. This talk is based on joint works with J. C. De Los Reyes, L. Calatroni, C. Chung, T. Valkonen, S. Lunz and O. Oektem.*

**Silvia Villa** (Politecnico di Milano): "Forward-backward algorithm for inverse problems and machine learning"

*Classical approaches to processing and classifying data often reduce to designing and minimizing empirical objective functions. The challenge is two-fold: on the one hand, to incorporate the structural information on the problem; on the other hand, to develop optimization schemes that can exploit such structure. In this talk, I will present an approach based on the forward-backward algorithm, both in the context of machine learning and of inverse problems. The focus will be on the interplay between estimation and optimization requirements and guarantees.*

# Contact

For organizational questions or feedback, please send an e-mail to one of the organizers of this NoMADS workshop:

- Luca Calatroni (luca.calatroni@polytechnique.edu)
- Lorenzo Rosasco (lorenzo.rosasco@unige.it)
- Daniel Tenbrinck (daniel.tenbrinck@uni-muenster.de)
- Silvia Villa (silvia.villa@polimi.it)