In this mini symposium, we address the theoretical worst-case complexity bounds and practical performance of continuous optimization methods of different orders. We consider algorithms for solving both convex and non-convex minimization problems, as well as more general min-max problems and variational inequalities. Our speakers will present their work covering a broad spectrum of applications, including safe learning, derivative-free optimization, second-order optimization, and adaptive, stochastic, and decentralized learning.
Mini symposium organizers:
Nikita Doikov (École Polytechnique Fédérale de Lausanne)
Geovani Nunes Grapiglia (Université Catholique de Louvain)

Session 3. Room A6, Wednesday 17:30-19:30.
Chair: Nikita Doikov (École Polytechnique Fédérale de Lausanne)
Speakers:
Mihai I. Florea (Université Catholique de Louvain) Adaptive first-order methods with enhanced worst-case rates
Geovani Nunes Grapiglia (Université Catholique de Louvain) First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians
Ilnura Usmanova (Paul Scherrer Institute) Safe Primal-Dual Optimization with a Single Smooth Constraint
Dânâ Davar (Université Catholique de Louvain) A derivative-free trust-region method based on finite-differences for composite nonsmooth optimization

Session 5. Room A6, Thursday 17:30-19:30.
Chair: Geovani Nunes Grapiglia (Université Catholique de Louvain)
Speakers:
Nikita Doikov (École Polytechnique Fédérale de Lausanne) Polynomial Preconditioning for Gradient Methods
Anton Rodomanov (CISPA, Germany) Universal Gradient Methods for Stochastic Convex Optimization
Sebastian U. Stich (CISPA, Germany) Non-convex Stochastic Composite Optimization with Polyak Momentum
Pavel Dvurechensky (WIAS, Germany) Decentralized Local Stochastic Extra-Gradient for Variational Inequalities