Gradient sampling algorithm for subsmooth functions


Dimitris Boskos, Jorge Cortés and Sonia Martínez
SIAM Journal on Optimization, under review

Abstract:

This paper considers nonsmooth optimization problems in which we seek to minimize the pointwise maximum of a continuously parameterized family of functions. Since the objective function is given as the solution of a maximization problem, neither its values nor its gradients are available in closed form, which calls for approximation. Our approach hinges on extending the so-called gradient sampling algorithm, which approximates the Clarke generalized gradient of the objective function at a point by sampling its derivative at nearby locations. This allows us to select descent directions around points where the function may fail to be differentiable and to establish convergence of the algorithm to a stationary point from any initial condition. Our key contribution is to prove this convergence while relaxing the requirement that the objective function be continuously differentiable on an open set of full measure. We further provide assumptions under which a desired convex subset of the decision space is rendered attractive for the iterates of the algorithm.
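For readers unfamiliar with the basic scheme, the following is a minimal Python sketch of a (much simplified) gradient sampling iteration; it is illustrative only and is not the authors' algorithm or implementation. The toy objective `f`, the fixed sampling radius `eps`, and the Frank-Wolfe approximation of the minimum-norm element of the convex hull of sampled gradients are all assumptions made for this sketch; the paper's method additionally involves, among other things, the weaker differentiability requirements that constitute its main contribution.

```python
import numpy as np

def gradient_sampling(f, grad_f, x0, eps=0.1, n_samples=20,
                      max_iter=100, step=0.5, seed=0):
    """Simplified gradient sampling sketch with a fixed sampling radius.

    At each iterate, gradients are sampled at nearby points; an approximate
    minimum-norm element of their convex hull (a stand-in for the Clarke
    generalized gradient) serves as a descent direction.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Sample gradients at x and at random points in the eps-ball around x.
        pts = [x] + [x + eps * rng.uniform(-1.0, 1.0, size=x.shape)
                     for _ in range(n_samples)]
        G = np.array([grad_f(p) for p in pts])   # rows are sampled gradients
        # Approximate the min-norm point of conv{rows of G} by Frank-Wolfe
        # on the simplex, minimizing 0.5 * ||G^T w||^2 over weights w.
        w = np.full(len(G), 1.0 / len(G))
        for k in range(100):
            grad_w = G @ (G.T @ w)               # gradient w.r.t. the weights
            i = int(np.argmin(grad_w))           # best simplex vertex e_i
            gamma = 2.0 / (k + 2.0)              # standard Frank-Wolfe step
            w = (1.0 - gamma) * w
            w[i] += gamma
        d = G.T @ w                              # approximate min-norm subgradient
        if np.linalg.norm(d) < 1e-8:
            break                                # approximately stationary
        # Backtracking (Armijo-type) line search along -d.
        t = step
        while f(x - t * d) > f(x) - 0.5 * t * np.dot(d, d) and t > 1e-12:
            t *= 0.5
        x = x - t * d
    return x

# Toy nonsmooth objective: f(x) = |x_1| + 2|x_2|, minimized at the origin
# and nondifferentiable on the coordinate axes.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
grad_f = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
x_star = gradient_sampling(f, grad_f, [3.0, -2.0])
```

In the paper's setting, `grad_f` would be replaced by derivatives of the continuously parameterized family whose pointwise maximum defines the objective, and the sampling radius would be refined along the iterations; neither refinement is shown in this sketch.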


File: (ArXiv)


Bib-tex entry:

@article{DB-JC-SM:25-siopt,
author = {D. Boskos and J. Cort\'es and S. Mart{\'\i}nez},
title = {Gradient Sampling algorithms for subsmooth functions},
journal = {SIAM Journal on Optimization},
pages = {},
volume = {},
number = {},
note = {Under review},
year = {2025}
}