Convergence rate for diminishing stepsize methods in nonconvex constrained optimization via ghost penalties

DOI:

https://doi.org/10.1478/AAPP.98S2A8

Keywords:

Constrained optimization, nonconvex optimization, diminishing stepsize, convergence rate, iteration complexity

Abstract

This is a companion paper to "Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity" (to appear in Mathematics of Operations Research). We consider the ghost penalty scheme for nonconvex, constrained optimization introduced in that paper, coupled with a diminishing stepsize procedure. Under an extended Mangasarian-Fromovitz-type constraint qualification we give an expression for the maximum number of iterations needed to achieve a given solution accuracy according to a natural stationarity measure, thus establishing the first result of this kind for a diminishing stepsize method for nonconvex, constrained optimization problems.
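To illustrate the kind of scheme the abstract refers to, here is a minimal sketch of a diminishing-stepsize method on a simple nonconvex box-constrained problem. This is not the paper's ghost-penalty algorithm: the projected-gradient update, the stepsize rule gamma_k = gamma_0 / sqrt(k+1), the objective, and the projected-gradient residual used as a stationarity measure are all hypothetical choices made for illustration only.

```python
import numpy as np

def project(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]."""
    return np.clip(x, lo, hi)

def grad_f(x):
    # Gradient of the nonconvex objective f(x) = x^4 - 3x^2
    # (an arbitrary illustrative choice, not from the paper).
    return 4.0 * x**3 - 6.0 * x

def diminishing_stepsize_pg(x0, lo, hi, iters=200, gamma0=0.1):
    """Projected gradient with diminishing stepsizes gamma_k = gamma0/sqrt(k+1)."""
    x = float(x0)
    for k in range(iters):
        gamma = gamma0 / np.sqrt(k + 1)  # stepsizes vanish but sum diverges
        x = project(x - gamma * grad_f(x), lo, hi)
    # A natural stationarity measure for box constraints: the norm of the
    # unit-stepsize projected-gradient residual (zero exactly at KKT points).
    residual = abs(x - project(x - grad_f(x), lo, hi))
    return x, residual

x_star, res = diminishing_stepsize_pg(x0=0.5, lo=-1.0, hi=2.0)
print(x_star, res)
```

On this toy instance the iterates approach the interior stationary point x = sqrt(3/2), and the residual shrinks as the stepsizes vanish; the paper's contribution is a bound on how many iterations such a diminishing-stepsize scheme needs to drive an analogous stationarity measure below a prescribed accuracy, in the far more general nonconvex constrained setting.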

Author Biographies

  • Francisco Facchinei, Università di Roma La Sapienza
    Department of Computer, Control and Management Engineering Antonio Ruberti
  • Vyacheslav Kungurtsev, Czech Technical University in Prague
    Department of Computer Science, Faculty of Electrical Engineering
  • Lorenzo Lampariello, Università degli Studi Roma Tre
    Department of Business Studies
  • Gesualdo Scutari, Purdue University
    School of Industrial Engineering, West Lafayette, IN

Published

2020-12-13

Section

Variational Analysis, PDEs and Mathematical Economics (Conference Proceedings)