Access Type

Open Access Dissertation

Date of Award

January 2025

Degree Type

Dissertation

Degree Name

Ph.D.

Department

Mathematics

First Advisor

Boris Mordukhovich

Abstract

This dissertation focuses on the design and convergence analysis of algorithms for solving nonconvex optimization problems under inexact first-order information. We introduce Inexact Reduced Gradient (IRG) methods for general smooth functions and Inexact Gradient Descent (IGD) methods for $\mathcal{C}^{1,1}_L$ functions with relative and absolute errors. Additionally, we develop Inexact Proximal Point and Inexact Proximal Gradient methods for weakly convex functions. Our methods outperform standard inexact proximal point, inexact proximal gradient, and inexact augmented Lagrangian methods by a factor of approximately 2.5 to 10 in iteration complexity on image processing tasks. Moreover, we propose new derivative-free optimization methods for smooth functions in both noiseless and noisy settings. Under small noise, our derivative-free methods are more stable than standard finite-difference-based methods with fixed intervals, the implicit filtering algorithm, and the random gradient-free algorithm. They also outperform production-ready solvers such as Powell, L-BFGS-B, and COBYLA from the SciPy library on highly noisy problems. Finally, we highlight the crucial role of rigorous convergence analysis in improving the training and generalization of deep neural networks for image classification tasks.
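To make the flavor of the inexact gradient methods concrete, here is a minimal illustrative sketch of an inexact gradient descent loop. The abstract does not specify the update rule; the oracle error model (relative plus absolute error), the constant step size, the stopping test, and all function names below are assumptions made for illustration, not the dissertation's actual algorithms.

```python
import numpy as np

def inexact_gradient_descent(grad_oracle, x0, step=1e-2, tol=1e-6, max_iter=10_000):
    """Illustrative inexact gradient descent loop (not the dissertation's method).

    `grad_oracle(x)` is assumed to return an approximation g of the true gradient
    satisfying ||g - grad f(x)|| <= nu * ||g|| + eps for some relative error nu and
    absolute error eps. The constant step size assumes an L-Lipschitz gradient and
    that `step` is small enough (roughly on the order of 1/L).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_oracle(x)                # inexact gradient estimate
        if np.linalg.norm(g) <= tol:      # stop when the estimated gradient is small
            break
        x = x - step * g                  # descent step along the inexact direction
    return x

# Hypothetical usage on a toy smooth problem with an absolute-error gradient oracle.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grad_oracle = lambda x: x + 1e-3 * rng.standard_normal(x.shape)  # gradient of 0.5||x||^2 plus noise
    print(inexact_gradient_descent(grad_oracle, x0=np.ones(5)))
```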
