Key research themes
1. How can variants of Newton's method achieve third-order convergence without requiring higher-order derivatives?
This theme investigates modifications of the classical Newton's method that achieve cubic (third-order) convergence while avoiding the computational cost of evaluating second- or higher-order derivatives. The interest lies in iterative schemes that retain the rapid convergence of higher-order methods yet remain practical by using only first-order derivative information, typically at the price of one extra function or derivative evaluation per step. This balance is crucial for solving nonlinear equations efficiently when higher-order derivative calculations are expensive or impractical; a minimal numerical sketch of one such scheme is given after this list.
2. What explicit convergence rates and Hessian approximation properties can greedy quasi-Newton methods guarantee, and how do they differ from classical approaches?
This theme focuses on recent advances in quasi-Newton methods that strengthen convergence guarantees through the way update directions are selected. Classical quasi-Newton updates typically use differences of successive iterates and gradients and provide mainly asymptotic convergence results. Greedy quasi-Newton methods instead select update directions that maximize a progress measure, which yields explicit, non-asymptotic superlinear convergence bounds both for the iterates and for the Hessian approximations themselves. Understanding these improvements informs the design of fast and reliable second-order optimization algorithms; a small sketch of the greedy update rule is given after this list.
3. How can quasi-Newton strategies be integrated with preconditioning in nonlinear conjugate gradient methods to enhance convergence and stability?
This theme explores the design of matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods based on quasi-Newton updates. By approximating the inverse Hessian, or averaged Hessian information, through secant or secant-like conditions, these quasi-Newton preconditioners improve the convergence speed and robustness of NCG schemes. Damping techniques address practical difficulties, for instance keeping the update positive definite when the curvature condition normally supplied by a Wolfe line search is not enforced, which is especially relevant in highly nonlinear or large-scale optimization problems; a matrix-free sketch combining these ingredients is given below.
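
For theme 1, the following Python sketch shows one well-known representative of this class, the Weerakoon–Fernando two-step variant, which attains cubic convergence for simple roots while using only f and its first derivative. The test function, tolerance, and iteration cap are illustrative assumptions, not taken from any particular paper.

```python
# Minimal sketch of a third-order Newton-type iteration that needs only f and f'
# (the Weerakoon-Fernando two-step variant, shown as one representative example).

def newton_third_order(f, df, x0, tol=1e-12, max_iter=50):
    """Predictor Newton step y, then a corrected step using the averaged slope
    at x and y. Cubically convergent for simple roots."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        dfx = df(x)
        if dfx == 0.0:
            raise ZeroDivisionError("derivative vanished at the iterate")
        y = x - fx / dfx                # classical Newton predictor
        x_new = x - 2.0 * fx / (dfx + df(y))  # corrected step (third order)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

if __name__ == "__main__":
    # Illustrative test equation x^3 - 2x - 5 = 0 (root near 2.0946).
    root = newton_third_order(lambda x: x**3 - 2*x - 5,
                              lambda x: 3*x**2 - 2,
                              x0=2.0)
    print(root)
```

Compared with the classical Newton step, each iteration costs one extra derivative evaluation but raises the convergence order from two to three.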
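For theme 2, the sketch below illustrates the greedy selection idea on a quadratic model, in the spirit of greedy BFGS updates: given a fixed symmetric positive definite Hessian A, each step picks the coordinate direction maximizing u^T G u / u^T A u and performs a BFGS update of the approximation G along it, so the approximation error can be tracked explicitly. The dimension, the random test Hessian, and the initialization G0 = L·I are assumptions chosen only for illustration.

```python
# Sketch of a "greedy" BFGS update on a fixed SPD Hessian A: the update direction is
# the coordinate vector maximizing e_i^T G e_i / e_i^T A e_i rather than the usual
# difference of successive iterates.

import numpy as np

def greedy_bfgs_update(G, A):
    """One BFGS update of G toward A along the greedily chosen coordinate direction."""
    ratios = np.diag(G) / np.diag(A)   # e_i^T G e_i / e_i^T A e_i for each coordinate
    i = int(np.argmax(ratios))
    u = np.zeros(A.shape[0])
    u[i] = 1.0
    Gu, Au = G @ u, A @ u
    return G - np.outer(Gu, Gu) / (u @ Gu) + np.outer(Au, Au) / (u @ Au)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 5
    M = rng.standard_normal((n, n))
    A = M @ M.T + n * np.eye(n)        # illustrative SPD test Hessian
    L = np.linalg.eigvalsh(A).max()
    G = L * np.eye(n)                  # standard initialization G0 = L*I, so G0 >= A
    for k in range(20):
        G = greedy_bfgs_update(G, A)
        print(k, np.linalg.norm(G - A))  # Frobenius error of the approximation (should decrease)
```

The explicit tracking of ||G - A|| is exactly the kind of quantity for which greedy schemes admit non-asymptotic bounds, in contrast to the purely asymptotic guarantees of classical secant-based updates.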
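For theme 3, here is a minimal matrix-free sketch of a quasi-Newton-preconditioned NCG iteration: the preconditioner is applied through an L-BFGS-style two-loop recursion over stored (s, y) pairs, and a Powell-style damping of y keeps s^T y positive even though the simple Armijo backtracking used here does not enforce a curvature condition. The memory size, damping threshold, PR+ coefficient, and Rosenbrock test problem are illustrative assumptions, not the exact scheme of any specific work.

```python
# Sketch: NCG with a matrix-free quasi-Newton preconditioner and damped secant pairs.

import numpy as np

def two_loop(g, pairs):
    """Apply an L-BFGS-style inverse-Hessian approximation to g using stored (s, y) pairs."""
    q = g.copy()
    alphas = []
    for s, y in reversed(pairs):
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((rho, a))
    if pairs:
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)                     # standard initial scaling
    for (s, y), (rho, a) in zip(pairs, reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def damp(s, y, theta=0.2):
    """Powell-style damping: blend y with s so that s^T y >= theta * s^T s."""
    sy, ss = s @ y, s @ s
    if sy < theta * ss:
        t = (1.0 - theta) * ss / (ss - sy)
        y = t * y + (1.0 - t) * s
    return y

def preconditioned_ncg(f, grad, x0, mem=5, tol=1e-8, max_iter=500):
    x, g = x0.copy(), grad(x0)
    pairs = []
    pg = g.copy()                                  # preconditioned gradient (P = I at start)
    d = -pg
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0                                    # Armijo backtracking (stand-in for a Wolfe search)
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-16:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        pairs.append((s, damp(s, y)))
        pairs = pairs[-mem:]                       # limited memory
        pg_new = two_loop(g_new, pairs)            # apply the quasi-Newton preconditioner
        beta = max(0.0, (pg_new @ (g_new - g)) / (pg @ g))   # preconditioned PR+ coefficient
        d = -pg_new + beta * d
        if g_new @ d >= 0.0:                       # safeguard: restart along the preconditioned gradient
            d = -pg_new
        x, g, pg = x_new, g_new, pg_new
    return x

if __name__ == "__main__":
    # Illustrative Rosenbrock test problem.
    def f(x): return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
    def grad(x):
        return np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                         200.0 * (x[1] - x[0]**2)])
    print(preconditioned_ncg(f, grad, np.array([-1.2, 1.0])))
```

The damping step is what allows the preconditioner to stay positive definite without requiring the line search to satisfy the Wolfe curvature condition, which is the practical concern raised in this theme.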