Key research themes
1. How can column generation methods be adapted to efficiently solve large-scale nonlinear and conic optimization problems?
This research area focuses on extending and adapting classical column generation (CG) techniques, originally developed for linear programming, to nonlinear and conic optimization problems with large-scale, structured variable sets and complicating constraints. The challenge is to maintain computational tractability and solution quality in the presence of nonlinear objectives and conic constraints such as second-order cone and semidefinite constraints. Efficient pricing schemes, subset selection strategies, and inexact or linearized solution methods are developed and analyzed to improve scalability and practical performance.
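As a minimal sketch (notation illustrative, not drawn from any specific paper in this theme), a Dantzig-Wolfe-style column generation scheme represents the structured feasible set $X$ by convex combinations of generated points $x_j$, solves a restricted master problem (RMP) over the current columns $J'$, and prices new columns via a subproblem over $X$:

$$
\text{(RMP)}\qquad \min_{\lambda \ge 0} \ \sum_{j \in J'} (c^\top x_j)\,\lambda_j
\quad \text{s.t.} \quad \sum_{j \in J'} (A x_j)\,\lambda_j = b, \qquad \sum_{j \in J'} \lambda_j = 1,
$$

$$
\text{(pricing)}\qquad \min_{x \in X} \ (c - A^\top \pi)^\top x \;-\; \mu,
$$

where $(\pi, \mu)$ are the dual values of the RMP and a column with negative reduced cost is added. In the nonlinear and conic setting studied here, the pricing step itself becomes a (possibly inexact or linearized) conic subproblem, which is where efficient pricing and subset selection strategies come into play.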
2. What advancements enable primal-dual interior point methods to efficiently handle a broader class of exotic cones, including spectral cones and nonsymmetric cones, in conic optimization?
This theme investigates the theoretical and algorithmic extensions that allow primal-dual interior point methods (PDIPMs) to address conic optimization problems involving exotic, nonsymmetric cones beyond the classical symmetric cases (nonnegative orthants, second-order cones, PSD cones). Attention is paid to constructing logarithmically homogeneous self-concordant barrier functions (LHSCBs), developing numerically stable barrier oracles (gradients, Hessians, inverse Hessians), and designing general frameworks such as Hypatia.jl that facilitate modular cone definitions. These advances enable more natural problem formulations, avoid large extended formulations, and extend solver capabilities to spectral cones (root-determinant and matrix monotone derivative cones) and other advanced cones, improving solve efficiency and scalability.
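For concreteness, the standard definition underlying these barrier-based methods: a self-concordant barrier $F$ for a proper cone $K$ is logarithmically homogeneous with parameter $\nu$ if

$$
F(t x) = F(x) - \nu \log t \qquad \text{for all } x \in \operatorname{int} K,\ t > 0.
$$

For example, $F(x) = -\sum_{i=1}^n \log x_i$ is an LHSCB with $\nu = n$ for the nonnegative orthant, and $F(X) = -\log\det X$ is an LHSCB with $\nu = n$ for the cone of $n \times n$ PSD matrices. A PDIPM supporting an exotic cone then needs numerically stable oracles that evaluate $F$, $\nabla F$, $\nabla^2 F$, and ideally $(\nabla^2 F)^{-1}$ applied to vectors, at interior points of the cone.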
3. How can polyhedral and semidefinite descriptions be leveraged to characterize convex hulls of mixed-integer and polynomial conic sets in optimization?
This line of research explores the convexification and structural characterization of mixed-integer nonlinear sets that involve conic quadratic or polynomial constraints, integer variables, and indicator variables. By providing explicit convex hull descriptions using extended formulations with positive semidefinite and linear constraints, or infinite families of conic inequalities, these works unify and extend relaxations used in mixed-integer quadratic optimization and in polynomial optimization with sum-of-squares constraints. This yields stronger relaxations and compact representations that are amenable to efficient solution methods and come with better theoretical guarantees.
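A small, well-known example of the flavor of such results (the perspective reformulation, stated here only for illustration): the convex hull of the indicator-constrained set

$$
S = \{(x, y, z) \in \mathbb{R} \times \mathbb{R}_+ \times \{0,1\} : y \ge x^2,\ x(1-z) = 0\}
$$

is described by the rotated second-order cone constraint $x^2 \le y z$ together with $0 \le z \le 1$ and $y \ge 0$. Extended formulations with PSD and linear constraints generalize this idea from single quadratic terms to matrix-valued and polynomial settings.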
4. What are the implications of second-order variational analysis and optimality conditions under constant rank and related constraint qualifications in nonlinear conic programming?
This theme deals with deriving necessary and sufficient optimality and stability conditions in nonlinear conic programming (including second-order cone and semidefinite programs) under advanced constraint qualifications such as constant rank and nondegeneracy. Employing tools from second-order variational analysis, coderivatives, and set-valued analysis, these works refine classical first-order conditions to provide sharper characterizations, stronger duality statements, and better convergence guarantees for numerical algorithms. Such results bridge the gap between nonlinear programming theory and conic-specific structures, supporting more robust optimization algorithm design.
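To anchor the terminology (standard notation in the style of Bonnans and Shapiro, not specific to any one paper in this group), for a problem $\min_x f(x)$ subject to $G(x) \in K$ with Lagrangian $L(x,\lambda) = f(x) + \langle \lambda, G(x)\rangle$, a no-gap second-order sufficient condition at a stationary point $x^*$ with multiplier $\lambda^*$ (unique, for instance, under nondegeneracy) takes the form

$$
\langle d, \nabla^2_{xx} L(x^*,\lambda^*)\, d\rangle \;-\; \sigma\!\big(\lambda^*,\, T^2_K(G(x^*), DG(x^*) d)\big) \;>\; 0
\qquad \text{for all } d \in C(x^*)\setminus\{0\},
$$

where $C(x^*)$ is the critical cone and the support-function term over the second-order tangent set $T^2_K$ captures the curvature of the cone, a contribution with no analogue in classical nonlinear programming. Constant rank and nondegeneracy assumptions are what make such conditions both verifiable and tight.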
5. How are strong duality and Lagrangian duality characterized and ensured in convex conic optimization under various feasibility and constraint qualifications?
Research in this theme focuses on formulating and proving strong duality results for general convex conic programs, characterizing the equivalence between strict feasibility, relative interior conditions, closedness of cone sums, and boundedness of optimal solutions. The work explores refined theorems of the alternative and minimal cones to enable strong duality even in the absence of Slater-type conditions, shedding light on the structure of feasible regions and dual optimal sets. These results deepen the theoretical foundations of conic duality, influencing algorithm design and robustness guarantees in convex conic optimization.
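As a reference point (the textbook conic duality setup, included only to fix notation for the conditions mentioned above), consider the primal-dual pair

$$
p^* = \inf_x \{\, c^\top x : A x = b,\ x \in K \,\},
\qquad
d^* = \sup_y \{\, b^\top y : c - A^\top y \in K^* \,\},
$$

where $K^*$ is the dual cone. Weak duality $d^* \le p^*$ always holds, and a Slater-type condition (a feasible $x$ with $x \in \operatorname{int} K$) guarantees a zero duality gap and, when the optimal value is finite, dual attainment. The results in this theme describe when the gap closes without such strict feasibility, for instance via facial reduction to a minimal cone or closedness of the relevant cone sums.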