VOCAL Optimization Conference: Advanced Algorithms
June 10-12, 2026, Mosonmagyaróvár, Hungary

Confirmed Plenary Invited Speakers Include


Georgina Hall (Assistant Professor and the Patrick and Valentine Firmenich Fellow in Business and Society at INSEAD, France)

Short bio: Georgina Hall is an Assistant Professor at INSEAD in the Decision Sciences Area. Her research focuses on convex relaxations of NP-hard problems, particularly those that arise in polynomial optimization and problems on graphs. Prior to joining INSEAD in 2019, she was a postdoctoral researcher at INRIA. She completed her PhD in Operations Research and Financial Engineering at Princeton University in 2018. She is the recipient of the 2018 INFORMS Optimization Society Young Researcher's Prize and the 2020 Information Theory Society Paper Award, among other awards.

Title: Enforcing shape constraints using sum of squares polynomials

Abstract: Many of the most useful priors in optimization and data science – such as submodularity in discrete settings, or convexity and bounded derivatives in continuous settings – share a common challenge: While they are ubiquitous in applications, testing whether a function satisfies these properties, or optimizing over functions with these constraints, is computationally intractable. In this talk, we develop a unifying semidefinite programming-based perspective in which such properties are certified using sum of squares (sos) polynomials.
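As a concrete illustration of the certification idea (not taken from the talk itself), the simplest case is deciding whether a single polynomial is a sum of squares by searching for a positive semidefinite Gram matrix. The sketch below, which assumes the cvxpy package with its bundled SCS solver and uses an arbitrary example quartic, sets up that feasibility SDP:

```python
import cvxpy as cp

# Example: is p(x) = x^4 + 2x^3 + 3x^2 + 2x + 1 a sum of squares?
# (It is: p(x) = (x^2 + x + 1)^2.)  Coefficients listed from the constant term up.
coeffs = [1.0, 2.0, 3.0, 2.0, 1.0]

# Gram-matrix formulation: p is sos iff p(x) = z(x)^T Q z(x) for some
# positive semidefinite Q, with monomial basis z(x) = [1, x, x^2].
n = 3
Q = cp.Variable((n, n), symmetric=True)
constraints = [Q >> 0]
for k, ck in enumerate(coeffs):
    # The coefficient of x^k collects every entry Q[i, j] with i + j == k.
    constraints.append(
        sum(Q[i, j] for i in range(n) for j in range(n) if i + j == k) == ck
    )

prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility SDP
prob.solve(solver=cp.SCS)
print("sos certificate found:", prob.status == cp.OPTIMAL)
```

The same Gram-matrix idea underlies the sos certificates discussed in the talk, except that the polynomials there encode submodularity or convexity conditions rather than plain nonnegativity.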

In the first part of the talk, we introduce the notion of t-sos submodularity, a hierarchy indexed by t of sufficient algebraic conditions for certifying submodularity of set functions. For fixed t, each level of the hierarchy can be verified via a semidefinite program of size polynomial in the ground set. We develop algebraic characterizations of t-sos-submodular functions, study when the hierarchy exactly coincides with submodularity, and demonstrate applications to submodular regression, difference-of-submodular programming, and approximate submodular maximization.
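For intuition about the property being certified (this is not the t-sos hierarchy itself), submodularity is the diminishing-returns condition f(S ∪ {i}) − f(S) ≥ f(T ∪ {i}) − f(T) for all S ⊆ T and i ∉ T. The hypothetical brute-force checker below, exponential in the ground set, illustrates why polynomial-size sufficient conditions such as the sos hierarchy are attractive:

```python
from itertools import combinations

def is_submodular(f, ground_set, tol=1e-12):
    """Brute-force submodularity check (exponential in |ground_set|):
    verify f(S | {i}) - f(S) >= f(T | {i}) - f(T) for all S <= T, i not in T."""
    elements = list(ground_set)
    subsets = [frozenset(c) for r in range(len(elements) + 1)
               for c in combinations(elements, r)]
    for S in subsets:
        for T in subsets:
            if not S <= T:
                continue
            for i in elements:
                if i in T:
                    continue
                if f(S | {i}) - f(S) < f(T | {i}) - f(T) - tol:
                    return False
    return True

# Example: coverage functions are submodular.
covered = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d"}}
coverage = lambda S: len(set().union(*(covered[i] for i in S))) if S else 0
print(is_submodular(coverage, covered.keys()))  # expected: True
```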

In the second part of the talk, we revisit sos convexity, a sufficient algebraic condition for convexity. We then focus on the problem of convex regression, which involves fitting a convex (multivariate) polynomial to noisy evaluations of an unknown convex function. For this problem, we study a set of sos-convex polynomial regressors defined via a hierarchy of semidefinite programs and show that these regressors are consistent estimators of the underlying convex function. As a byproduct, we prove that sos-convex polynomials are dense in the set of polynomials convex over a box. We also provide a detailed comparison of our regressor against existing approaches for convex regression and identify settings in which the sos framework is particularly effective.
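As a simplified, hypothetical sketch of shape-constrained regression (univariate rather than the multivariate sos-convex setting of the talk, and again assuming cvxpy with the SCS solver), one can fit a polynomial by least squares while forcing its second derivative to be a sum of squares, which certifies convexity:

```python
import cvxpy as cp
import numpy as np

# Noisy evaluations of an unknown convex function (here |x|, purely illustrative).
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
y = np.abs(x) + 0.05 * rng.standard_normal(x.size)

# Degree-4 regressor p(x) = sum_k c[k] * x**k, fitted by least squares.
c = cp.Variable(5)
V = np.vander(x, 5, increasing=True)  # columns 1, x, x^2, x^3, x^4

# Convexity certificate: p''(x) = 2*c2 + 6*c3*x + 12*c4*x^2 must equal
# [1, x] Q [1, x]^T for some PSD Q (univariate nonnegativity == sos).
Q = cp.Variable((2, 2), symmetric=True)
constraints = [
    Q >> 0,
    Q[0, 0] == 2 * c[2],
    2 * Q[0, 1] == 6 * c[3],
    Q[1, 1] == 12 * c[4],
]

prob = cp.Problem(cp.Minimize(cp.sum_squares(V @ c - y)), constraints)
prob.solve(solver=cp.SCS)
print("fitted coefficients:", np.round(c.value, 3))
```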

This presentation is based on joint work with Mihaela Curmei and Anna Deza.

Sara Shashaani (Associate Professor and Bowman Faculty Scholar, Fitts Department of Industrial and Systems Engineering and Operations Research, North Carolina State University, USA)

Short bio: Sara Shashaani is an Associate Professor and Bowman Faculty Scholar in the Fitts Department of Industrial and Systems Engineering at North Carolina State University. Her research interests lie at the intersection of stochastic optimization and Monte Carlo simulation, focusing on zeroth-order nonconvex problems and digital twin methodology. She is a 2024 Goodnight Innovator, a 2025 MGB-SIAM Early Career Fellow, and a recipient of multiple research grants from the National Science Foundation and the Office of Naval Research. She was an elected board member of the INFORMS Simulation Society and is a co-creator of SimOpt, an open-source simulation optimization library. Her research has contributed to application areas including renewable energy, climate adaptation, advanced manufacturing, and public health.

Ferenc Friedler (Professor, Rector and Vice President for Academic Affairs of the Széchenyi István University, Hungary)

Short bio: Ferenc Friedler's research interests focus on mathematical modeling, optimization, and their diverse engineering applications. He is best known for developing the P-graph and S-graph methodologies. The P-graph framework was originally established to solve network synthesis problems, which are fundamental to the efficiency of production processes. Since its inception, its application has expanded into numerous fields, including the design of engineering systems (such as optimal safety systems and fault diagnosis of processor arrays), the investigation of chemical reaction mechanisms, supply chain optimization, and the study of metabolic and energy networks. The S-graph methodology provides a specialized representation and efficient algorithms for the optimal scheduling of batch processes. The further extension of both methodologies remains a primary focus of his research.

His professional recognition is highlighted by a Vaaler Award (New York, USA, 1997) for software excellence. His research results have been integrated into core academic curricula in the United States, appearing in a standard textbook, Plant Design and Economics for Chemical Engineers (Peters et al., McGraw-Hill, 2003). A comprehensive and unified description of the P-graph framework is available in the book P-graphs for Process Systems Engineering: Mathematical Models and Algorithms (Friedler, Orosz, and Pimentel; Springer, 2022).

Co-founder of the VOCAL conference series.

Tamás Terlaky (Alcoa Endowed Chair Professor, Quantum Computing Optimization Laboratory, Department of Industrial and Systems Engineering, Lehigh University, Bethlehem, PA, USA)

Short bio: Dr. Terlaky has published four books, edited over ten books and journal special issues, and published over 230 research papers. His topics span the theoretical and algorithmic foundations of mathematical optimization; applications including the optimization of nuclear reactor core reloading, oil refinery operations, VLSI design, radiation therapy treatment, and inmate assignment; and quantum computing.

Dr. Terlaky is Editor-in-Chief of the Journal of Optimization Theory and Applications. He has served as associate editor of ten journals and as conference chair, conference organizer, and distinguished invited speaker at conferences all over the world. He founded or co-founded the EUROPT, MOPTA, ICCOPT, IOS, and VOCAL conference series. He was General Chair of the INFORMS 2015 Annual Meeting, a past Chair of the INFORMS Optimization Society, Chair of the ICCOPT Steering Committee of the Mathematical Optimization Society, Chair of the SIAM Activity Group on Optimization, and Vice President of INFORMS. He has received the MITACS Mentorship Award, the Award of Merit of the Canadian Operational Research Society, the Egerváry Award of the Hungarian Operations Research Society, the Daniel H. Wagner Prize of INFORMS, and the Outstanding Innovation in Service Science Engineering Award of IISE. He is a Fellow of INFORMS, SIAM, IFORS, and the Fields Institute, and an elected Fellow of the Canadian Academy of Engineering. He has been a plenary speaker at numerous conferences, including the 2024 ISMP in Montreal.

Title: On the 70+ Years of Interior Point Methods (IPMs)

Abstract: In 1954, Ragnar Frisch proposed a “Logarithmic Potential Method” to solve Linear Optimization (LO) problems, and he also extended the methodology to solve convex optimization problems. In the 1960s, Fiacco and McCormick studied and expanded the logarithmic barrier methodology as SUMT: the Sequential Unconstrained Minimization Technique. The modern age of polynomial-time IPMs was launched by Karmarkar’s 1984 paper. In the past four decades, IPMs have transformed the way we think about optimization and expanded the scope of efficiently solvable optimization problems from linear to smooth convex and conic linear optimization. The powerful methodology of IPMs has impacted most areas of optimization, including general nonlinear and combinatorial optimization, and has most recently emerged as a key methodology in quantum computing optimization. This talk reviews the major milestones of the seven decades of the Interior Point Revolution.
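For reference, the logarithmic barrier subproblem that underlies this line of work, stated here in its standard textbook form rather than as it appears in any particular paper covered by the talk, replaces the nonnegativity constraints of the LO problem min c^T x subject to Ax = b, x ≥ 0 with a barrier term:

\[
\min_{x > 0} \; c^{\top}x \;-\; \mu \sum_{i=1}^{n} \ln x_i \quad \text{subject to} \quad Ax = b, \qquad \mu > 0,
\]

and path-following IPMs trace the minimizers x(μ) toward the LO optimum as μ → 0.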