A Positive Semidefinite Safe Approximation of Multivariate Distributionally Robust Constraints Determined by Simple Functions
Jana Dienstbier,Frauke Liers,Jan Rolfes
Single-level reformulations of (nonconvex) distributionally robust optimization (DRO) problems are often intractable, as they contain semi-infinite dual constraints. Based on such a semi-infinite reformulation, we present a safe approximati...
Generalized Robust Optimization using the Notion of Set-Valued Probability
Davide La Torre,Franklin Mendivil,Matteo Rocca
We propose a novel concept of robustness grounded in the framework of set-valued probabilities, offering a unified and versatile approach to tackling challenges associated with the statistical estimation of uncertain or unknown probabilitie...
Aris Daniilidis,Carlo Alberto De Bernardi,Enrico Miglierina
We construct a weakly compact convex subset of ℓ₂ with nonempty interior that has an isolated maximal element, with respect to the lattice order induced by the positive cone ℓ₂⁺. Moreover, the maximal point cannot be supported by any strictly positive...
Marianne Akian,Xavier Allamigeon,Stéphane Gaubert et al.
We study the tropical analogue of the notion of polar of a cone, working over the semiring of tropical numbers with signs. We characterize the cones which arise as polars of sets of tropically nonnegative vectors by an invariance property w...
General Perturbation Resilient Dynamic String-Averaging for Inconsistent Problems with Superiorization
Kay Barshad,Yair Censor
In this paper we introduce a General Dynamic String-Averaging (GDSA) iterative scheme and investigate its convergence properties in the inconsistent case, that is, when the input operators do not have a common fixed point. The Dynamic String...
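To make the family of methods this abstract builds on concrete, here is a minimal sketch of classical string-averaging projections for a convex feasibility problem: each "string" applies a sequence of metric projections, and the next iterate is a convex combination of the string endpoints. The sets, strings, and weights below are invented for illustration; this is not the paper's GDSA scheme.

```python
import numpy as np

def project_ball(x, center, radius):
    """Metric projection of x onto the Euclidean ball B(center, radius)."""
    d = x - center
    n = np.linalg.norm(d)
    return x if n <= radius else center + radius * d / n

def string_averaging_step(x, strings, weights, projectors):
    """One string-averaging iteration: run each string, then average the endpoints."""
    endpoints = []
    for string in strings:
        y = x
        for i in string:          # sequential projections along the string
            y = projectors[i](y)
        endpoints.append(y)
    return sum(w * e for w, e in zip(weights, endpoints))

# Illustrative instance: two intersecting balls; feasible points lie in both.
projectors = [
    lambda x: project_ball(x, np.array([0.0, 0.0]), 1.0),
    lambda x: project_ball(x, np.array([1.0, 0.0]), 1.0),
]
x = np.array([3.0, 4.0])
for _ in range(50):
    x = string_averaging_step(x, strings=[[0, 1], [1, 0]],
                              weights=[0.5, 0.5], projectors=projectors)
# After enough iterations, x lies (approximately) in the intersection.
```

In the dynamic variant the abstract refers to, the strings and weights may change from iteration to iteration; the static choice above is the simplest special case.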
On Tractable Convex Relaxations of Standard Quadratic Optimization Problems under Sparsity Constraints
Immanuel Bomze,Bo Peng,Yuzhou Qiu et al.
Standard quadratic optimization problems (StQPs) provide a versatile modelling tool in various applications. In this paper, we consider StQPs with a hard sparsity constraint, referred to as sparse StQPs. We focus on various tractable convex...
Yurii Nesterov
In this paper, we suggest a new framework for analyzing primal subgradient methods for nonsmooth convex optimization problems. We show that the classical step-size rules, based on normalization of subgradient, or on knowledge of the optimal...
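The classical step-size rule based on subgradient normalization that the abstract mentions can be sketched as follows: x_{k+1} = x_k - h_k g_k/||g_k|| with divergent, square-summable steps h_k ~ 1/√(k+1). The ℓ1 test problem and all constants are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, n_iters=2000):
    """Primal subgradient method with normalized steps; returns the best iterate."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(n_iters):
        g = subgrad(x)
        gn = np.linalg.norm(g)
        if gn == 0.0:                  # zero subgradient: x is optimal
            break
        h = 1.0 / np.sqrt(k + 1)       # classical normalized step-size rule
        x = x - h * g / gn
        fx = f(x)
        if fx < best_f:                # f need not decrease monotonically,
            best_x, best_f = x.copy(), fx  # so track the best value seen
    return best_x, best_f

# Nonsmooth test problem: f(x) = ||x - c||_1, minimized at c.
c = np.array([1.0, -2.0, 0.5])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)     # a subgradient of the shifted l1 norm
x_best, f_best = subgradient_method(f, subgrad, x0=np.zeros(3))
```

With this rule the best function value converges at the well-known O(1/√k) rate, without requiring knowledge of the optimal value.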
Gradient Descent Provably Escapes Saddle Points in the Training of Shallow ReLU Networks
Patrick Cheridito,Arnulf Jentzen,Florian Rossmannek
Dynamical systems theory has recently been applied in optimization to prove that gradient descent algorithms bypass so-called strict saddle points of the loss function. However, in many modern machine learning applications, the required reg...
Isolated Calmness of Perturbation Mappings and Superlinear Convergence of Newton-Type Methods
Matúš Benko,Patrick Mehlitz
In this paper, we characterize Lipschitzian properties of different multiplier-free and multiplier-dependent perturbation mappings associated with the stationarity system of a so-called generalized nonlinear program popularized by Rockafell...
Geodesic Convexity of the Symmetric Eigenvalue Problem and Convergence of Steepest Descent
Foivos Alimisis,Bart Vandereycken
We study the convergence of the Riemannian steepest descent algorithm on the Grassmann manifold for minimizing the block version of the Rayleigh quotient of a symmetric matrix. Even though this problem is non-convex in the Euclidean sense a...
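A hedged sketch of the kind of iteration the abstract studies: Riemannian steepest descent for the block Rayleigh quotient f(X) = trace(XᵀAX) over n-by-p matrices with orthonormal columns, using the standard horizontal-space gradient and a QR-based retraction. The matrix, dimensions, step size, and iteration count are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 8, 2
M = rng.standard_normal((n, n))
A = (M + M.T) / 2                          # symmetric matrix

def grassmann_gradient(A, X):
    """Riemannian gradient of trace(X^T A X): project 2AX off span(X)."""
    G = 2 * A @ X
    return G - X @ (X.T @ G)

def retract(X, V):
    """QR retraction back onto the set of orthonormal-column matrices."""
    Q, R = np.linalg.qr(X + V)
    return Q * np.sign(np.diag(R))         # fix column signs for continuity

X, _ = np.linalg.qr(rng.standard_normal((n, p)))
step = 0.05                                # illustrative constant step size
for _ in range(2000):
    X = retract(X, -step * grassmann_gradient(A, X))

f_val = np.trace(X.T @ A @ X)              # approaches the sum of the
eigvals = np.linalg.eigvalsh(A)            # p smallest eigenvalues of A
```

Although f is nonconvex in the Euclidean sense, from a generic starting point this iteration converges to the minimizer, i.e. an orthonormal basis for the invariant subspace of the p smallest eigenvalues.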