Department of Mathematics,
University of California San Diego
****************************
Math 278B: Mathematics of Information, Data, and Signals
Lijun Ding
UCSD
On the squared-variable approach for nonlinear (semidefinite) optimization
Abstract:
Consider min_{x ≥ 0} f(x), where the objective function f: ℝ → ℝ is smooth and the variable is required to be nonnegative. A naive "squared-variable" technique reformulates the problem as min_{v ∈ ℝ} f(v²). The new problem is unconstrained, so many algorithms, e.g., gradient descent, can be applied. In this talk, we discuss the disadvantages of this approach, which have been known for decades, and the perhaps surprising equivalence of the two problems in terms of (i) local minimizers and (ii) points satisfying the so-called second-order optimality conditions, which are key to designing optimization algorithms. We further discuss extensions to the vector case (where the vector variable is required to have all entries nonnegative) and the matrix case (where the matrix variable is required to be positive semidefinite) and demonstrate that this equivalence continues to hold.
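As an illustrative sketch (not part of the talk), the squared-variable reformulation can be tried on a hypothetical toy objective f(x) = (x − 2)², whose constrained minimizer is x = 2. Gradient descent is run on the unconstrained function g(v) = f(v²), using the chain rule g′(v) = 2v·f′(v²):

```python
# Sketch of the squared-variable technique on the toy objective
# f(x) = (x - 2)^2, a hypothetical example (not from the talk).

def f_prime(x):
    """Derivative of f(x) = (x - 2)^2."""
    return 2.0 * (x - 2.0)

v = 1.0            # unconstrained variable; x = v**2 is nonnegative by construction
lr = 0.05          # step size
for _ in range(200):
    # chain rule: d/dv f(v^2) = 2 * v * f'(v^2)
    v -= lr * 2.0 * v * f_prime(v * v)

x = v * v          # recovered feasible point; here x converges to 2
```

The recovered x = v² satisfies the nonnegativity constraint automatically, which is the whole appeal of the reformulation; the talk's concern is when stationary points of g correspond to genuine solutions of the original constrained problem.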
February 14, 2025
11:00 AM
APM 2402
Research Areas
Mathematics of Information, Data, and Signals