Department of Mathematics,
University of California San Diego

****************************

Seminar on Cheeger-Colding theory, Ricci flow, Einstein metrics, and Related Topics

Zilu Ma
UC San Diego

Tangent flows at infinity of 4-dimensional steady Ricci soliton singularity models

Abstract:

We will discuss joint work with R. Bamler, B. Chow, Y. Deng, and Y. Zhang on 4-dimensional steady Ricci soliton singularity models with 3-cylindrical tangent flows at infinity, as well as mention the somewhat parallel work with Y. Zhang on the existence of asymptotic shrinkers on steady solitons with $Ric \geq 0$.

-

Email bechow@ucsd.edu for zoom information

****************************


Math 288 - Probability & Statistics Seminar

Tucker McElroy
US Census Bureau

Polyspectral Factorization

Abstract:

This presentation contributes to the theoretical background for a new quadratic prediction method for time series. We develop a theory of polyspectral factorization, providing new mathematical results for polyspectral densities. New bijections between a restricted space of higher-dimensional cepstral coefficients (where the restrictions are induced by the symmetries of the polyspectra) and the auto-cumulants are derived. Applications to modeling are developed; in particular, it is shown that semi-parametric nonlinear time series modeling can be accomplished by approximation of the cepstral representation of polyspectra.
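The cepstral representation mentioned above can be illustrated in the classical second-order case, where the cepstral coefficients are the Fourier coefficients of the log spectral density; the higher-order polyspectral analogues follow the same pattern in more variables. The sketch below is illustrative only and is not the method of the talk: it computes second-order cepstral coefficients by an inverse DFT of the log spectrum, checked against the known expansion for an MA(1) process (the function name and MA(1) example are our own choices, not from the abstract).

```python
import numpy as np

def cepstral_coefficients(spectrum, num_coeffs):
    """Fourier coefficients of log f: the (second-order) cepstrum.

    The inverse DFT of the log spectrum on an equispaced frequency
    grid approximates tau_k = (1/2pi) * integral of log f(w) e^{-ikw} dw.
    """
    log_spec = np.log(spectrum)
    tau = np.fft.ifft(log_spec).real  # spectrum is real and even here
    return tau[:num_coeffs]

# Example: spectral density of an MA(1) process x_t = e_t + theta*e_{t-1},
# with innovation variance 1, so f(w) = |1 + theta e^{iw}|^2.
theta = 0.5
grid = np.linspace(0, 2 * np.pi, 1024, endpoint=False)
f = np.abs(1 + theta * np.exp(1j * grid)) ** 2

tau = cepstral_coefficients(f, 4)
# Expanding log(1 + theta e^{iw}) + log(1 + theta e^{-iw}) term by term
# gives tau_0 = 0 and tau_k = (-1)^(k+1) theta^k / k for k >= 1,
# so tau[1] is near 0.5 and tau[2] near -0.125 here.
```

The point of the representation is that a short cepstral expansion encodes the full (poly)spectrum through an unconstrained set of coefficients, which is what makes the semi-parametric approximation in the abstract workable.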

-

****************************


Math 278B - Mathematics of Information, Data, and Signals Seminar

Roberto Imbuzeiro Oliveira
IMPA, Rio de Janeiro

Sample average approximation with heavier tails

Abstract:

Consider an "ideal" optimization problem where the constraints and objective function are defined in terms of expectations over some distribution P. The sample average approximation (SAA) -- a fundamental idea in stochastic optimization -- consists of replacing the expectations by averages over a sample from P. A key question is how much the solutions of the SAA differ from those of the original problem. Results by Shapiro from many years ago consider what happens asymptotically when the sample size diverges, especially when the solution of the ideal problem lies on the boundary of the feasible set. In joint work with Philip Thompson (Purdue), we consider what happens with finite samples. As we will see, our results improve upon the nonasymptotic state of the art in various ways: we allow for heavier tails and unbounded feasible sets, and we obtain bounds that (in favorable cases) depend only on the geometry of the feasible set in a small neighborhood of the optimal solution. Our results combine "localization" and "fixed-point" type arguments inspired by the work of Mendelson with chaining-type inequalities. One of our contributions is showing what can be said when the SAA constraints are random.
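The SAA construction described in the abstract can be sketched in a toy instance (this example is ours, not from the talk): the ideal problem min_t E[(X - t)^2] has solution t* = E[X], and SAA replaces the expectation by a sample average, whose minimizer is close to t* even when X has heavier tails.

```python
import numpy as np

rng = np.random.default_rng(0)

def saa_minimizer(sample, grid):
    """Minimize the empirical objective t -> mean((X - t)^2) over a grid.

    This is the SAA of the ideal problem min_t E[(X - t)^2], whose
    exact solution is t* = E[X].
    """
    objective = [np.mean((sample - t) ** 2) for t in grid]
    return grid[int(np.argmin(objective))]

# Heavier-tailed data: Student-t with 3 degrees of freedom (mean 0,
# finite variance but infinite fourth moment).
sample = rng.standard_t(df=3, size=20000)
grid = np.linspace(-1.0, 1.0, 2001)

t_hat = saa_minimizer(sample, grid)
# t_hat concentrates near the ideal solution t* = 0 as the sample grows
```

The nonasymptotic question in the abstract is precisely how fast, and under what moment and geometric conditions, t_hat approaches t* for a fixed sample size.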

-

Zoom link: https://msu.zoom.us/j/96421373881 (passcode: first prime number > 100)

****************************