Department of Mathematics,
University of California San Diego

****************************

Math 243 - Functional Analysis Seminar

Dilian Yang
University of Windsor

Self-similar k-graph C*-algebras

Abstract:

A self-similar k-graph is a pair consisting of a discrete group and a k-graph, such that the group acts on the k-graph self-similarly. To such a pair one can associate a universal C*-algebra, called the self-similar k-graph C*-algebra. This class of C*-algebras encompasses many important and interesting examples, such as the higher-rank graph C*-algebras of Kumjian-Pask, Katsura algebras, Nekrashevych algebras, and Exel-Pardo algebras. In this talk, I will present some results about these C*-algebras, based on joint work with Hui Li.

****************************

Center for Computational Mathematics Seminar

Amir Sagiv
Columbia University

Local and optimal transport perspectives on uncertainty propagation

Abstract:

In many scientific areas, a deterministic model (e.g., a differential equation) is equipped with parameters. In practice, these parameters might be uncertain or noisy, so an honest model should provide a statistical description of the quantity of interest. Underlying this computational question is a fundamental one: if two ``similar'' functions push forward the same measure, are the resulting measures close, and if so, in what sense? I will first show how the probability density function (PDF) can be approximated using spectral and local methods, and present applications to nonlinear optics. We will then discuss the limitations of PDF approximation and present an alternative Wasserstein-distance formulation of this problem, which yields a much simpler theory.
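As a toy numerical illustration of the underlying question (my example, not taken from the talk): push one sample of an input measure through two nearby maps and compare the resulting push-forward measures in the 1-Wasserstein distance. The maps `f` and `g` below are hypothetical.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)  # samples from the input measure (standard Gaussian)

f = lambda t: np.sin(t)               # original map
g = lambda t: np.sin(t) + 0.01 * t    # a "similar" (slightly perturbed) map

# Empirical 1-Wasserstein distance between the two push-forward measures.
w1 = wasserstein_distance(f(x), g(x))
print(w1)  # small: on the order of the 0.01 perturbation
```

Since $|f - g| = 0.01|t|$, the 1-Wasserstein distance between the push-forwards is bounded by $0.01\,\mathbb{E}|t| \approx 0.008$ here; quantitative stability statements of exactly this flavor are what the Wasserstein formulation makes simple.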

Zoom ID 939 3177 8552

****************************

Zoom for Thought

Sam Spiro
UC San Diego

Introduction to Spectral Graph Theory

Abstract:

Given a graph $G$, one can compute the eigenvalues of its adjacency matrix $A_G$. Remarkably, these eigenvalues can tell us quite a bit about the structure of $G$. More generally, spectral graph theory consists of taking a graph $G$, associating to it a matrix $M_G$, and then using algebraic properties of $M_G$ to recover combinatorial information about $G$. In this talk we discuss some of the more common applications of spectral graph theory, as well as a very simple proof, due to Huang, of the sensitivity conjecture.
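As a minimal illustration (my example, not the speaker's): the spectrum of the 4-cycle $C_4$ can be computed directly from its adjacency matrix. Since $C_4$ is 2-regular and bipartite, its largest eigenvalue equals the degree and the spectrum is symmetric about 0.

```python
import numpy as np

# Adjacency matrix of the 4-cycle C_4 (vertices 0-1-2-3-0).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

# eigvalsh is the right call for a real symmetric matrix.
eig = np.sort(np.linalg.eigvalsh(A))
print(eig)  # spectrum of C_4: [-2, 0, 0, 2]
```

In general the eigenvalues of the $n$-cycle are $2\cos(2\pi k/n)$, $k = 0, \dots, n-1$, which matches the output above for $n = 4$.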

Please see email with subject ``Zoom for Thought Information.''

****************************

Math 258 - Differential Geometry Seminar

Zilu Ma
UC San Diego

Tangent flows at infinity of 4-dimensional steady Ricci soliton singularity models

Abstract:

According to Perelman's work on Ricci flow with surgeries in dimension 3, it is important to understand at least the qualitative behavior of singularity formation in order to perform surgeries. The situation in dimension 4 is much more complicated, as new types of singularity models may arise, and the classification of singularity models is far from complete. We expect the singularity models to be solitons, i.e., self-similar solutions to the Ricci flow, and by recent work of Richard Bamler we expect that most singularity models are shrinking gradient solitons with possible singularities. Steady gradient Ricci solitons may also arise as singularity models, and they are related to shrinking solitons with quadratic curvature growth. In recent joint work with R. Bamler, B. Chow, Y. Deng, and Y. Zhang, we managed to classify the tangent flows at infinity, which can be viewed as blow-downs, of 4-dimensional steady gradient Ricci soliton singularity models. When the tangent flow at infinity is 3-cylindrical, we can give a very good qualitative characterization of such steady solitons. We will also mention the somewhat parallel work with Y. Zhang on the existence of asymptotic shrinkers on steady solitons with nonnegative Ricci curvature.

Zoom ID 917 6172 6136

****************************

Math 209 - Number Theory Seminar

Peter Koymans
MPIM

Malle's conjecture for nonic Heisenberg extensions

Abstract:

In 2002, Malle conjectured an asymptotic formula for the number of $G$-extensions of a number field $K$ with discriminant bounded by $X$. In this talk I will discuss recent joint work with Etienne Fouvry on this conjecture. Our main result proves Malle's conjecture in the special case of nonic Heisenberg extensions.
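For context, the commonly stated (strong) form of Malle's conjecture predicts, loosely, that the counting function of $G$-extensions ordered by discriminant satisfies

```latex
N(K, G; X) := \#\{\, L/K : \operatorname{Gal}(L/K) \cong G,\ |\!\operatorname{disc}(L/K)| \le X \,\}
\;\sim\; c(K, G)\, X^{1/a(G)} (\log X)^{b(K, G) - 1},
```

where $a(G)$ and $b(K, G)$ are explicit group-theoretic invariants attached to $G$ and its action, and $c(K, G)$ is a positive constant.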

****************************

Math 278B - Mathematics of Information, Data, and Signals Seminar

Yi Ma
UC Berkeley

Deep Networks from First Principles

Abstract:

In this talk, we offer an entirely ``white box'' interpretation of deep (convolutional) networks from the perspective of data compression (and group invariance). In particular, we show how modern deep layered architectures, linear (convolution) operators and nonlinear activations, and even all parameters can be derived from the principle of maximizing rate reduction (with group invariance). All layers, operators, and parameters of the network are explicitly constructed via forward propagation, instead of learned via back propagation. All components of the so-obtained network, called ReduNet, have precise optimization-theoretic, geometric, and statistical interpretations. There are also several nice surprises from this principled approach: it reveals a fundamental tradeoff between invariance and sparsity for class separability; it reveals a fundamental connection between deep networks and the Fourier transform for group invariance, namely a computational advantage in the spectral domain (why spiking neurons?); and it clarifies the mathematical roles of forward propagation (optimization) and backward propagation (variation). In particular, the so-obtained ReduNet is amenable to fine-tuning via both forward and backward (stochastic) propagation, both optimizing the same objective.

This is joint work with students Yaodong Yu, Ryan Chan, and Haozhi Qi of Berkeley, Dr. Chong You, now at Google Research, and Professor John Wright of Columbia University.
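As a rough sketch of the kind of objective involved (assumptions mine, not code from the talk): the lossy coding rate used in the maximal-coding-rate-reduction literature, $R(Z, \epsilon) = \tfrac{1}{2}\log\det\!\big(I + \tfrac{d}{n\epsilon^2} Z Z^\top\big)$, rewards feature sets that span many directions.

```python
import numpy as np

def coding_rate(Z, eps=0.5):
    """Lossy coding rate R(Z, eps) = 1/2 * logdet(I + d/(n*eps^2) * Z Z^T),
    for a d x n matrix of features Z. Shown only as a sketch of the
    quantity that rate reduction is built from."""
    d, n = Z.shape
    gram = np.eye(d) + (d / (n * eps**2)) * (Z @ Z.T)
    return 0.5 * np.linalg.slogdet(gram)[1]  # [1] = log|det|

rng = np.random.default_rng(0)
Z_spread = rng.standard_normal((8, 100))            # diverse, full-rank features
Z_collapsed = np.outer(rng.standard_normal(8),
                       rng.standard_normal(100))    # rank-1 (collapsed) features

# Diverse features cost more bits to encode than collapsed ones:
print(coding_rate(Z_spread) > coding_rate(Z_collapsed))  # True
```

Maximizing the difference between the rate of the whole feature set and the rates of its per-class parts is what drives the expansion/compression interplay the abstract alludes to.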

Zoom link: https://msu.zoom.us/j/96421373881 (passcode: first prime number $>$ 100)

****************************

AWM Colloquium

Elham Izadi
UC San Diego

Some fun facts about cubics

Abstract:

Cubic hypersurfaces are the zero sets of homogeneous polynomials of degree 3. They have been, still are, and probably will be for quite some time, the subject of a lot of research. I will survey a few well-known and fun facts about cubic hypersurfaces and will also mention some open problems.
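A concrete classical example of such a zero set is the Fermat cubic threefold in $\mathbb{P}^4$:

```latex
X = \{\, [x_0 : x_1 : x_2 : x_3 : x_4] \in \mathbb{P}^4 \;:\; x_0^3 + x_1^3 + x_2^3 + x_3^3 + x_4^3 = 0 \,\}.
```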

Location / Remote Access URL https://ucsd.zoom.us/j/94080232559

****************************