Department of Mathematics,
University of California San Diego
****************************
Math 243 - Functional Analysis Seminar
Pieter Spaas
UCLA
Furstenberg-Zimmer structure theory for actions on von Neumann algebras
Abstract:
In classical ergodic theory, compact and weakly mixing actions/extensions have been well-studied. The main structural result from Furstenberg and Zimmer states that every action can be written as "a weakly mixing extension of a tower of compact extensions". We will discuss some of these classical results and their motivation, and consider similar notions for actions on von Neumann algebras which have been defined throughout the years. We will then complete (part of) the picture by establishing equivalence of several such notions, followed by some consequences and open questions. This is partially based on joint work with Asgar Jamneshan.
-
APM 7218 and Zoom
Email djekel@ucsd.edu for Zoom info
****************************
Center for Computational Mathematics Seminar
Katy Craig
UC Santa Barbara
A Blob Method for Diffusion and Applications to Sampling and Two Layer Neural Networks
Abstract:
Given a desired target distribution and an initial guess of that distribution, composed of finitely many samples, what is the best way to evolve the locations of the samples so that they accurately represent the desired distribution? A classical solution to this problem is to allow the samples to evolve according to Langevin dynamics, a stochastic particle method for the Fokker-Planck equation. In today’s talk, I will contrast this classical approach with a deterministic particle method corresponding to the porous medium equation. This method corresponds exactly to the mean-field dynamics of training a two layer neural network with a radial basis function activation. We prove that, as the number of samples increases and the variance of the radial basis function goes to zero, the particle method converges to a bounded entropy solution of the porous medium equation. As a consequence, we obtain both a novel method for sampling probability distributions as well as insight into the training dynamics of two layer neural networks in the mean field regime.
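As a rough illustration of the classical baseline mentioned in the abstract (my own sketch, not the speaker's blob method): overdamped Langevin dynamics $dX = -\nabla V(X)\,dt + \sqrt{2}\,dW$ evolves samples toward the density proportional to $e^{-V}$. With an assumed quadratic potential $V(x) = x^2/2$, the target is the standard Gaussian.

```python
import numpy as np

rng = np.random.default_rng(0)

def langevin_sample(n_particles=2000, dt=0.01, n_steps=2000):
    """Euler-Maruyama discretization of overdamped Langevin dynamics
    for V(x) = x^2/2, so the stationary law is the standard Gaussian."""
    x = rng.normal(3.0, 0.1, size=n_particles)  # deliberately poor initial guess
    for _ in range(n_steps):
        # drift -V'(x) dt plus Brownian increment sqrt(2 dt) * N(0, 1)
        x += -x * dt + np.sqrt(2 * dt) * rng.normal(size=n_particles)
    return x

samples = langevin_sample()
print(samples.mean(), samples.var())  # approximately 0 and 1
```

The blob method discussed in the talk replaces the stochastic term with a deterministic, regularized diffusion velocity; the stochastic version above is only the point of comparison.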
-
AP&M 2402
Zoom ID 954 6624 3503
****************************
Math 292 - Topology Seminar
Dana Hunter
University of Oregon
The Curtis-Wellington spectral sequence through cohomology
Abstract:
In this talk, we will discuss an unstable approach to studying stable homotopy groups as pioneered by Curtis and Wellington. Using the Barratt-Priddy-Quillen theorem, we can identify the (co)homology of $BS_\infty$ with the (co)homology of the base point component of the loop space which represents stable homotopy. Using cohomology instead of homology to exploit the nice Hopf ring presentation of Giusti, Salvatore, and Sinha for the cohomology of symmetric groups, we find a width filtration, whose subquotients are simple quotients of Dickson algebras, which thus give a new filtration of stable homotopy. We make initial calculations and determine towers in the resulting width spectral sequence. We also make calculations related to the image of J and conjecture that it is captured exactly by the lowest filtration in the width spectral sequence.
-
https://ucsd.zoom.us/j/99777474063
Password: topology
****************************
Math 292 - Topology Seminar (student seminar series on motivic homotopy theory)
Shangjie Zhang
UCSD
Stable $A^1$ homotopy theory of $S^1$ spectra
-
https://ucsd.zoom.us/j/99777474063
Password: topology
****************************
Math 278C - Optimization and Data Science
Zheng Zhang
UCSB
The Interplay of Compressed Training and Uncertainty-Aware Learning
Abstract:
Deep neural networks have been widely used in massive engineering domains, but the training and deployment of neural networks are subject to many fundamental challenges. In the training phase, the large-scale optimization often consumes a huge amount of computing and energy resources. In practical deployment, we often need the capability of uncertainty quantification to ensure the safe operations in an uncertain environment. To address the first challenge, we need compressed training, but it is hard to determine the compression ratio automatically in the training phase. To address the second challenge, we often use Bayesian learning models, but the resulting uncertainty-aware model often leads to massive model copies which cause huge memory and computing overhead.
In this talk, we show that the interplay of compressed training and Bayesian learning can provide more sustainable neural network models. Firstly, we investigate end-to-end tensor compressed training. This approach can offer orders-of-magnitude parameter reduction in the training phase, but it is hard to determine the tensor rank and model complexity automatically. We show that an efficient Bayesian formulation and solver can be developed to address this major challenge, enabling high-accuracy end-to-end compressed training as well as energy-efficient on-device training. Secondly, we investigate MCMC-type Bayesian training. Here the main challenge is how to use a small number of model copies to accurately represent model uncertainties. We provide a provable online sample thinning method based on kernelized Stein discrepancy. This method can reduce the model copies on the fly, and offers orders-of-magnitude memory and latency savings in inference.
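To make the "orders-of-magnitude parameter reduction" concrete, here is a back-of-the-envelope count (my own illustrative numbers, not from the talk): a dense 1024×1024 weight matrix, reshaped as a 32×32×32×32 tensor and stored in tensor-train (TT) format with all internal TT-ranks equal to 8, needs only a few thousand parameters. Choosing that rank automatically is exactly the difficulty the Bayesian formulation addresses.

```python
def tt_param_count(mode_sizes, rank):
    """Parameters in a tensor-train decomposition with uniform internal
    rank: core k has shape (r_{k-1}, n_k, r_k), boundary ranks are 1."""
    ranks = [1] + [rank] * (len(mode_sizes) - 1) + [1]
    return sum(ranks[k] * n * ranks[k + 1] for k, n in enumerate(mode_sizes))

dense = 1024 * 1024                       # 1,048,576 parameters in the full matrix
tt = tt_param_count([32, 32, 32, 32], 8)  # 4,608 parameters in TT format
print(dense, tt, round(dense / tt))       # compression factor ~228
```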
Speaker’s Bio:
Dr. Zheng Zhang is an Assistant Professor of Electrical and Computer Engineering at the University of California, Santa Barbara. He received his PhD degree in Electrical Engineering and Computer Science from MIT in 2015. His research is focused on uncertainty quantification and tensor computation, with applications to multi-domain design automation and sustainable and trustworthy AI systems. He received the ACM SIGDA Outstanding New Faculty Award, the IEEE CEDA Early Career Award, the NSF Early Career Award, and three best journal paper awards from IEEE Transactions in the EDA research field. He is the recipient of the ACM SIGDA Outstanding Dissertation Award in 2016 and the MIT Microsystems Technology Lab PhD Dissertation Award in 2015.
-
https://ucsd.zoom.us/j/93696624146
Meeting ID: 936 9662 4146
Password: OPT2022SP
****************************
Advancement to Candidacy
Tai-Hsuan Chung
UCSD
Semistable Reduction in Positive Characteristic
-
APM 7218
****************************
Math 211B - Group Actions Seminar
Seonhee Lim
Seoul National University
Complex continued fractions and central limit theorem for rational trajectories
Abstract:
In this talk, we will first introduce the complex continued fraction maps associated with some imaginary quadratic fields ($d=1, 2, 3, 7, 11$) and their dynamical properties. Baladi-Vallee analyzed (real) Euclidean algorithms and proved the central limit theorem for rational trajectories and a wide class of cost functions measuring algorithmic complexity. They used spectral properties of an appropriate bivariate transfer operator and a generating function for certain Dirichlet series whose coefficients are essentially the moment generating function of the cost on the set of rationals. We extend the work of Baladi-Vallee for complex continued fraction maps mentioned above. (This is joint work with Dohyeong Kim and Jungwon Lee.)
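For orientation, the cost functions in the Baladi-Vallee framework include quantities as simple as the number of division steps of the (real) Euclidean algorithm; their theorem says such costs, over rationals of bounded denominator, obey a central limit theorem. A minimal sketch of that cost (real case only, not the complex continued fraction maps of the talk):

```python
def euclid_steps(p, q):
    """Number of division steps the Euclidean algorithm performs on p/q --
    one of the algorithmic costs covered by the Baladi-Vallee central
    limit theorem in the real case."""
    steps = 0
    while q:
        p, q = q, p % q
        steps += 1
    return steps

print(euclid_steps(13, 8))  # consecutive Fibonacci numbers, the worst case: 5 steps

# Empirical mean of the cost over rationals p/q with fixed denominator;
# it grows logarithmically in q.
import statistics
costs = [euclid_steps(p, 10_000) for p in range(1, 10_000)]
print(statistics.mean(costs))
```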
-
AP&M 6402
Zoom ID 967 4109 3409
Email an organizer for the password
****************************
Math 209 - Number Theory Seminar
Anthony Kling
U. Arizona
Comparison of Integral Structures on the Space of Modular Forms of Full Level $N$
Abstract:
Let $N\geq3$ and $r\geq1$ be integers and $p\geq2$ be a prime such that $p\nmid N$. One can consider two different integral structures on the space of modular forms over $\mathbb{Q}$, one coming from arithmetic via $q$-expansions, the other coming from geometry via integral models of modular curves. Both structures are stable under the Hecke operators; furthermore, their quotient is finite torsion. Our goal is to investigate the exponent of the annihilator of the quotient. We will apply results due to Brian Conrad to the situation of modular forms of even weight and level $\Gamma(Np^{r})$ over $\mathbb{Q}_{p}(\zeta_{Np^{r}})$.
-
Pre-talk at 1:20 PM
APM 6402 and Zoom;
See https://www.math.ucsd.edu/~nts/
****************************
Postdoc Seminar
Caroline Moosmueller
UCSD
Optimal transport in machine learning
Abstract:
In this talk, I will give an introduction to optimal transport, which has evolved as one of the major frameworks to meaningfully compare distributional data. The focus will mostly be on machine learning, and how optimal transport can be used efficiently for clustering and supervised learning tasks. Applications of interest include image classification as well as medical data such as gene expression profiles.
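As a toy illustration of comparing distributional data (my own example, not from the talk): in one dimension, the Wasserstein-1 distance between two empirical samples reduces to matching sorted order statistics, and SciPy computes it directly.

```python
from scipy.stats import wasserstein_distance

# W1 between two empirical distributions on the line: sort both
# samples and average the distances between matched order statistics.
a = [0.0, 1.0]
b = [2.0, 3.0]
d = wasserstein_distance(a, b)
print(d)  # 2.0: the matching 0 -> 2, 1 -> 3 costs 2 on average
```

Distances like this one underlie the clustering and supervised learning applications in the talk, where the inputs being compared are themselves distributions (e.g. images or gene expression profiles).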
-
APM B402A
****************************