Department of Mathematics,
University of California San Diego

****************************

Math 243 - Functional Analysis Seminar

Rolando de Santiago
UCLA

$L^2$ Betti numbers and s-malleable deformations

Abstract:

A major theme in the study of von Neumann algebras is to investigate which structural aspects of a group extend to its von Neumann algebra. I will present recent progress made by Dan Hoff, Ben Hayes, Thomas Sinclair, and myself in the case where the group has positive first $L^2$ Betti number. I will also expand on our analysis of s-malleable deformations and their relation to cocycles, which forms the foundation of our work.

-

AP&M 6402

****************************

Food For Thought Seminar

Eric Lybrand
UCSD

Indoor Localization

Abstract:

Ever tried using Google Maps in a parking garage? It sucks. That's because current navigation systems rely heavily on GPS satellites, which can't "see" you when you're inside a building or multiple stories below ground. I'll be talking about one way of trying to localize an internet-connected device using wireless signal strength. If the engineering aspect of this problem didn't scare you off already, then look forward to a brief discussion of how we can use wavelets to address this question!
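Though the talk takes a wavelet approach, the basic signal-strength idea can be sketched with a generic model. Below is a minimal illustration (my own, not from the talk): convert received signal strength (RSSI) readings to distance estimates via the standard log-distance path-loss model, then trilaterate by Gauss-Newton least squares. All positions and model parameters are made up for the example.

```python
import numpy as np

# Hypothetical setup: path-loss model RSSI(d) = P0 - 10*n*log10(d/d0),
# with assumed (not measured) parameters.
P0, n_exp, d0 = -40.0, 2.0, 1.0

def rssi_to_distance(rssi):
    """Invert the log-distance path-loss model to get a distance estimate."""
    return d0 * 10 ** ((P0 - rssi) / (10 * n_exp))

def trilaterate(anchors, dists, iters=20):
    """Gauss-Newton solve for the position x minimizing sum_i (|x - a_i| - d_i)^2."""
    x = np.mean(anchors, axis=0)                 # start at the anchor centroid
    for _ in range(iters):
        diffs = x - anchors                      # shape (m, 2)
        r = np.linalg.norm(diffs, axis=1)        # current distances to anchors
        J = diffs / r[:, None]                   # Jacobian of |x - a_i| w.r.t. x
        resid = r - dists
        step, *_ = np.linalg.lstsq(J, resid, rcond=None)
        x = x - step
    return x

# Three access points at known positions; the device sits at (3, 4).
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_pos = np.array([3.0, 4.0])
true_d = np.linalg.norm(anchors - true_pos, axis=1)
rssi = P0 - 10 * n_exp * np.log10(true_d / d0)   # noiseless measurements

est = trilaterate(anchors, rssi_to_distance(rssi))
print(np.round(est, 3))                          # recovers (3, 4)
```

With noisy RSSI the same least-squares machinery applies; the noise just degrades the estimate, which is one reason more structure (e.g., wavelets) helps.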

-

AP&M 7321

****************************

Math 258 - Differential Geometry

Amir Mohammadi
UCSD

Geodesic planes in hyperbolic 3-manifolds

Abstract:

We will discuss the possible closures of geodesic planes in a hyperbolic 3-manifold $M$. When $M$ has finite volume, Shah and Ratner independently showed that a strong rigidity phenomenon holds; in particular, such closures are always properly immersed submanifolds of $M$ with finite area. We show that a similar rigidity phenomenon holds for a class of infinite-volume manifolds. This is based on joint work with C. McMullen and H. Oh.

-

AP&M 5829

****************************

Math 278C - Optimization and Data Science Seminar

Jane Ye
University of Victoria

Calmness and its applications to linear convergence of some first order methods for nonsmooth optimization problems

Abstract:

Calmness/metric subregularity for set-valued maps is a powerful stability concept in variational analysis. In this talk we first discuss the concept of calmness/metric subregularity and sufficient conditions for verifying it. We then introduce a perturbation technique for analyzing the linear convergence of various first-order algorithms for a class of nonsmooth optimization problems that minimize the sum of a smooth function and a nonsmooth function, using the proximal gradient method as our running example. This perturbation technique enables us to provide concrete sufficient conditions for checking linear convergence for very general problems, where nonconvexity may appear in each component of the objective function, and it leads to improved linear convergence results even in the convex case.
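As a concrete instance of the running example in the abstract, here is a minimal sketch (mine, not from the talk) of the proximal gradient method on a lasso problem, $\min_x \frac{1}{2}\|Ax-b\|^2 + \lambda\|x\|_1$, where the smooth part is the least-squares term and the proximal map of the $\ell_1$ term is soft-thresholding.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal map of t*||.||_1: shrink each entry toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, iters=500):
    """ISTA: gradient step on the smooth part, prox step on the nonsmooth part."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1/L, L = Lipschitz const of grad
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                 # gradient of 0.5*||Ax - b||^2
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]                    # sparse ground truth
b = A @ x_true                                   # noiseless data
x_hat = proximal_gradient(A, b, lam=0.1)
print(np.round(np.linalg.norm(x_hat - x_true), 3))   # small recovery error
```

On this well-conditioned instance the iterates converge linearly, which is exactly the kind of behavior the calmness conditions in the talk are designed to certify in far greater generality.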

-

AP&M B412

****************************

Math 243 - Functional Analysis Seminar

Ian Charlesworth
UC Berkeley

Free Stein Information

Abstract:

I will speak on recent joint work with Brent Nelson, in which we introduce a free probabilistic regularity quantity we call the free Stein information. The free Stein information measures, in a certain sense, how close a system of variables is to admitting conjugate variables in the sense of Voiculescu. I will discuss some properties of the free Stein information and how it relates to other common regularity conditions.

-

AP&M 6402

****************************

Math 248 - Analysis Seminar

Zaher Hani
University of Michigan

On the kinetic description of the long-time behavior of dispersive PDE

Abstract:

Wave turbulence theory claims that at very long timescales, and in appropriate limiting regimes, the effective behavior of a nonlinear dispersive PDE on a large domain can be described by a kinetic equation called the "wave kinetic equation". This is the wave analog of Boltzmann's equation for particle collisions. We shall consider the nonlinear Schrödinger equation on a large box with periodic boundary conditions, and explore some of its effective long-time behaviors at time scales that are shorter than the conjectured kinetic time scale, but still long enough to exhibit the onset of kinetic behavior. (This is joint work with Tristan Buckmaster, Pierre Germain, and Jalal Shatah.)
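For readers who want to experiment numerically, below is a standard split-step Fourier discretization of the cubic NLS $i u_t = -u_{xx} + |u|^2 u$ on a periodic box (a generic numerical sketch, not part of the talk). The scheme alternates the exact flows of the linear and nonlinear parts, and conserves the discrete $L^2$ mass exactly.

```python
import numpy as np

# Periodic box [0, 2*pi), 128 Fourier modes, small time step.
L_box, N, dt, steps = 2 * np.pi, 128, 1e-3, 1000
x = np.linspace(0, L_box, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L_box / N)   # integer wavenumbers here

u = np.exp(1j * x) + 0.1 * np.exp(2j * x)        # small two-mode initial data

for _ in range(steps):
    # Exact linear flow u_t = i u_xx: multiply by a phase in Fourier space.
    u = np.fft.ifft(np.exp(-1j * k**2 * dt) * np.fft.fft(u))
    # Exact nonlinear flow u_t = -i|u|^2 u: |u| is constant along it.
    u = u * np.exp(-1j * np.abs(u)**2 * dt)

mass = np.mean(np.abs(u)**2)                     # conserved L^2 density
print(round(mass, 6))                            # stays at its initial value 1.01
```

Tracking how energy spreads among the Fourier modes of such a simulation over long times is, very roughly, the discrete analog of the kinetic behavior the talk describes.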

-

AP&M 7321

****************************

Math 209 - Number Theory

Nolan Wallach
UCSD

The meromorphic continuation of smooth Eisenstein series

Abstract:

In his monumental book 'On the Functional Equations Satisfied by Eisenstein Series', Langlands proved that the K-finite Eisenstein series, initially defined, convergent, and holomorphic in an appropriate open tube of the parameter space, can be meromorphically continued to the entire parameter space. The K-finiteness was critical to his proof of the theorem. In this lecture I will show how to use Langlands' theorem to prove the meromorphic continuation for smooth Eisenstein series. These results are valid in the full context of Langlands' theorem, but I will only talk about arithmetic groups, for which the definitions are easier. (Indeed, Langlands' definition of the groups that would be studied was only completed at the end of the induction in his notorious Chapter 7.)

-

AP&M 7321

****************************

Math 295 - Mathematics Colloquium

Roman Vershynin
UC Irvine

Mathematics of deep learning

Abstract:

Deep learning is a rapidly developing area of machine learning, which uses artificial neural networks to perform learning tasks. Although the mathematical description of neural networks is simple, a theoretical explanation of the spectacular performance of deep learning remains elusive. Even the most basic questions remain open. For example, how many different functions can a neural network compute? Jointly with Pierre Baldi (UCI CS), we discovered a general capacity formula for all fully connected Boolean networks. The formula predicts, counterintuitively, that shallow networks have greater capacity than deep ones. So the mystery remains.
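The counting question can be made concrete in the smallest case by brute force. The sketch below is my own illustration, not the capacity formula from the talk: it enumerates the Boolean functions of two inputs computable by a single threshold neuron $\operatorname{sign}(w \cdot x + b)$. Of the 16 Boolean functions on $\{0,1\}^2$, exactly 14 are computable (all but XOR and XNOR).

```python
import itertools
import numpy as np

# All four inputs of a 2-variable Boolean function, in a fixed order.
inputs = list(itertools.product([0, 1], repeat=2))

# A coarse grid of weights and biases; fine enough to realize every
# linearly separable function of two Boolean variables.
grid = np.arange(-2.0, 2.01, 0.5)

tables = set()
for w1, w2, b in itertools.product(grid, repeat=3):
    # Truth table of the neuron x -> [w1*x1 + w2*x2 + b > 0].
    table = tuple(int(w1 * x1 + w2 * x2 + b > 0) for x1, x2 in inputs)
    tables.add(table)

print(len(tables))   # 14: every Boolean function except XOR and XNOR
```

Counting the functions computable by *networks* of such neurons, as a function of the layer widths, is exactly the capacity problem the talk addresses.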

-

AP&M 6402

****************************

Graduate Student Combinatorics Seminar

Sam Spiro
UCSD

Roth's Theorem

Abstract:

Szemerédi's theorem states that every set of integers $A$ with positive density contains an arithmetic progression of length $k$ for any $k \ge 3$. The case $k=3$ was originally proven by Roth. In this talk we go through the proof of Roth's theorem, as well as other related ideas such as Salem sets and Gowers norms.
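The objects in the theorem are easy to experiment with. The sketch below (my own illustration, not part of the talk) tests a finite set for 3-term arithmetic progressions and greedily builds a 3-AP-free set, recovering the Stanley sequence of integers whose base-3 expansions use only the digits 0 and 1.

```python
import itertools

def has_3ap(s):
    """True if the set s contains a 3-term arithmetic progression a < b < c."""
    return any(2 * b == a + c
               for a, b, c in itertools.combinations(sorted(set(s)), 3))

def greedy_3ap_free(n):
    """Greedily keep each m in 0..n-1 that forms no 3-term AP with earlier picks.

    Since m exceeds everything picked so far, the only AP it could complete
    is (a, b, m) with m = 2b - a for some picked a < b.
    """
    picked = []
    for m in range(n):
        if all(2 * b - a != m for a, b in itertools.combinations(picked, 2)):
            picked.append(m)
    return picked

print(has_3ap({3, 5, 7}))        # True: 3, 5, 7 is an AP
print(has_3ap({1, 2, 4, 8}))     # False: these powers of 2 are 3-AP-free
print(greedy_3ap_free(14))       # [0, 1, 3, 4, 9, 10, 12, 13]
```

The greedy set has size about $n^{\log_3 2}$, far below the Behrend-type constructions, and Roth's theorem says any set of positive density must eventually fail this test.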

-

AP&M 5402

****************************