Department of Mathematics,
University of California San Diego

****************************

Math 278B: Mathematics of Information, Data, and Signals

Chris Camaño

Caltech

Randomized Tensor Networks For Product Structured Data

Abstract:

In recent years, tensor networks have emerged as a powerful low-rank approximation framework for addressing exponentially large data science problems without requiring exponential computational resources. In this talk, we demonstrate how tensor networks, when combined with accelerations from randomized numerical linear algebra (rNLA), enable the efficient representation and manipulation of large-scale, complex datasets originating from quantum physics, high-dimensional function approximation, and neural network compression. We will start by describing how to construct a tensor network directly from input data. Building on this foundation, we then describe a new randomized algorithm, Successive Randomized Compression (SRC), that asymptotically accelerates the tensor network analog of matrix-vector multiplication using the randomized singular value decomposition. As a demonstration, we present examples showing how tensor-network-based simulations of quantum dynamics in 2^100 dimensions can be performed on a personal laptop.
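The randomized singular value decomposition mentioned in the abstract is the standard sketch-then-factor method of randomized numerical linear algebra. As a rough illustration only (this is not the speaker's SRC algorithm, just the underlying rSVD primitive, sketched under the usual Gaussian-sketch assumptions):

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, seed=0):
    """Approximate rank-`rank` SVD of A via random range sketching.

    Illustrative sketch of the randomized SVD primitive; parameter
    names and defaults here are assumptions, not from the talk.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Sketch the range of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, rank + oversample))
    # Orthonormal basis for the sketched range of A.
    Q, _ = np.linalg.qr(A @ Omega)
    # Project A onto that subspace and take a small dense SVD.
    B = Q.T @ A
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_small
    return U[:, :rank], s[:rank], Vt[:rank, :]

# Usage: compress a matrix that is exactly rank 20.
rng = np.random.default_rng(1)
A = rng.standard_normal((500, 20)) @ rng.standard_normal((20, 500))
U, s, Vt = randomized_svd(A, rank=20)
err = np.linalg.norm(A - U @ (s[:, None] * Vt)) / np.linalg.norm(A)
print(f"relative error: {err:.2e}")
```

Applied core-by-core to a tensor network, the same sketching idea avoids ever forming the exponentially large object being compressed, which is the source of the asymptotic speedup described in the abstract.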

April 4, 2025

11:00 AM

APM 2402

Research Areas

Mathematics of Information, Data, and Signals

****************************