# Speakers for DoMSS Spring 2026

## 1/26 (in person): Alex Alberts, Purdue

<!-- photo here -->
![Screenshot 2026-01-08 at 10.08.12 AM](https://hackmd.io/_uploads/r1OYVDaNbl.png)

**Career stage:** Postdoc
**Research area:** Machine Learning, Stochastics
**Website:** https://www.linkedin.com/in/alex-alberts-930a4b262/

**Talk Title:** Path integral methods for Bayesian inference

**Abstract:** Inverse problems in infinite dimensions are encountered throughout the scientific disciplines. These problems require reconstructing continuous fields from incomplete, noisy measurements, which often leads to ill-posedness. Almost universally, solutions to these problems are constructed under, or can be viewed as a limiting case of, a Bayesian framework. In the infinite-dimensional setting, however, the theory is largely restricted to the Gaussian case because of technical difficulties, most notably that the Lebesgue measure does not exist on infinite-dimensional spaces. As a result, we often resort to Gaussian measures so that the prior and the resulting posterior remain well defined. As an alternative, we explore the use of the Feynman path integral formalism for Bayesian inference. Posing inverse problems with path integrals yields expressions that resemble the finite-dimensional setting, which allows intuitive techniques to be derived. In this talk, we discuss the theory, numerical methods, and some real-world applications of this viewpoint.
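As background for the finite- versus infinite-dimensional contrast in the abstract above (this sketch is not taken from the talk), the finite-dimensional linear-Gaussian case is the one setting where the Bayesian posterior is available in closed form; all operators and numbers below are illustrative assumptions.

```python
import numpy as np

# Linear forward model y = A x + noise, Gaussian prior x ~ N(0, S),
# Gaussian noise ~ N(0, G). In this conjugate setting the posterior
# is Gaussian with
#   mean = S A^T (A S A^T + G)^{-1} y
#   cov  = S - S A^T (A S A^T + G)^{-1} A S
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))       # forward operator: 5 observations, 3 unknowns
S = np.eye(3)                         # prior covariance (illustrative)
G = 0.1 * np.eye(5)                   # noise covariance (illustrative)
x_true = np.array([1.0, -2.0, 0.5])
y = A @ x_true + rng.multivariate_normal(np.zeros(5), G)

K = S @ A.T @ np.linalg.inv(A @ S @ A.T + G)   # gain matrix
post_mean = K @ y
post_cov = S - K @ A @ S
```

The tractability seen here, where a Gaussian prior yields a Gaussian posterior, is what Gaussian measures preserve in infinite dimensions, and it is the restriction the path integral formalism in the talk aims to move beyond.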
---

## 2/2 (in person): Traian Pirvu, McMaster University

<!-- photo here -->
![Screenshot 2026-01-08 at 9.57.18 AM](https://hackmd.io/_uploads/HyoxGwTNbe.png)

**Career stage:** Associate Professor
**Research area:** Actuarial Science
**Website:** https://experts.mcmaster.ca/display/pirvut

**Talk Title:** Time Consistent Portfolio Management

**Abstract:** The Merton portfolio management problem is studied within a framework that incorporates stochastic volatility, a non-constant time discount rate, and power utility. This setting gives rise to time inconsistency, which is addressed by adopting subgame-perfect strategies. These strategies are characterized through an extended Hamilton–Jacobi–Bellman (HJB) equation, which is solved using a fixed-point iteration scheme. The solution proceeds in two stages: first, the utility-weighted discount rate is introduced and identified as the fixed point of a suitable operator; second, the value function is obtained by solving a linear parabolic partial differential equation. Numerical experiments illustrate the influence of the time-varying discount rate on subgame-perfect strategies and their outcomes.

---

## 2/9 (virtual): Wei Zhu, Georgia Tech

<!-- photo here -->
![Screenshot 2026-01-08 at 9.57.40 AM](https://hackmd.io/_uploads/rygMMvTVbg.png)

**Career stage:** Assistant Professor
**Research area:** Scientific Machine Learning
**Website:** https://sites.google.com/view/weizhumath/home

**Talk Title:** Structure-preserving generative models

**Abstract:** In this talk, I will discuss how intrinsic structures of probability distributions, such as (approximate) group symmetries, multimodality, and low dimensionality, can be systematically incorporated into generative models to improve data efficiency. In the first part, I will focus on generative adversarial networks (GANs) and explain how group symmetry can be embedded into their architecture.
A central theme will be a precise analysis of the resulting reduction in sample complexity, that is, the number of samples required to effectively learn the target distribution. Somewhat surprisingly, the quantitative gains are not always aligned with naive intuition. In the second part, I will turn to score-based diffusion models. I will present a framework in which structural information is integrated into the noising dynamics itself, so that the diffusion process reflects the multimodality, low dimensionality, and approximate symmetries inherent in the data. This modification leads to improved learning efficiency while preserving the flexibility and scalability of diffusion-based approaches. I will also briefly discuss recent extensions of this framework to latent-space formulations.

---

## 2/16 (in person): Yeonjong Shin, North Carolina State University

<!-- photo here -->
![Screenshot 2026-01-08 at 9.58.00 AM](https://hackmd.io/_uploads/HyV7MPa4Ze.png)

**Career stage:** Assistant Professor
**Research area:** Scientific Machine Learning
**Website:** https://sites.google.com/site/shinmathematics/

**Talk Title:** From Theory to Practice: Mathematical Approaches to Scientific Machine Learning

**Abstract:** Machine learning (ML) has achieved unprecedented empirical success in diverse applications. Its application to scientific and engineering problems has emerged as a new research field: Scientific Machine Learning (SciML). However, many ML techniques are highly complex and sophisticated, often requiring extensive trial-and-error experimentation and problem-specific tuning to be implemented effectively. This complexity frequently poses significant challenges for scientific research, including reproducibility and rigor. This talk explores mathematical approaches that offer more principled and reliable methodologies for SciML.
The first part will present recent efforts to advance the predictive power of physics-informed machine learning through robust training and optimization methods. These include an effective training method for multivariate neural networks, Active Neuron Least Squares (ANLS), and a two-step training method for deep operator networks. The second part concerns how to embed the first principles of physics into neural networks. I will present a general framework for designing NNs that obey the first and second laws of thermodynamics. The framework not only provides flexible ways of leveraging available physics information but also results in expressive NN architectures. I will also present an intriguing phenomenon that arises when this framework is applied to latent-space dynamics identification, where a correlation emerges between the entropy production rate in the latent space and the behavior of the full-state solution.

---

## 2/23 (in person): Nicole Yang, University of Tennessee Knoxville

<!-- photo here -->
![Screenshot 2026-01-08 at 9.58.49 AM](https://hackmd.io/_uploads/SyBLMvTEWg.png)

**Career stage:** Assistant Professor
**Research area:** Scientific Machine Learning
**Website:** https://nicoletyang.github.io

**Talk Title:** A control perspective towards continuous-time learning

**Abstract:** In this talk, we consider continuous-time learning schemes motivated by optimal control. We first discuss the inference of stochastic dynamical systems from only (nonlinear) noisy measurements. Building on a stochastic control formulation, we construct a generative model that maps the reference measure to the posterior measure through variational inference of a controlled diffusion process. This enables efficient generation of data-assimilated trajectories, with applications in system identification and time series prediction.
In the second part of the talk, we discuss a mixed-precision explicit ODE solver and a custom backpropagation scheme, and show their effectiveness in a range of learning tasks. Our scheme uses low-precision computations to evaluate the velocity, parameterized by the neural network, while stability is provided by a custom dynamic adjoint scaling and by accumulating the solution and gradients in higher precision.

---

## 3/2 (virtual): Bert de Jong, Lawrence Berkeley National Laboratory

<!-- photo here -->
![Screenshot 2026-01-08 at 9.59.44 AM](https://hackmd.io/_uploads/B1ntzPTEbl.png)

**Career stage:** Research Scientist
**Research area:** Quantum Computing, Quantum Systems
**Website:** https://profiles.lbl.gov/22078-bert-de-jong

---

## 3/16 (in person): Benjamin Zhang, University of North Carolina

<!-- photo here -->
![Screenshot 2026-01-08 at 10.00.49 AM](https://hackmd.io/_uploads/B16TfPTEbx.png)

**Career stage:** Postdoc *(incoming Assistant Professor, Department of Mathematics, Rutgers University)*
**Research area:** Sampling, Control, Machine Learning
**Website:** https://benjzhang.com

---

## 3/23 (in person): Vincent Martinez, CUNY Hunter College

<!-- photo here -->
![Screenshot 2026-01-08 at 10.02.13 AM](https://hackmd.io/_uploads/ry-X7DTV-x.png)

**Career stage:** Professor
**Research area:** Partial Differential Equations, Data Assimilation
**Website:** https://www.hunter.cuny.edu/people/vincent-martinez/

---

## 3/30 (virtual): Jiequn Han, Flatiron Institute

<!-- photo here -->
![Screenshot 2026-01-08 at 10.02.53 AM](https://hackmd.io/_uploads/H15rXD64be.png)

**Career stage:** Research Scientist
**Research area:** Machine Learning
**Website:** https://users.flatironinstitute.org/~jhan/

---

## 4/6 (in person): Lise-Marie Imbert-Gerard, University of Arizona

<!-- photo here -->
![Screenshot 2026-01-08 at 10.03.46 AM](https://hackmd.io/_uploads/S1CumPaEWg.png)

**Career stage:** Professor
**Research area:** Numerical Partial Differential Equations
**Website:** https://www.math.arizona.edu/people/lmig

---

## 4/13 (in person): Valeria Barra, San Diego State University

<!-- photo here -->
![Screenshot 2026-01-08 at 10.04.13 AM](https://hackmd.io/_uploads/HyFqmva4be.png)

**Career stage:** Assistant Professor
**Research area:** Numerical Partial Differential Equations
**Website:** https://valeriabarra.org

---

## 4/20 (in person): Amanda Howard, Pacific Northwest National Laboratory

<!-- photo here -->
![Screenshot 2026-01-08 at 10.04.48 AM](https://hackmd.io/_uploads/S1hnmDpNWl.png)

**Career stage:** Staff Scientist
**Research area:** Scientific Machine Learning
**Website:** https://amanda-howard.github.io

---

## 4/27 (virtual): Roel Van Beeumen, Berkeley Lab

<!-- photo here -->
![Screenshot 2026-02-18 at 12.45.32 PM](https://hackmd.io/_uploads/HydJD5Xdbg.png)

**Career stage:** Staff Scientist
**Research area:** Quantum Computing
**Website:** https://www.roelvanbeeumen.be/home.html

**Talk Title:** Quantum Computing Meets Numerical Linear Algebra