Research
My research is broadly in the mathematics of data science. I study the properties and behaviour of random mathematical structures and algorithms, and I aim to use these insights to develop tools and models for working with and understanding large-scale, complex data.
My research interests lie mainly at the intersection of probability, statistics, and optimisation. More specifically, they include high-dimensional probability, random matrices, randomised algorithms, numerical linear algebra, and machine learning.
Papers
- Discrete error dynamics of mini-batch gradient descent for least squares regression
with Rishi Sonthalia and Elizaveta Rebrova
Preprint, 2024 [arXiv]
- On Regularization via Early Stopping for Least Squares Regression
with Rishi Sonthalia and Elizaveta Rebrova
Preprint, 2024 [arXiv]
- On Approximating the Potts Model with Contracting Glauber Dynamics
with Roxanne He
Preprint, 2024 [arXiv]
- A subspace constrained randomized Kaczmarz method for structure or external knowledge exploitation
with Elizaveta Rebrova
Linear Algebra and its Applications, vol. 698, pp. 220–260 (2024) [journal] [arXiv]
Miscellaneous
- Markov chains, mixing times, and cutoff
Undergraduate honours thesis [pdf]
Presentations
- Conference on the Mathematical Theory of Deep Neural Networks (DeepMath 2024), University of Pennsylvania, Philadelphia: “Error dynamics of mini-batch gradient descent with random reshuffling for least squares”, November 2024 [poster]
- CUNY Graduate Center Harmonic Analysis & PDE Seminar: “A subspace constrained randomized Kaczmarz method”, November 2024
- SIAM Conference on Mathematics of Data Science (MDS24), Atlanta: “A Subspace Constrained Randomized Kaczmarz Method for Structure or External Knowledge Exploitation”, October 2024 (poster)
- NSF CompMath PI Meeting, University of Washington, Seattle: “A Subspace Constrained Randomized Kaczmarz Method for Structure or External Knowledge Exploitation”, July 2024 [poster]