Biography

I am a researcher and writer working at the intersection of machine learning theory, differential geometry, and statistical physics. My work focuses on building rigorous mathematical frameworks for understanding generalization in deep neural networks.

I received my Ph.D. from [University Name] in 2019, advised by [Professor Name], where my dissertation examined the geometric structure of neural tangent kernels and their implications for learning dynamics. Prior to that, I completed undergraduate studies in Mathematics and Physics.

Currently, I am a research scientist at [Institution], where I lead a small group studying implicit regularization, loss landscape geometry, and the connections between deep learning and classical statistical mechanics.

Research Interests

Machine learning theory, differential geometry, statistical physics, information theory, optimization, neural tangent kernels, generalization theory, and Bayesian inference.

Selected Publications

Geometric Perspectives on Neural Network Generalization

A. Researcher, B. Collaborator, C. Coauthor

Journal of Machine Learning Research, 2024, Vol. 25, pp. 1–42

Information-Theoretic Bounds for Deep Learning with Applications to Implicit Regularization

A. Researcher, D. Author

Advances in Neural Information Processing Systems (NeurIPS), 2023, pp. 8432–8445

On the Intrinsic Dimensionality of Parameter Spaces in Overparameterized Models

A. Researcher, E. Coauthor, F. Author

International Conference on Machine Learning (ICML), 2022, pp. 17841–17855

Neural Tangent Kernels and the Geometry of Infinite Width Networks

A. Researcher

Ph.D. Thesis, [University Name], 2019

Academic Service

Reviewer for NeurIPS, ICML, ICLR, JMLR, and AISTATS. Area chair for NeurIPS 2024. Organizer of a weekly seminar series on the mathematical foundations of machine learning.