Michael Crawshaw

Email | CV | Google Scholar | GitHub | Twitter | LinkedIn


About Me

I am a final-year Ph.D. student in the Department of Computer Science at George Mason University, advised by Professor Mingrui Liu. My research is in the theory of optimization for machine learning. During my Ph.D., I have worked on distributed optimization and federated learning, optimization under non-standard assumptions such as relaxed smoothness, and optimization with large step sizes (the Edge of Stability regime). Before studying at George Mason, I received a B.S. in mathematics and computer science from Ohio State University.

I spent Summer 2025 in New York City as an intern at the Flatiron Institute's Center for Computational Mathematics, working with Robert Gower. We formalized many variants of the Muon optimizer as instances of non-Euclidean gradient descent, and used this formalization to significantly improve robustness to learning-rate tuning through model truncation (preprint here).
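As a rough sketch of the non-Euclidean gradient descent viewpoint (my own illustrative notation, not necessarily the formulation in the preprint): each step minimizes the linearized loss over a norm ball, and the choice of norm determines the optimizer.

```latex
% Steepest descent under a norm \|\cdot\|: the update for weight matrix W_t
% with gradient (or momentum) G_t solves
\Delta W_t \;=\; \arg\min_{\Delta W}\; \langle G_t, \Delta W \rangle
\quad \text{s.t.} \quad \|\Delta W\| \le \eta .
% Under the spectral norm, with SVD G_t = U \Sigma V^\top, the minimizer is
\Delta W_t \;=\; -\eta\, U V^\top ,
% i.e., an orthogonalized update of the kind Muon computes; the Euclidean
% (Frobenius) norm instead yields normalized gradient descent,
% \Delta W_t = -\eta\, G_t / \|G_t\|_F .
```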

Publications

Service

Teaching

Misc