Jianhao Ma
About me
I'm currently a fifth-year Ph.D. student at the University of Michigan, where I am fortunate to be advised by Professor Salar Fattahi. My research focuses on developing efficient and robust machine learning models. I welcome future collaborations; please feel free to contact me via email!
I am on the 2024-2025 academic job market.
Publications
Robust Sparse Mean Estimation via Incremental Learning [arxiv]
ICLR Workshop on Bridging the Gap Between Practice and Theory in Deep Learning 2024
Jianhao Ma, Rui Ray Chen, Yinghui He, Salar Fattahi, Wei Hu
Global Convergence of Sub-gradient Method for Robust Matrix Recovery: Small Initialization, Noisy Measurements, and Over-parameterization [paper][arxiv]
Journal of Machine Learning Research (JMLR) 2023
Jianhao Ma, Salar Fattahi
Behind the Scenes of Gradient Descent: A Trajectory Analysis via Basis Function Decomposition [paper][arxiv]
International Conference on Learning Representations (ICLR) 2023
Jianhao Ma, Lingjun Guo, Salar Fattahi
Blessing of Nonconvexity in Deep Linear Models: Depth Flattens the Optimization Landscape Around the True Solution [paper][arxiv]
Advances in Neural Information Processing Systems (NeurIPS) 2022 (Spotlight)
Katta Murty Prize for Best Research Paper on Optimization
Jianhao Ma, Salar Fattahi
Towards Understanding Generalization via Decomposing Excess Risk Dynamics [paper][arxiv]
International Conference on Learning Representations (ICLR) 2022
Jiaye Teng*, Jianhao Ma*, Yang Yuan
Sign-RIP: A Robust Restricted Isometry Property for Low-rank Matrix Recovery [paper][arxiv]
NeurIPS Workshop on Optimization for Machine Learning 2021
Jianhao Ma, Salar Fattahi
News
October 2024: I will attend the INFORMS Annual Meeting.
March 2024: I will attend the INFORMS Optimization Society Conference.
March 2024: Our paper has been accepted at the ICLR BGPT workshop.
March 2024: Thrilled to receive the Rackham Predoctoral Fellowship.
February 2024: I will intern at FAIR (Meta) with Dr. Lin Xiao this summer.
February 2024: New paper on the global convergence of gradient descent for unregularized matrix completion.
Services
Reviewer for IEEE TIT, IEEE TSP, ICML, NeurIPS, ICLR, and AISTATS.
Quote
Always consider a problem under the minimum structure in which it makes sense. – Gustave Choquet