Jianhao Ma

Jianhao Ma
Ph.D. candidate, Department of Industrial and Operations Engineering
University of Michigan, Ann Arbor, United States

Email: jianhao [at] umich [dot] edu
[Google Scholar] [Curriculum Vitae]

About me

I am currently a third-year Ph.D. candidate at the University of Michigan, where I am fortunate to be advised by Professor Salar Fattahi. My research focuses on understanding the practical success of gradient-based algorithms through trajectory analysis. I welcome future collaborations; please feel free to contact me via email!

Research Interests

  • General optimization and generalization in modern machine learning.

  • Matrix factorization and tensor decomposition.

  • Empirical processes and their applications.

Preprints

  • On the Optimization Landscape of Burer-Monteiro Factorization: When do Global Solutions Correspond to Ground Truth? [arxiv]
    Jianhao Ma, Salar Fattahi

Publications

  • Global Convergence of Sub-gradient Method for Robust Matrix Recovery: Small Initialization, Noisy Measurements, and Over-parameterization [arxiv]
    Journal of Machine Learning Research (JMLR) 2023
    Jianhao Ma, Salar Fattahi

  • Behind the Scenes of Gradient Descent: A Trajectory Analysis via Basis Function Decomposition [arxiv]
    International Conference on Learning Representations (ICLR) 2023
    Jianhao Ma, Lingjun Guo, Salar Fattahi

  • Blessing of Nonconvexity in Deep Linear Models: Depth Flattens the Optimization Landscape Around the True Solution [arxiv]
    Advances in Neural Information Processing Systems (NeurIPS) 2022 (Spotlight)
    Jianhao Ma, Salar Fattahi

  • Towards Understanding Generalization via Decomposing Excess Risk Dynamics [paper] [arxiv]
    International Conference on Learning Representations (ICLR) 2022
    Jiaye Teng*, Jianhao Ma*, Yang Yuan

  • Sign-RIP: A Robust Restricted Isometry Property for Low-rank Matrix Recovery [paper] [arxiv]
    NeurIPS Workshop on Optimization for Machine Learning 2021
    Jianhao Ma, Salar Fattahi

News

Conference Services

Reviewer for ICML, NeurIPS, ICLR, and AISTATS.

Quote

Always consider a problem under the minimum structure in which it makes sense. —— Gustave Choquet