Research

My research lies at the intersection of optimization, statistics, and machine learning. My work focuses on solving fundamental challenges and application problems in data science.

Specifically, I analyze the conditioning of convex and nonconvex optimization problems (such as semidefinite programming and the Burer-Monteiro approach to matrix recovery) under statistical assumptions; design statistically and computationally efficient optimization algorithms for data science applications (such as phase retrieval and matrix completion); and study the interplay between model overparametrization, algorithmic regularization, and generalization in classical and modern machine learning models (such as mixture models and neural networks).
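
To make the description above concrete, here is a minimal, illustrative sketch (not code from any of the papers listed below) of the Burer-Monteiro approach to low-rank PSD matrix recovery: instead of optimizing over an n-by-n positive semidefinite matrix X, one optimizes over a thin factor U with X = UU^T. The measurement operator A, observations y, target rank r, and step size are all assumed for illustration.

```python
# A minimal sketch of the Burer-Monteiro factorization for low-rank PSD matrix
# recovery from linear measurements y_i = <A_i, X*>. All names (A, y, r, lr)
# are illustrative assumptions, not from any specific paper listed here.
import numpy as np

def recover_low_rank(A, y, r, steps=500, lr=1e-3, seed=0):
    """Gradient descent on f(U) = 1/(2m) * sum_i (<A_i, U U^T> - y_i)^2."""
    m, n, _ = A.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n, r)) / np.sqrt(n)      # random initialization
    for _ in range(steps):
        X = U @ U.T
        residual = np.einsum('ijk,jk->i', A, X) - y   # <A_i, U U^T> - y_i
        # gradient w.r.t. U: (1/m) * sum_i residual_i * (A_i + A_i^T) @ U
        grad = np.einsum('i,ijk->jk', residual, A + A.transpose(0, 2, 1)) @ U / m
        U -= lr * grad
    return U @ U.T
```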

You can check my Google Scholar profile for a full list of my work.

Semidefinite programming

Revisiting Spectral Bundle Methods: Primal-dual (Sub)linear Convergence Rates [arxiv]  [slides]

Ding and Grimmer         


A Strict Complementarity Approach to Error Bound and Sensitivity of Solution of Conic Programs [arxiv]

Ding and Udell      


An Optimal-Storage Approach to Semidefinite Programming using Approximate Complementarity [arxiv] [slides]

Ding, Yurtsever, Cevher, Tropp, and Udell


On the simplicity and conditioning of low rank semidefinite programs [arxiv]

Ding and Udell                                                                                                


Higher-Order Cone Programming [arxiv] [slides]

Ding and Lim                  

Statistical nonconvex optimization

Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence [arxiv]

Charisopoulos, Chen, Davis, Diaz, Ding, and Drusvyatskiy                   


Low-rank matrix recovery with non-quadratic loss: projected gradient method and regularity projection oracle [arxiv]

Ding, Zhang, and Chen                                                         


Leave-one-out Approach for Matrix Completion: Primal and Dual Analysis [arxiv] [slides]

Ding and Chen                                                                                                   


Factor Group-Sparse Regularization for Efficient Low-Rank Matrix Recovery [arxiv] 

Fan, Ding, Chen, and Udell                                        

Overparametrization

Frank-Wolfe

kFW: A Frank-Wolfe style algorithm with stronger subproblem oracles [arxiv] [slides]

Ding, Fan, and Udell 


Spectral Frank-Wolfe Algorithm: Strict Complementarity and Linear Convergence [arxiv]

Ding, Fei, Xu, and Yang                                                            


Frank-Wolfe Style Algorithms for Large Scale Optimization [arxiv]

Ding and Udell