Ruqi Zhang
Postdoctoral Researcher

Ruqi Zhang is currently a Postdoctoral Researcher at the Institute for Foundations of Machine Learning (IFML) at the University of Texas at Austin. Dr. Zhang's main research interest is in building scalable, reliable, and efficient probabilistic models for machine learning and data science. She also focuses on developing fast and robust inference methods with theoretical guarantees, and on applying them with deep neural networks to real-world big data. Before coming to UT, Dr. Zhang completed her PhD in Statistics at Cornell University and received a B.S. in Mathematics from Renmin University of China.
Select Publications
Ruqi Zhang, Yingzhen Li, Christopher De Sa, Sam Devlin, Cheng Zhang. 2021. Meta-Learning Divergences for Variational Inference. https://utexas.box.com/s/atrazmvnpwnqbssyn8w4gpaqoyyxxg23.
Ruqi Zhang, A. Feder Cooper, Christopher De Sa. 2020. Asymptotically Optimal Exact Minibatch Metropolis-Hastings.
Ruqi Zhang, A. Feder Cooper, Christopher De Sa. 2020. AMAGOLD: Amortized Metropolis Adjustment for Efficient Stochastic Gradient MCMC.
Ruqi Zhang, Chunyuan Li, Jianyi Zhang, Changyou Chen, Andrew Gordon Wilson. 2020. Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning.
Ruqi Zhang, Christopher De Sa. 2019. Poisson-Minibatching for Gibbs Sampling with Convergence Rate Guarantees.
Awards & Honors
2021 - ICML Best Reviewers (Top 10%)
2020 - Spotlight Rising Star in Data Science at the University of Chicago
2020 - NeurIPS Top 10% Reviewers Award
2019 - NeurIPS Travel Grant