
A Brief Introduction About Me
Hi, I'm Yuyang Qiu (仇裕洋). I graduated with a PhD in Industrial and Systems Engineering from Rutgers University under the supervision of Dr. Farzad Yousefian.
My research spans federated learning (FL), hierarchical optimization, and distributed optimization over networks. I am currently interested in integrating federated learning with foundation models, with the goal of developing memory- and communication-efficient FL methods for foundation models.
I'm a member of SIAM, MOS, INFORMS, and IEEE.
I served as the treasurer of the INFORMS Rutgers Student Chapter from Sep. 2022 to May 2025.
I completed a summer internship as a Givens Associate at Argonne National Laboratory, where I worked on memory- and communication-efficient asynchronous federated learning.
I will join Prof. Zheng Zhang's group at the University of California, Santa Barbara, as a Postdoctoral Scholar starting July 2025. [My CV]
PhD, Industrial and Systems Engineering, Sep. 2020 - May 2025, Rutgers University, US.
MS, Applied Mathematics, Sep. 2018 - Aug. 2020, Northeastern University, US.
BS, Mathematics & Applied Mathematics, Sep. 2014 - June 2018, Jiangsu University, China.
Research keywords: federated learning, hierarchical optimization, distributed optimization over networks, zeroth-order methods.

Academic Events
I will present our recent work on federated simple bilevel optimization at ICCOPT 2025 (details will be posted soon). See you in LA!
Past Events
2024 INFORMS Annual Meeting presentation: Zeroth-order federated methods for stochastic MPECs and nondifferentiable nonconvex hierarchical optimization.
ISMP 2024 presentation: Zeroth-order federated methods for stochastic MPECs and nondifferentiable nonconvex hierarchical optimization.
NeurIPS 2023 paper presentation in poster session: Zeroth-order methods for nondifferentiable, nonconvex, and hierarchical federated optimization. [full paper] [poster & video]
2023 INFORMS Annual Meeting presentation: Randomized zeroth-order federated methods for nonsmooth nonconvex and hierarchical optimization.
SIAM Conference on Optimization (OP23) Minisymposia presentation: Randomized methods for nonsmooth and nonconvex federated optimization.
Publications
Yuyang Qiu, Kibaek Kim, and Farzad Yousefian. A randomized zeroth-order hierarchical framework for heterogeneous federated learning. Submitted to the 64th IEEE Conference on Decision and Control (CDC 2025), under review.
Mohammadjavad Ebrahimi, Yuyang Qiu, Shisheng Cui, and Farzad Yousefian. Regularized federated methods with universal guarantees for simple bilevel optimization. Submitted, under review.
Yuyang Qiu, Farzad Yousefian, and Brian Zhang. Iteratively regularized gradient tracking methods for optimal equilibrium seeking. Submitted to IEEE Transactions on Automatic Control, under review.
Yuyang Qiu, Uday V. Shanbhag, and Farzad Yousefian. Zeroth-order federated methods for stochastic MPECs and nondifferentiable nonconvex hierarchical optimization. Mathematics of Operations Research, under first major revision.
Yuyang Qiu, Uday V. Shanbhag, and Farzad Yousefian. Zeroth-order methods for nondifferentiable, nonconvex, and hierarchical federated optimization. 37th Conference on Neural Information Processing Systems (NeurIPS 2023). [arXiv]
Publications Before 2020 👇
Qian, L., Attia, R.A., Qiu, Y., Lu, D. and Khater, M.M., 2019. "The shock peakon wave solutions of the general Degasperis–Procesi equation," International Journal of Modern Physics B, 33(29), p.1950351.
Li, J., Qiu, Y., Lu, D., Attia, R.A. and Khater, M., 2019. "Study on the solitary wave solutions of the ionic currents on microtubules equation by using the modified Khater method," Thermal Science, 23(Suppl. 6), pp.2053-2062.
Qian, L., Attia, R.A., Qiu, Y., Lu, D. and Khater, M.M., 2019. "On Breather and Cuspon waves solutions for the generalized higher-order nonlinear Schrödinger equation with light-wave promulgation in an optical fiber," Comp. Meth. Sci. Eng, 1, pp.101-110.
Contact Information
WeChat: Eric-Qyy
LinkedIn
Twitter/X
Please feel free to contact me by email: yuyang.qiu(at)rutgers(dot)edu.
Some Reference Books
Optimization Theory & Algorithms 👇
Beck, A., 2017. First-order methods in optimization, Society for Industrial and Applied Mathematics.
Beck, A., 2023. Introduction to nonlinear optimization: Theory, algorithms, and applications with Python and MATLAB, 2nd edition, Society for Industrial and Applied Mathematics.
Bertsekas, D., 2016. Nonlinear Programming, 3rd edition, Athena Scientific.
Bertsekas, D., Nedic, A. and Ozdaglar, A., 2003. Convex analysis and optimization, Athena Scientific.
Boyd, S. and Vandenberghe, L., 2004. Convex optimization, Cambridge University Press.
Nesterov, Y., 2018. Lectures on convex optimization, Berlin: Springer.
Nocedal, J. and Wright, S.J., 2006. Numerical optimization, 2nd edition, New York: Springer.
Ryu, E. and Yin, W., 2022. Large-scale convex optimization: algorithms & analyses via monotone operators, Cambridge University Press.
...
Mathematics & Optimization 👇
Clarke, F.H., 1990. Optimization and nonsmooth analysis, Society for Industrial and Applied Mathematics.
Facchinei, F. and Pang, J.S., 2003. Finite-dimensional variational inequalities and complementarity problems, New York: Springer.
Folland, G.B., 1999. Real analysis: modern techniques and their applications, 2nd edition, John Wiley & Sons.
Rockafellar, R.T., 1970. Convex analysis, Princeton University Press.
Rockafellar, R.T. and Wets, R.J.B., 1998. Variational analysis, Springer Science & Business Media.
Ross, S.M., 2019. Introduction to probability models, 12th edition, Academic Press.
...
Machine Learning & Data Analysis & Optimization 👇
Deisenroth, M.P., Faisal, A.A. and Ong, C.S., 2020. Mathematics for Machine Learning, Cambridge University Press.
Géron, A., 2022. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow, 3rd edition, O'Reilly Media, Inc.
Goodfellow, I., Bengio, Y. and Courville, A., 2016. Deep Learning, MIT Press.
Murphy, K.P., 2022. Probabilistic Machine Learning: An Introduction, MIT Press.
Prince, S.J.D., 2023. Understanding Deep Learning, MIT Press.
Wright, S.J. and Recht, B., 2022. Optimization for Data Analysis, Cambridge University Press.