INVITED TALKS
- On the Convergence to a Global Solution of Shuffling-Type Gradient Algorithms. INFORMS Annual Meeting, Phoenix, AZ, October 2023
- New Perspective on the Convergence to a Global Solution of Finite-Sum Optimization. INFORMS Annual Meeting, Indianapolis, IN, October 2022
- Nesterov Accelerated Shuffling Gradient Method for Convex Optimization. Johns Hopkins University, Baltimore, MD, September 2022
- Hogwild! over Distributed Local Data Sets with Linearly Increasing Mini-Batch Sizes. INFORMS Annual Meeting, Anaheim, CA, October 2021
- A Unified Convergence Analysis for Shuffling-Type Gradient Methods. INFORMS Annual Meeting, National Harbor, MD, November 2020
- Finite-Sum Smooth Optimization with SARAH. INFORMS Annual Meeting, Seattle, WA, October 2019
- Inexact SARAH for Solving Stochastic Optimization Problems. INFORMS Annual Meeting, Phoenix, AZ, November 2018 [SLIDES]
- Inexact SARAH for Solving Stochastic Optimization Problems. DIMACS/TRIPODS/MOPTA, Bethlehem, PA, August 2018
- When Does Stochastic Gradient Algorithm Work Well? INFORMS Optimization Society Conference, Denver, CO, March 2018
- SARAH: Stochastic Recursive Gradient Algorithm. INFORMS Annual Meeting, Houston, TX, October 2017 [SLIDES]
- SARAH Algorithm. IBM Thomas J. Watson Research Center, Yorktown Heights, NY, August 2017
- A Queueing System with On-Demand Servers: Local Stability of Fluid Limits. INFORMS Annual Meeting, Nashville, TN, November 2016 [SLIDES]
- A Queueing System with On-Demand Servers: Local Stability of Fluid Limits. Modeling and Optimization: Theory and Applications, Bethlehem, PA, August 2016