I am a Staff Research Scientist at IBM Research, Thomas J. Watson Research Center, working at the intersection of Optimization and Machine Learning / Deep Learning.
I am also a Principal Investigator of ongoing MIT-IBM Watson AI Lab projects and an IBM Master Inventor.
I proposed a new algorithm called SARAH (named after my daughter, Sarah H. Nguyen) for solving large-scale convex and nonconvex optimization problems in machine learning.
The paper was published at the 34th International Conference on Machine Learning (ICML 2017). At IBM Research, my work on "Stochastic Gradient Methods: Theory and Applications" was selected as a 2021 IBM Research Accomplishment, and the paper "A Hybrid Stochastic Optimization Framework for Composite Nonconvex Optimization" (SGD-SARAH) won the 2022 Pat Goldberg Memorial Best Paper competition.
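At its core, SARAH replaces the plain stochastic gradient with a recursive estimator: each inner step updates the search direction as v_t = ∇f_{i_t}(w_t) − ∇f_{i_t}(w_{t−1}) + v_{t−1}, starting each outer loop from a full gradient. Below is a minimal Python sketch of this recursion on a toy least-squares problem; the function names, step size, and loop counts are illustrative choices, not the tuned settings from the paper.

```python
import numpy as np

def sarah(grad_full, grad_i, w0, n, lr=0.01, outer=20, inner=50, seed=0):
    """Minimal SARAH sketch: recursive stochastic gradient estimator."""
    rng = np.random.default_rng(seed)
    w_prev = w0.copy()
    for _ in range(outer):
        v = grad_full(w_prev)              # v_0: full gradient at the snapshot
        w = w_prev - lr * v                # w_1 = w_0 - lr * v_0
        for _ in range(inner):
            i = rng.integers(n)            # sample one component f_i
            # recursive update: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}
            v = grad_i(w, i) - grad_i(w_prev, i) + v
            w_prev, w = w, w - lr * v      # w_{t+1} = w_t - lr * v_t
        w_prev = w                         # restart next outer loop here
    return w

# Illustrative objective: f_i(w) = 0.5 * (a_i^T w - b_i)^2, minimized at w = 1
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))
b = A @ np.ones(5)
grad_full = lambda w: A.T @ (A @ w - b) / len(b)
grad_i = lambda w, i: A[i] * (A[i] @ w - b[i])
w = sarah(grad_full, grad_i, np.zeros(5), len(b))
```

Unlike SVRG, the estimator is updated recursively from the previous direction rather than recomputed against a fixed snapshot gradient, which is what gives SARAH its variance-reduction behavior in both convex and nonconvex settings.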
Reviewer for: Journal of Machine Learning Research, Mathematical Programming, SIAM Journal on Optimization, SIAM Journal on Numerical Analysis, IEEE Transactions on Neural Networks and Learning Systems, IEEE Transactions on Signal Processing, Artificial Intelligence, Optimization Methods and Software, SIAM Journal on Mathematics of Data Science.