Nhat Ho

Postdoctoral Fellow
Department of Electrical Engineering & Computer Sciences
University of California, Berkeley

email: minhnhat@berkeley.edu
office: RISE Lab (https://rise.cs.berkeley.edu/) and BLISS Lab (http://bliss.eecs.berkeley.edu/), Berkeley

Brief Biography

I am currently a postdoctoral fellow in the Department of Electrical Engineering and Computer Sciences (EECS) at the University of California, Berkeley, where I am very fortunate to be mentored by Professor Michael I. Jordan and Professor Martin J. Wainwright. Before coming to Berkeley, I received my Ph.D. in 2017 from the Department of Statistics, University of Michigan, Ann Arbor, where I was very fortunate to be advised by Professor Long Nguyen and Professor Ya'acov Ritov.

For a recent version of my CV, please email me at minhnhat@berkeley.edu.

Research interests:

  • Bayesian nonparametrics

  • Mixture and hierarchical models, graphical models

  • (Non)convex optimization

  • Deep learning

  • Approximate Bayesian inference

  • Statistical learning theory

  • Distributed computing

  • Sampling and Markov chains

  • Reinforcement learning

  • Causal inference

  • Multiple hypothesis testing

  • Robust statistics

Brief description of my research:

At the moment, I am interested in exploring several research directions at the interface of computation and theory in machine learning and statistics. Some of these directions can be summarized as follows:

  • Unified frameworks for understanding the singularity structures of mixture and hierarchical models and their effect on statistical inference for popular estimators (e.g., the MLE), from the perspective of algebraic geometry, partial differential equations, and optimal transport theory.

  • Trade-offs between statistical efficiency and computational/optimization complexity in mixture and hierarchical models under singular and misspecified settings.

  • Efficient deep generative models for understanding and improving convolutional neural networks (CNNs) and beyond.

  • Scalable models for studying data with complex multi-level structure, based on ideas from optimal transport theory.

  • Efficient large-scale asynchronous distributed computation based on optimal transport and its barycenters.

  • Efficient optimization methods for trend filtering and convex clustering problems.

  • Statistical challenges of utilizing mixture and hierarchical models in principal stratification.

  • A unified framework for achieving uniform confidence interval bounds in partial identification.