Nilesh Tripuraneni
I'm a final-year Ph.D. student at U.C. Berkeley working with Michael Jordan. I'm broadly interested in machine learning, statistics, and applications of machine learning in domains such as chemistry and biology. Recently, I've been interested in machine learning in settings that move beyond "nice" i.i.d. data, including problems related to covariate shift, transfer learning, and heavy-tailed estimation. Before coming to Berkeley, I received an M.Phil. in Information Engineering from the University of Cambridge, where I was advised by Zoubin Ghahramani, and a B.A. in Physics from Harvard University. During my Ph.D., I have been fortunate to spend time at Microsoft Research New England, Google Brain, Amazon SCOT, and Dyno Therapeutics.
Email: nilesh_tripuraneni AT berkeley DOT edu
Links: Google Scholar.
CV: CV
Publications
- Joint Representation Training in Sequential Tasks with Shared Structure. In Submission
- A Framework for the Meta-Analysis of Randomized Experiments with Applications to Heavy-Tailed Response Data. arXiv preprint. [arXiv]
- Covariate Shift in High-Dimensional Random Feature Regression. *Equal contribution. arXiv preprint. [arXiv]
- Overparameterization Improves Robustness to Covariate Shift in High Dimensions. *Equal contribution. NeurIPS 2021. [arXiv]
- Parallelizing Contextual Linear Bandits. *Equal contribution. arXiv preprint. [arXiv]
- Optimal Mean Estimation without a Variance. arXiv preprint; extended abstract to appear at COLT 2022. [arXiv]
- Optimal Robust Linear Regression in Nearly Linear Time. arXiv preprint. [arXiv]
- On the Theory of Transfer Learning: The Importance of Task Diversity. NeurIPS 2020. [arXiv]
- Provable Meta-Learning of Linear Representations. ICML 2021. [arXiv]
- Algorithms for Heavy-Tailed Statistics: Regression, Covariance Estimation, and Beyond. STOC 2020. [arXiv]
- Single Point Transductive Prediction. ICML 2020. [arXiv]
- Rao-Blackwellized Stochastic Gradients for Discrete Distributions. ICML 2019. [arXiv]
- Averaging Stochastic Gradient Descent on Riemannian Manifolds. COLT 2018. [arXiv]
- Stochastic Cubic Regularization for Fast Nonconvex Optimization. *Equal contribution. NeurIPS 2018 (Oral). [arXiv]
- Magnetic Hamiltonian Monte Carlo. ICML 2017. [arXiv]
- Lost Relatives of the Gumbel Trick. ICML 2017 (Best Paper Honorable Mention). [arXiv]
- Quantitative criticism of literary relationships.
- Particle Gibbs for Infinite Hidden Markov Models. *Equal contribution. NIPS 2015 (poster). [NIPS]
- Bulk viscosity and cavitation in boost-invariant hydrodynamic expansion. JHEP 2010. [JHEP]