Reese Pathak
Ph.D. Student
UC Berkeley, Dept. of EECS

I am a Ph.D. student in Computer Science in the Department of Electrical Engineering and Computer Sciences (EECS) at UC Berkeley, advised by Michael I. Jordan and Martin J. Wainwright. Before Berkeley, I completed my undergraduate studies at Stanford University.

I will join the Department of Statistics at UC Berkeley as an NSF Mathematical Sciences Postdoctoral Research Fellow in July 2025; my sponsoring scientist (a.k.a. postdoctoral host) is Nikita Zhivotovskiy.

In January 2027, I will join the School of Operations Research and Information Engineering (ORIE) at Cornell University as a tenure-track assistant professor.

My research interests span high-dimensional statistics, optimization, and machine learning. Recently, I have primarily been working on developing the foundations of learning under distribution shift (transfer learning), theory for nonparametric regression under general designs, and high-dimensional statistical inference more broadly.

Publications

Papers are ordered by date of initial announcement, most recent first.

2025

Revisiting mean estimation over \ell_p balls: Is the MLE optimal?
with Liviu Aolaritei, Michael I. Jordan, and Annie Ulichney.
[ arXiv ]

2024

Data-adaptive tradeoffs among multiple risks in distribution-free prediction
with Drew T. Nguyen, Anastasios N. Angelopoulos, Stephen Bates, and Michael I. Jordan.
[ arXiv ]

On the design-dependent suboptimality of the Lasso
with Cong Ma.
[ arXiv ]

2023

Transformers can optimally learn regression mixture models
with Rajat Sen, Weihao Kong, and Abhimanyu Das.
[ arXiv ] [ ICLR 2024 ]

Noisy recovery from random linear observations: Sharp minimax rates under elliptical constraints
with Martin J. Wainwright and Lin Xiao.
[ arXiv ] [ Annals of Statistics ]

2022

Optimally tackling covariate shift in RKHS-based nonparametric regression
with Cong Ma and Martin J. Wainwright.
[ arXiv ] [ Annals of Statistics ]

A new similarity measure for covariate shift with applications to nonparametric regression
with Cong Ma and Martin J. Wainwright.
[ arXiv ] [ ICML 2022 (long oral) | slides | poster ]

2021

Cluster-and-Conquer: A Framework For Time-Series Forecasting
with Rajat Sen, Nikhil Rao, N. Benjamin Erichson, Michael I. Jordan, and Inderjit S. Dhillon.
[ arXiv ]

2020

Weighted matrix completion from non-random, non-uniform sampling patterns
with Simon Foucart, Deanna Needell, Yaniv Plan, and Mary Wootters.
[ arXiv ] [ IEEE Transactions on Information Theory ]

FedSplit: an algorithmic framework for fast federated optimization
with Martin J. Wainwright.
[ arXiv ] [ NeurIPS 2020 ]

On identifying and mitigating bias in the estimation of the COVID-19 case fatality rate
with Anastasios Angelopoulos, Rohit Varma, and Michael I. Jordan.
[ arXiv ] [ Harvard Data Science Review ] [ code ]


Teaching

Previous teaching experience is listed below. Links, where available, point to current course offerings. I use the term TA for the role of a teaching assistant (formally called a GSI at UC Berkeley).

UC Berkeley

Stanford University


Contact

pathakr@berkeley.edu
http://www.cs.berkeley.edu/~pathakr/