I am a final-year (2021-2026) Ph.D. student in the CS Theory Group of the EECS Department at UC Berkeley, where I am fortunate to be advised by Avishay Tal and Jelani Nelson. The last two years of my Ph.D. (2024-2026) are gratefully supported by a Google PhD Fellowship.
I mainly work on the theory of Privacy-Preserving Machine Learning (PPML) and Data Analysis, mostly through the lens of Differential Privacy. My research covers both private algorithm design and lower bounds, drawing motivation from both theory and practice. I also explore the connections between privacy and other considerations in machine learning, such as robustness, generalization, and memorization.
I also maintain broad interests in CS theory: I have worked on a diverse set of topics, including pseudorandomness, computational learning theory, communication complexity, and combinatorics.
Experiences
Previously, I did my undergrad at the Institute for Interdisciplinary Information Sciences (aka. Yao Class) at Tsinghua University. During the Spring and Summer of 2020, I was lucky to visit MIT and work on complexity theory with Lijie Chen and Ryan Williams.
From Summer 2022 to May 2024, I was a student researcher at Google Research (Mountain View), where I worked under the guidance of Edith Cohen, Jelani Nelson, Tamás Sarlós, and Uri Stemmer.
In the Summers of 2024 and 2025, I was a machine learning research intern at Apple Machine Learning Research (MLR), where I worked under the mentorship of Kunal Talwar and Vitaly Feldman.
I am on the 2025-2026 job market!
Manuscripts
Several projects are in progress. Stay tuned!
Publications
Private Learning of Littlestone Classes, Revisited
[arXiv]
Xin Lyu
To appear in STOC 2026
Cell-Probe Lower Bounds via Semi-Random CSP Refutation: Simplified and the Odd-Locality Case
[arXiv]
Venkatesan Guruswami,
Xin Lyu,
Weiqiang Yuan
SODA 2026
More Efficient Sifting for Grid Norms, and Applications to Multiparty Communication Complexity
[arXiv]
Zander Kelley,
Xin Lyu
FOCS 2025
Trade-Offs in Data Memorization via Strong Data Processing Inequalities
[arXiv]
Guy Kornowski,
Vitaly Feldman,
Xin Lyu
COLT 2025 (also Spotlight Talk at TPDP 2025, and Best Paper Award at MemFM Workshop @ ICML 2025)
Fingerprinting Codes Meet Geometry: Improved Lower Bounds for Private Query Release and Adaptive Data Analysis
[arXiv]
Xin Lyu,
Kunal Talwar
STOC 2025 (also Spotlight Talk at TPDP 2025, and Non-Archival Talk at FORC 2025)
Lower Bounds for Differential Privacy Under Continual Observation and Online Threshold Queries
[arXiv]
Edith Cohen,
Xin Lyu,
Jelani Nelson,
Tamás Sarlós,
Uri Stemmer
COLT 2024 (also Spotlight Talk at TPDP 2024)
The Cost of Parallelizing Boosting
[arXiv]
Xin Lyu,
Hongxun Wu,
Junzhao Yang
SODA 2024
The Target-Charging Technique for Privacy Analysis Across Interaction Computations
[arXiv]
Edith Cohen,
Xin Lyu
NeurIPS 2023
Tight Time-Space Lower Bounds for Constant-Pass Learning
[arXiv]
Xin Lyu,
Avishay Tal,
Hongxun Wu,
Junzhao Yang
FOCS 2023
Time-Space Tradeoffs for Element Distinctness and Set Intersection via Pseudorandomness
[arXiv]
Xin Lyu,
Weihao Zhu
SODA 2023
Composition Theorems for Interactive Differential Privacy
[arXiv]
Xin Lyu
NeurIPS 2023
(Note: See also this independent and concurrent work by Vadhan and Zhang.)
Range Avoidance for Low-depth Circuits and Connections to Pseudorandomness
[ECCC]
Venkatesan Guruswami,
Xin Lyu,
Xiuhan Wang
RANDOM 2022
When I am not working, I usually go cycling on my pretty Specialized bike (Tarmac SL8). Check out my unimpressive Strava page. On a more personal note: I am happily married to Xinwei Li, and we have a lovely child.