John Miller
Department of Electrical Engineering and Computer Sciences
University of California, Berkeley
GitHub / Google Scholar

About Me

I am a first-year PhD student in the EECS department at UC Berkeley, co-advised by Moritz Hardt and Ben Recht. I am supported by a Berkeley Fellowship. From 2016 to 2017, I was a research scientist in Baidu's Silicon Valley AI Lab. Before that, I received a BS in Computer Science and an MS in Electrical Engineering from Stanford University, where I had the privilege of working with Percy Liang and Tim Roughgarden.

Research Interests

  • Machine learning and optimization

  • Deep learning and generative modeling

  • Natural language processing


Publications

(asterisk indicates joint or alphabetical authorship)

  • Deep Voice 3: 2000-Speaker Neural Text-to-Speech. Wei Ping*, Kainan Peng*, Andrew Gibiansky*, Sercan Arik*, Ajay Kannan*, Sharan Narang*, Jonathan Raiman*, John Miller*. International Conference on Learning Representations (ICLR), 2018.

  • Deep Voice 2: Multi-Speaker Neural Text-to-Speech. Sercan Arik*, Gregory Diamos*, Andrew Gibiansky*, John Miller*, Kainan Peng*, Wei Ping*, Jonathan Raiman*, and Yanqi Zhou*. Advances in Neural Information Processing Systems (NIPS), 2017.

  • Globally Normalized Reader. Jonathan Raiman and John Miller. Empirical Methods in Natural Language Processing (EMNLP), 2017. (code)

  • Deep Voice: Real-time Neural Text-to-Speech. Sercan Arik*, Mike Chrzanowski*, Adam Coates*, Gregory Diamos*, Andrew Gibiansky*, Yongguo Kang*, Xian Li*, John Miller*, Andrew Ng*, Jonathan Raiman*, Shubho Sengupta*, and Mohammad Shoeybi*. International Conference on Machine Learning (ICML), 2017.

  • Traversing Knowledge Graphs in Vector Space. Kelvin Guu, John Miller, and Percy Liang. Empirical Methods in Natural Language Processing (EMNLP), 2015. Best paper honorable mention. (code)


Software

  • I wrote CVXCanon, a package for canonicalization of convex programs that is used in CVXPY and CVXR.
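To give a flavor of what canonicalization means here, the sketch below shows the kind of rewriting such a tool performs: a convex but non-linear problem is transformed by hand into an equivalent standard-form linear program that a generic solver can handle. This toy example uses scipy, not CVXCanon's actual API, and the problem and variable names are illustrative only.

```python
# Toy illustration of canonicalizing a convex program (not CVXCanon's API).
from scipy.optimize import linprog

# Original problem: minimize |x - 1| subject to x >= 0.
# Canonicalized form: introduce an epigraph variable t and minimize t
# subject to x - 1 <= t and -(x - 1) <= t, yielding an LP in (x, t):
c = [0, 1]                      # objective: minimize t
A_ub = [[1, -1],                # x - t <= 1
        [-1, -1]]               # -x - t <= -1
b_ub = [1, -1]
bounds = [(0, None),            # x >= 0
          (None, None)]         # t is free

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x[0], res.fun)        # optimum at x = 1 with objective 0
```

Tools like CVXPY automate exactly this kind of reduction, turning high-level problem descriptions into cone programs for backend solvers.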