Peter Jin


I’m a Computer Science PhD student in the ASPIRE, BAIR, and BDD labs, where I’m part of Kurt Keutzer’s group. My current work is on stochastic optimization methods for deep learning, including parallel and distributed training as well as new algorithms. I’m also interested in all things related to Monte Carlo tree search and GPUs.

I received my AB in Physics from Princeton University in 2012.


Spatially Parallel Convolutions

Peter Jin, Boris Ginsburg, and Kurt Keutzer
ICLR 2018 Workshop Track

Shift: A Zero FLOP, Zero Parameter Alternative to Spatial Convolutions

Bichen Wu, Alvin Wan, Xiangyu Yue, Peter Jin, Sicheng Zhao, Noah Golmant, Amir Gholaminejad, Joseph Gonzalez, and Kurt Keutzer
CVPR 2018

Regret Minimization for Partially Observable Deep Reinforcement Learning

Peter Jin, Sergey Levine, and Kurt Keutzer
NIPS 2017 Deep RL Symposium, ICLR 2018 Workshop Track

SqueezeDet: Unified, Small, Low Power Fully Convolutional Neural Networks for Real-Time Object Detection for Autonomous Driving

Bichen Wu, Forrest Iandola, Peter Jin, and Kurt Keutzer
CVPR Embedded Vision Workshop 2017

How to scale distributed deep learning? [a.k.a. the gossiping SGD paper]

Peter Jin, Qiaochu Yuan, Forrest Iandola, and Kurt Keutzer
NIPS ML Systems Workshop 2016

Convolutional Monte Carlo Rollouts in Go

Peter Jin and Kurt Keutzer
CG 2016 Neural Networks Workshop