I’m a PhD student in EECS at UC Berkeley advised by Joseph Gonzalez and Sanjit Seshia. My current work focuses on the use of Machine Learning for repetitive optimization problems found in software systems and logistics. I’m particularly interested in the automatic detection and exploitation of approximate symmetries in the design of data-driven optimizers. I collaborate closely with Yuandong Tian at FAIR on this pursuit.

In 2019-2020, I served as a research assistant in the Computer Science department at Columbia University working under Ronghui Gu and Suman Jana. Specifically, my focus was on designing machine learning techniques for automating deductive program verification, in hopes of scaling the benefits of formal methods to large-scale software systems. I previously graduated with a BS from the same department in 2019, and before that from Germantown Academy, a high school in the suburbs of Philadelphia.

Teaching:

I most recently served as a Graduate Student Instructor for CS164 at UC Berkeley, the first offering of Programming Languages and Compilers taught by Sarah Chasins. Before that, I was a teaching assistant in Columbia's CS department from 2017 to 2019. I began as an Operating Systems TA (2017), running the File Systems assignment. I particularly enjoyed serving as a TA for Programming Languages and Translators in Fall 2018 and Spring 2019 (Head TA). During my stint as Head TA, I wrote an assignment from scratch to introduce students to OCaml and led 16 teams of students in designing their own domain-specific languages. I have additionally TA'd Computer Science Theory (2018) and served as a computer science and organic chemistry tutor for disadvantaged students in the center of academic advising. I have also prepared several lectures for the undergraduate math society and MIT Splash.

Current Projects:

  • Ashera :: We built an Optimization Modulo Theory solver targeting problems such as the multi-agent Traveling Salesman Problem and multi-resource task-DAG scheduling. (preprint)
  • Co-creativity Companion for Writing Novel Plots :: In collaboration with Meta, we designed a Language Model (LM) companion that provides creative suggestions for long-form story writing. In our framework, we fine-tune a diversity-dedicated suggestion model that produces suggestions useful both for automatic editing with a general-purpose language model and for a human author.
  • Automated Refactoring :: We are developing a framework for refactoring code. We are specifically focused on the technical challenges of incomplete natural-language specifications provided as comments or docstrings, and of automatically inferring the implicit contract between helper functions and their callers.

Past Projects:

  • Dynamic Periodic DAG Scheduling :: [Continued by Sukrit Kalra and Tiemo Bang] We aimed to design a speculative scheduling system for periodic DAG workloads. Departing from prior art, however, we sought to minimize the underutilization required to guarantee deadlines are met when an upstream task completes ahead of schedule.
  • Statistical Model Checking for Autonomous System Safety :: [Continued by Beyazit Yalcinkaya] We applied statistical model checking and importance sampling to evaluate system design choices (e.g., scheduling policy) on end-to-end crash safety.
  • Time Series Transformers :: [Paused] We sought to build on PatchTST to explore a notion of positional encoding that captures the expectation of future reward.

Publications:

Student Forum:

Personal:

  • When I'm not busy climbing the social ladder, I enjoy sport climbing, bouldering, and skiing. I am a proud member of Noteworthy A Cappella at UC Berkeley. Check out our most recent recording: