Beyond Research

Hobbies

I like running and biking around in my free time, besides watching movies and following politics. After coming to the US, the last of these got a boost in its share of my time, since I now have to keep track of the largest and the oldest democracies at the same time! My tryst with running is a recent one. Last October, I ran a half-marathon on the beautiful Fire Trails in the Berkeley hills (1:45, 1,200 feet of elevation gain, for those interested in numbers). This run happened during my preparation for the Golden Gate Half Marathon (GGHM). Unfortunately, I could not run the actual GGHM due to an injury I picked up during this run. Nonetheless, I enjoyed my practice sessions and the journey towards running one. In the past, I enjoyed hiking for two semesters as one of my hobbies. I used to love cricket but have been losing interest in it with the retirement of my childhood favorites. I also like interacting with people, and I take out time to talk and discuss things, especially with my juniors; I have had some success mentoring a few.

Blog

I started writing a blog to share my experiences with the changing phases of my life. The posts are mainly informative in nature: I try to provide as much relevant information as possible from my own experience. The purpose of the blog was to spare myself the need to repeat the same information and to give every student the same picture of the various events and the preparation they require. The posts can be classified into the following categories:

  • Preparation for the IIT-JEE (the entrance exam for my undergraduate college)

  • Information about Research Internships

  • Preparation and Interview Experience for the Job Market in my senior year

  • Preparation for the GRE and the Apping Process for Graduate Studies (coming soon)

Talks

  1. I once participated as a panelist in a panel discussion on “Teaching and Learning Mathematics at IIT Bombay”, conducted at the institute level, where I shared the stage with a few esteemed professors from the mathematics department and some other undergraduate students. Slides from my 10-minute presentation can be found here: pdf.

  2. In my junior year, I spoke about the importance of academics and internships, and about healthy interaction between students and professors, to an audience of freshmen and sophomores. The video of the talk can be found here. (Disclaimer: The language is pretty informal, in fact as informal as a junior-year undergraduate can get with “his juniors”, since the audience consisted solely of undergraduate students.)

Some Other Research Work

Below, I have listed a few minor projects not directly associated with my major interests, with links to the related reports.

  • Review of Statistical Analysis of Numerical Preclinical Radio-biological Data This work reproduces the tests and results presented by Pitt and Hill and discusses some other non-parametric techniques, such as permutation tests, which allow data to be analyzed under less restrictive assumptions. The focus of the review is on the statistical methodology rather than on the underlying biological aspects and assumptions of the original work, which are not discussed. Although we are not experts in statistical methods for fraud detection, we do believe that permutation tests are promising in this context, as demonstrated by the results presented here; a minimal sketch of a two-sample permutation test is included after this list. This review was developed as a term project for a graduate-level course on statistical models at UC Berkeley and was published at ScienceOpen (link).

  • Naturalistic Image Synthesis Using Variational Auto-Encoder We develop a deep generative model for naturalistic image synthesis using variational auto-encoders (VAEs). Our model uses convolutional and fully-connected layers and includes an l2 loss on features extracted from a VGGNet pre-trained for classification on the ImageNet dataset. The feature loss is used to enhance the ‘naturality’ of the images' visual appeal. This deviates from traditional fully-connected models that use only pixel and latent losses for training VAEs. We show that the use of convolutional layers in the model improves reconstruction and generation of images from the trained network. Although we obtain good results on the MNIST handwritten digits dataset, we were unable to generate realistic images using the more diverse CIFAR-10 dataset. Furthermore, we could not conclude whether incorporating feature consistency in the loss function led to better results. Hence, our results deviate from the findings in a recent paper by Hou et al., where the authors used the CelebFaces Attributes (CelebA) dataset and showed that incorporating a feature loss from a pre-trained VGGNet helped their VAE generate more realistic images than existing models in the literature.

  • A Closer Look at System Identification: Review, Modifications and Comparisons System identification (ID) is one of the classical problems studied in control theory. The purpose of system ID is twofold: identify the unknown parameters that govern the system, and perform optimal control with respect to a user-specified cost. In this work, we study the classical work done in the space of system ID, in particular the celebrated offline and online schemes that have been analyzed theoretically. We introduce the reader to the differences between these schemes and compare and remark on the performance attained by each. To keep the discussion insightful and detailed, we specialize the analysis to simple one-dimensional linear systems when needed; a small least-squares identification sketch for such a system appears after this list. To our surprise, analyzing SISO systems is still an active area of research. For some simple cases, e.g., the tracking problem, we improve on existing results reported in the works we study. Furthermore, we present numerical experiments that align with the rates of convergence presented in this work.

  • Asynchronous Parallel Optimization: The Stochastic Gradient Method (SGM) is a popular algorithm in optimization and machine learning. With the tremendous increase in problem sizes due to large datasets, there has been a surge in efforts to parallelize this scheme. Such schemes suffer from the synchronization step, which kills the speed-up gained from distributing the work. In this project, we analyze a novel asynchronous algorithm for the optimization of convex functions. Our scheme is inspired by the seminal work Hogwild! by Recht et al., where the authors propose an asynchronous, lock-free, parallel SGM algorithm. In particular, they showed convergence and speed-up results for sums of convex functions in which each function depends only on a small subset of the variables. We relax this assumption of a ‘sparse relation’ between the functions and the variables by proposing a new step in SGM. Through numerical experiments, we demonstrate the speed-up of our algorithm on an ordinary least squares problem compared to serial SGM and Hogwild! (a lock-free SGM sketch for least squares appears after this list). Report, Poster presented at UC Berkeley

  • Convex Relaxations of Constraint Satisfaction Problems: The aims of the report are threefold. (1) Introduce the reader to the Constraint Satisfaction Problem (CSP) framework. (2) Equip the reader with “tractable tools” from convex optimization (and randomized algorithms) to find a “good” solution. This is done in a self-contained fashion: we do not leave the schemes for the reader to implement, but instead demonstrate them in our numerical section. (3) Introduce some of the optimal approximation schemes in the literature and comment on their computational aspects. Report, Slides

  • Particle Swarm Optimization: Report
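
The statistical review above relies on permutation tests. Below is a minimal sketch of a two-sample permutation test on the difference of means; the synthetic data, sample sizes, and number of permutations are illustrative assumptions and are not taken from the Pitt and Hill data or from the review itself.

```python
import numpy as np

def permutation_test(group_a, group_b, num_permutations=10000, seed=0):
    """Two-sample permutation test on the difference of means.

    Returns the observed difference and a two-sided p-value obtained by
    repeatedly reassigning the pooled observations to the two groups.
    """
    rng = np.random.default_rng(seed)
    group_a = np.asarray(group_a, dtype=float)
    group_b = np.asarray(group_b, dtype=float)
    observed = group_a.mean() - group_b.mean()

    pooled = np.concatenate([group_a, group_b])
    n_a = len(group_a)
    count = 0
    for _ in range(num_permutations):
        permuted = rng.permutation(pooled)
        diff = permuted[:n_a].mean() - permuted[n_a:].mean()
        if abs(diff) >= abs(observed):
            count += 1
    # Add-one correction keeps the estimated p-value strictly positive.
    p_value = (count + 1) / (num_permutations + 1)
    return observed, p_value

if __name__ == "__main__":
    # Illustrative synthetic data: two groups with slightly different means.
    rng = np.random.default_rng(1)
    a = rng.normal(loc=0.0, scale=1.0, size=40)
    b = rng.normal(loc=0.5, scale=1.0, size=35)
    diff, p = permutation_test(a, b)
    print(f"observed difference = {diff:.3f}, p-value = {p:.4f}")
```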
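
The system-identification project above specializes its analysis to one-dimensional linear systems. The sketch below shows least-squares identification of a scalar system x_{t+1} = a x_t + b u_t + w_t from a single trajectory; the true parameters, noise level, and trajectory length are assumptions made for illustration and are not the settings from the report.

```python
import numpy as np

def identify_scalar_system(x, u):
    """Least-squares estimate of (a, b) in x_{t+1} = a*x_t + b*u_t + w_t.

    x : states x_0, ..., x_T
    u : inputs u_0, ..., u_{T-1}
    """
    x = np.asarray(x, dtype=float)
    u = np.asarray(u, dtype=float)
    # Each regressor row is [x_t, u_t]; the target is x_{t+1}.
    Phi = np.column_stack([x[:-1], u])
    targets = x[1:]
    theta, *_ = np.linalg.lstsq(Phi, targets, rcond=None)
    return theta  # (a_hat, b_hat)

if __name__ == "__main__":
    # Simulate an illustrative scalar system and recover its parameters.
    rng = np.random.default_rng(0)
    a_true, b_true, T = 0.8, 1.5, 500
    u = rng.normal(size=T)        # exciting input signal
    w = 0.1 * rng.normal(size=T)  # process noise
    x = np.zeros(T + 1)
    for t in range(T):
        x[t + 1] = a_true * x[t] + b_true * u[t] + w[t]
    a_hat, b_hat = identify_scalar_system(x, u)
    print(f"a_hat = {a_hat:.3f}, b_hat = {b_hat:.3f}")
```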
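
The asynchronous-optimization project above builds on Hogwild!-style lock-free updates. The sketch below only illustrates that style of unsynchronized stochastic gradient updates on a least-squares problem using Python threads (CPython's GIL prevents a real parallel speed-up here); the problem sizes, step size, and thread count are assumptions, not the settings used in the project, and this is not the project's modified SGM step.

```python
import threading

import numpy as np

def async_sgd_least_squares(A, b, num_threads=4, epochs=5, step_size=1e-3, seed=0):
    """Lock-free, Hogwild!-style SGM sketch for min_x ||Ax - b||^2.

    Worker threads read and write the shared iterate `x` with no locking;
    the sketch demonstrates the scheme rather than true parallel speed-up.
    """
    n_samples, n_features = A.shape
    x = np.zeros(n_features)  # shared iterate, updated without synchronization

    def worker(thread_id):
        rng = np.random.default_rng(seed + thread_id)
        for _ in range(epochs * n_samples // num_threads):
            i = rng.integers(n_samples)
            a_i = A[i]
            grad = 2.0 * (a_i @ x - b[i]) * a_i      # stochastic gradient for sample i
            np.subtract(x, step_size * grad, out=x)  # in-place, unsynchronized update

    threads = [threading.Thread(target=worker, args=(k,)) for k in range(num_threads)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return x

if __name__ == "__main__":
    # Illustrative least-squares instance.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(2000, 20))
    x_true = rng.normal(size=20)
    b = A @ x_true + 0.01 * rng.normal(size=2000)
    x_hat = async_sgd_least_squares(A, b)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```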