Vivak Patel

Ph.D. Candidate, Statistics

The sections below contain information about my research and academic career. If you want to get in touch, email me. Thank you for visiting!


In general, my research addresses numerical optimization problems arising in statistical inference. Specifically, I focus on creating, analyzing, and implementing optimization algorithms for settings where the size of the data set or the dimension of the problem is so large that classical optimization methods are inefficient or inapplicable.


Structured Identifiability of Dynamical Systems under Partial Observability and Random Driving Inputs. In progress. identifiability dynamical systems power systems

A Statistical Approach to Dynamic Load Modelling and Identification with High Frequency Measurements. In progress. statistical estimation power systems

Kalman-based Stochastic Gradient Method for Generalized Linear Models. In progress. statistical estimation

On Why SGD Fails in Practice: Stalling, Conditioning, Divergence, and Non-convex Objectives. In progress. optimization machine learning (arXiv)

Patel, V. Kalman-based Stochastic Gradient Method with Stop Condition and Insensitivity to Conditioning. SIAM Journal on Optimization 2016. optimization machine learning statistical estimation (arXiv, doi, abstract)


Patel, V. Statistical Optimization: Direct, Stochastic Analogues to Deterministic Optimization Methods. Submitted 2017. optimization machine learning (abstract)

Patel, V. A Joint Statistical and Optimization Framework for Stochastic Incremental Optimization. Submitted 2017. optimization machine learning (abstract)

Maldonado, D.A., Patel, V., Anitescu, M., Flueck, A. A Statistical Approach to Dynamic Load Modelling and Identification with High Frequency Measurements. Accepted to Power & Energy Society General Meeting 2017. statistical estimation power systems (preprint, abstract)


Patel, V. A Statistical Theory of the Kalman Filter. SIAM Uncertainty Quantification, April 8, 2016. statistical estimation (html, pdf)

Patel, V. Static Parameter Estimation using Kalman Filtering and Proximal Operators. Argonne National Laboratory, December 2, 2015. optimization statistical estimation (html, pdf)


Source: kSGD.R
Documentation: Coming soon.
Description: A simple implementation of Stochastic Gradient Descent (SGD) and Kalman-based Stochastic Gradient Descent (kSGD) for the R Language on both regular and large data sets. For working with large data sets, the implementation depends on the bit and ffbase packages.
Nota bene: Because it is written entirely in R, this is not the fastest implementation of the kSGD algorithm. I am working on a C version with an R interface to improve calculation speed.
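The core idea behind a Kalman-based stochastic gradient update can be sketched as follows. This is a hedged Python illustration of a recursive, Kalman-style update for a linear model (the released kSGD.R code is in R and may differ in details); the function name, noise parameter, and covariance recursion below are illustrative assumptions, not the published implementation.

```python
import numpy as np

def ksgd_linear(X, y, sigma2=1.0):
    """Kalman-style stochastic update for linear regression (illustrative).

    For each observation (x_i, y_i) with assumed noise variance sigma2:
        K    = P x / (x' P x + sigma2)    # Kalman gain
        beta = beta + K (y - x' beta)     # parameter update
        P    = P - K x' P                 # covariance update
    This is a recursive-least-squares-like recursion, sketched here to
    convey the general idea of a kSGD-type method.
    """
    n, p = X.shape
    beta = np.zeros(p)
    P = np.eye(p)  # prior covariance of the parameter estimate
    for i in range(n):
        x = X[i]
        denom = x @ P @ x + sigma2
        K = P @ x / denom
        beta = beta + K * (y[i] - x @ beta)
        P = P - np.outer(K, x @ P)
    return beta
```

Unlike plain SGD, the gain K adapts per coordinate through P, which is what gives Kalman-based methods their insensitivity to conditioning and a natural stopping criterion (via P).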




Lecturer. In Winter 2015, I taught a section of Statistical Models and Methods. Here is a sample of my lecture notes and slide decks (tar).

Teaching Assistant. I have assisted in teaching a number of undergraduate and graduate courses: Elementary Statistics, Numerical Linear Algebra, Sample Surveys, and Nonparametric Inference.


Data Intensive Computing Reading Group. In Autumn 2015, I started a reading group on data-intensive computing systems. Here is my original reading list. If you are interested in joining, subscribe here.

Student Representative. From October 2014 to September 2015, I served as the Student Representative for the Department of Statistics to the Dean's Student Advisory Committee. In this capacity, I also represented student interests to the Statistics faculty.

PSD Co-Organizer. During the 2014 to 2015 academic year, I helped start and organize a series of graduate student lectures to encourage interdisciplinary conversations among the departments in the Physical Sciences Division.


SIAM Travel Award. Awarded to travel to SIAM UQ 2016 in Lausanne, Switzerland.



These are some of my notes from lectures, courses, and books on certain topics. If you find errata, please email me. There are also missing sections, which I plan to complete over time.

Mathematics Reading List

Here is a list of books that I highly recommend or intend to read. If you have any additional recommendations, please get in touch. Also, I really appreciate the Chicago undergraduate mathematics bibliography.