Preprints & Working Papers

Calibrated Model Criticism Using Split Predictive Checks

arXiv:2203.15897 [stat.ME], 2022.

Preprint PDF

Robust, Automated, and Accurate Black-box Variational Inference

arXiv:2203.15945 [stat.ML], 2022.

Preprint PDF

Statistical Inference with Stochastic Gradient Algorithms

2021.

PDF

Independent finite approximations for Bayesian nonparametric inference

arXiv:2009.10780 [stat.ME], 2020.

Preprint PDF

Robust Inference and Model Criticism Using Bagged Posteriors

arXiv:1912.07104 [stat.ME], 2019.

Preprint PDF

Publications


Reproducible Model Selection Using Bagged Posteriors

Bayesian Analysis, 2022+.

Preprint PDF

The Mutational Signature Comprehensive Analysis Toolkit (musicatk) for the Discovery, Prediction, and Exploration of Mutational Signatures

Cancer Research 81(23), 2021.

PDF

Challenges and Opportunities in High-dimensional Variational Inference

In Proc. of the 35th Annual Conference on Neural Information Processing Systems (NeurIPS), 2021.

Preprint PDF

Bidirectional contact tracing could dramatically improve COVID-19 control

Nature Communications 12(232), 2021.

PDF Code

Robust, Accurate Stochastic Optimization for Variational Inference

In Proc. of the 34th Annual Conference on Neural Information Processing Systems (NeurIPS), 2020.

Preprint PDF

Validated Variational Inference via Practical Posterior Error Bounds

In Proc. of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS), Palermo, Italy. PMLR: Volume 108, 2020.

Preprint PDF Code Video

LR-GLM: high-dimensional Bayesian inference using low-rank data approximations

In Proc. of the 36th International Conference on Machine Learning (ICML), Long Beach, California. PMLR: Volume 97, 2019.

Preprint PDF

The kernel interaction trick: fast Bayesian discovery of pairwise interactions in high dimensions

In Proc. of the 36th International Conference on Machine Learning (ICML), Long Beach, California. PMLR: Volume 97, 2019.

Preprint PDF Code

Scalable Gaussian process inference with finite-data mean and variance guarantees

In Proc. of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), Naha, Okinawa, Japan. PMLR: Volume 89, 2019.

Preprint PDF Code

Data-dependent compression of random features for large-scale kernel approximation

In Proc. of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), Naha, Okinawa, Japan. PMLR: Volume 89, 2019.

Preprint PDF

Thesis

Scaling Bayesian inference: theoretical foundations and practical methods

Ph.D. thesis, Massachusetts Institute of Technology, 2018.

PDF

Miscellanea

The feasibility of targeted test-trace-isolate for the control of SARS-CoV-2 variants

F1000Research 10:291, 2021.

Preprint

Reconstructing probabilistic trees of cellular differentiation from single-cell RNA-seq data

arXiv:1811.11790 [q-bio.QM], 2018.

Preprint PDF

Practical bounds on the error of Bayesian posterior approximations: A nonasymptotic approach

arXiv:1809.09505 [stat.TH], 2018.

Preprint PDF

Detailed Derivations of Small-variance Asymptotics for some Hierarchical Bayesian Nonparametric Models

arXiv:1501.00052 [stat.ML], 2014.

Preprint PDF

Infinite Structured Hidden Semi-Markov Models

arXiv:1407.0044 [stat.ME], 2014.

Preprint PDF

Recent & Upcoming Talks


Algorithmically robust, general-purpose variational inference
Apr 13, 2022
Statistically robust inference with stochastic gradient algorithms
Dec 14, 2021
Algorithmically robust, general-purpose variational inference
Apr 5, 2021
Algorithmically robust, general-purpose variational inference
Mar 17, 2021
Using bagged posteriors for robust inference
Mar 5, 2021
Algorithmically robust, general-purpose variational inference
Mar 4, 2021

Short Bio

Jonathan Huggins is an Assistant Professor in the Department of Mathematics & Statistics, a Data Science Faculty Fellow, and a Founding Member of the Faculty of Computing & Data Sciences at Boston University. He is also an affiliated faculty member of the BU Program in Bioinformatics. Prior to joining BU, he was a Postdoctoral Research Fellow in the Department of Biostatistics at Harvard. He completed his Ph.D. in Computer Science at the Massachusetts Institute of Technology in 2018; previously, he received a B.A. in Mathematics from Columbia University and an S.M. in Computer Science from MIT. His research centers on developing fast, trustworthy machine learning and AI methods that balance computational efficiency and statistical optimality against the inherent imperfections of real-world problems, large datasets, and complex models. His current applied work focuses on methods that enable more effective scientific discovery from high-throughput and multi-modal genomic data.

Contact

  • huggins -at- bu -dot- edu
  • MCS 233E, 111 Cummington Mall, Boston MA 02215, USA
  • Office hours: Monday 4:00–5:00 PM, Thursday 12:00–1:00 PM, or email for an appointment