Luhuan Wu

I am a second-year PhD student in Statistics at Columbia University, where I am very fortunate to work with Prof. John Cunningham and Prof. David Blei. Previously, I received my master's degree in Data Science at Columbia University, supervised by Prof. John Cunningham and Prof. Itsik Pe'er. I received my bachelor's degree in Mathematics at Nanjing University.

Email  /  CV  /  Google Scholar  /  Github

profile photo

I am broadly interested in probabilistic machine learning, including deep generative models, Gaussian processes, Bayesian learning, and variational inference. Below are my publications.

Bias-free scalable Gaussian processes via randomized truncations
Andres Potapczynski*, Luhuan Wu*, Dan Biderman*, Geoff Pleiss, John P. Cunningham
ICML, 2021   (Oral Presentation)
code / arXiv / talk / slides

We find that early-truncated conjugate gradients tend to underfit, while random Fourier features tend to overfit, in Gaussian process learning. We address both issues with randomized truncation estimators that eliminate bias in exchange for increased variance.
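The randomized-truncation idea can be illustrated with a simple Russian-roulette estimator: truncate a series at a random point and reweight the kept terms by their inverse survival probability, so the truncated sum is unbiased for the full sum. This is a generic sketch of the principle, not the paper's exact estimator; the function name and the geometric stopping rule are assumptions for illustration.

```python
import numpy as np

def russian_roulette_estimate(deltas, q=0.5, rng=None):
    """Unbiased estimate of sum(deltas) via randomized truncation.

    Draw a geometric stopping index J (capped at len(deltas)) and
    reweight each kept term delta_k by 1 / P(J >= k) = q**-(k-1).
    Smaller q truncates earlier (cheaper) but increases variance.
    """
    rng = np.random.default_rng() if rng is None else rng
    K = len(deltas)
    # J ~ Geometric(1 - q), so P(J >= k) = q**(k-1) for k <= K
    J = min(rng.geometric(1.0 - q), K)
    surv = q ** np.arange(J)  # survival probabilities for k = 1..J
    return float(np.sum(np.asarray(deltas[:J]) / surv))
```

Averaging many such estimates recovers the full sum in expectation, even though each individual estimate only touches a (random) prefix of the terms.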

Hierarchical Inducing Point Gaussian Process for Inter-domain Observations
Luhuan Wu*, Andrew Miller*, Lauren Anderson, Geoff Pleiss, David Blei, John P. Cunningham
AISTATS, 2021  
code / arXiv

We propose HIP-GP, an inter-domain GP inference method that scales to millions of inducing points. HIP-GP relies on gridded inducing points and stationary kernel assumptions, and is suitable for low-dimensional problems.

Variational Objectives for Markovian Dynamics with Backward Simulation
Antonio Khalil Moretti*, Zizhao Wang*, Luhuan Wu*, Iddo Drori, Itsik Pe'er
ECAI, 2020   (Oral Presentation)
code / paper

We propose Particle Smoothing Variational Objectives (SVO), which combine a novel backward-simulation technique with a smoothed approximate posterior defined through a subsampling process.

Inverse articulated-body dynamics from video via variational sequential Monte Carlo
Dan Biderman, Christian A Naesseth, Luhuan Wu, Taiga Abe, Alice C Mosberger, Leslie J Sibener, Rui Costa, James Murray, John P Cunningham
NeurIPS Workshop on Differentiable Vision, Graphics, and Physics in Machine Learning, 2020  
paper / talk

We propose a pipeline for inferring body dynamics from video: a convolutional network tracks joint positions, which we embed as the joints of a linked robotic manipulator; we then develop a probabilistic physical model whose states specify second-order rigid-body dynamics, and design a distributed nested SMC inference algorithm.

Smoothing nonlinear variational objectives with sequential Monte Carlo
Antonio Khalil Moretti*, Zizhao Wang*, Luhuan Wu, Itsik Pe’er
ICLR Workshop on Deep Generative Models for Highly Structured Data, 2019  
code / paper

We present a framework to develop Smoothed Variational Objectives (SVOs) that condition proposal distributions on the full time-ordered sequence of observations.

Teaching assistant
Columbia University

Probability and Statistics for Data Science, Fall 2021

Time Series Analysis, Summer 2021

Introduction to Probability and Statistics, Spring 2021

Introduction to Probabilistic Graphical Models, Fall 2020

Applied Machine Learning for Financial Modeling, Spring 2019

Website design from Jon Barron, source code here.