Pierre Glaser

I am a final-year Machine Learning PhD student at the Gatsby Computational Neuroscience Unit. More about me here.



Recent Publications


ICML 2025 (ArXiv)

Efficiently Vectorized MCMC on Modern Accelerators

With the advent of automatic vectorization tools (e.g., JAX's vmap), writing multi-chain MCMC algorithms is now often as simple as invoking those tools on single-chain code. Whilst convenient, for various MCMC algorithms this results in a synchronization problem ...
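To illustrate the vectorization pattern mentioned in the abstract (not the paper's own algorithm), here is a minimal sketch in which a single-chain random-walk Metropolis step is turned into a multi-chain step with jax.vmap. The target log_prob, the rwm_step helper, and the step size are illustrative assumptions, not quantities from the paper.

import jax
import jax.numpy as jnp

def log_prob(x):
    # Assumed target: standard Gaussian, chosen only for illustration.
    return -0.5 * jnp.sum(x ** 2)

def rwm_step(key, x, step_size=0.5):
    # One random-walk Metropolis step for a single chain.
    key_prop, key_acc = jax.random.split(key)
    proposal = x + step_size * jax.random.normal(key_prop, x.shape)
    log_ratio = log_prob(proposal) - log_prob(x)
    accept = jnp.log(jax.random.uniform(key_acc)) < log_ratio
    return jnp.where(accept, proposal, x)

# vmap maps the single-chain step over a batch of chains.
n_chains, dim = 8, 2
keys = jax.random.split(jax.random.PRNGKey(0), n_chains)
xs = jnp.zeros((n_chains, dim))
xs = jax.vmap(rwm_step)(keys, xs)  # one step for all chains at once

Algorithms whose per-step work is data-dependent (e.g., variable numbers of inner iterations per chain) are where the synchronization issue the abstract alludes to arises.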



ICLR 2025 (bioRxiv)

SIMPL: Scalable and hassle-free optimisation of neural representations from behaviour

High-dimensional neural activity in the brain is known to encode low-dimensional, time-evolving, behaviour-related variables. A fundamental goal of neural data analysis is to identify such variables and their mapping to neural activity. The canonical approach is to assume ...



NeurIPS 2024 (pdf)

Near-Optimality of Contrastive Divergence Algorithms

We provide a non-asymptotic analysis of the contrastive divergence (CD) algorithm, a training method for unnormalized models. While prior work has established that (for exponential family distributions) the CD iterates asymptotically converge at an \(O(n^{-1/3})\) rate to the true parameter of the data distribution ...



JMLR 2025 (ArXiv)

(De)-regularized Maximum Mean Discrepancy Gradient Flow

We introduce a (de)-regularization of the Maximum Mean Discrepancy (DrMMD) and its Wasserstein gradient flow. Existing gradient flows that transport samples from a source distribution to a target distribution using only target samples either lack tractable numerical implementations ...



ICML 2024 (pdf)

Kernel-Based Evaluation of Conditional Biological Sequence Models

We propose a set of kernel-based tools to evaluate the designs and tune the hyperparameters of conditional sequence models, with a focus on problems in computational biology. The backbone of our tools is a new measure of discrepancy ...



UAI 2023, Spotlight presentation (pdf)

Fast and Scalable Score-Based Kernel Calibration Tests

We introduce the Kernel Calibration Conditional Stein Discrepancy test (KCCSD test), a nonparametric, kernel-based test for assessing the calibration of probabilistic models with well-defined scores. In contrast to previous methods, ...



October 2022 - Under Review (ArXiv)

Learning Unnormalized Models for Simulation-Based Inference

We introduce two synthetic likelihood methods for Simulation-Based Inference (SBI) to conduct either amortized or targeted inference from experimental observations when a high-fidelity simulator is available. Both methods learn a conditional...



NeurIPS 2021 (ArXiv)

KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint Support

We study the gradient flow for a relaxed approximation to the Kullback-Leibler (KL) divergence between a moving source and a fixed target distribution. This approximation, termed the KALE (KL approximate lower-bound estimator)...

News