Score-Based Generative Modeling with Critically-Damped Langevin Diffusion
Inspired by statistical mechanics, we propose a novel forward diffusion process, the critically-damped Langevin diffusion (CLD). In CLD, the data variable is augmented with an additional “velocity” variable and a diffusion process is run in the joint data-velocity space. Data and velocity are coupled to each other as in Hamiltonian dynamics, and noise is injected only into the velocity variable. As in Hamiltonian Monte Carlo, the Hamiltonian component helps to efficiently traverse the joint data-velocity space and to transform the data distribution into the prior distribution more smoothly. We derive the corresponding score matching objective and show that for CLD the neural network is tasked with learning only the score of the conditional distribution of velocity given data, which is arguably easier than learning the score of the diffused data distribution directly. Using techniques from molecular dynamics, we also derive a novel SDE integrator tailored to CLD’s reverse-time synthesis SDE.
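For intuition, here is a minimal, self-contained sketch (not the code in this repository) of simulating the forward CLD SDE with a plain Euler–Maruyama loop. The function name, the defaults for `beta` and `M_inv`, and the zero-velocity initialization are illustrative assumptions; see the paper for the exact parameterization and for the tailored integrator used at sampling time.

```python
import numpy as np

def cld_forward_euler_maruyama(x0, beta=4.0, M_inv=4.0, n_steps=1000, T=1.0, rng=None):
    """Illustrative Euler-Maruyama simulation of the forward CLD SDE:

        dx = M^{-1} v * beta dt
        dv = (-x - Gamma * M^{-1} v) * beta dt + sqrt(2 * Gamma * beta) dW

    Critical damping couples the friction to the mass: Gamma^2 = 4 M,
    i.e. Gamma = 2 / sqrt(M_inv). Parameter defaults are assumptions
    for illustration, not the repository's settings.
    """
    rng = np.random.default_rng() if rng is None else rng
    Gamma = 2.0 / np.sqrt(M_inv)  # critical damping condition Gamma^2 = 4M
    dt = T / n_steps
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # illustrative; the paper draws v from a narrow Gaussian
    for _ in range(n_steps):
        dW = rng.standard_normal(x.shape) * np.sqrt(dt)  # Brownian increment
        x_new = x + M_inv * v * beta * dt                 # Hamiltonian coupling: no noise in x
        v_new = v + (-x - Gamma * M_inv * v) * beta * dt + np.sqrt(2.0 * Gamma * beta) * dW
        x, v = x_new, v_new
    return x, v
```

Note that noise enters only through the velocity update, as the abstract describes: the data coordinate `x` is driven purely by the Hamiltonian coupling to `v`, which is what makes the diffusion of the data distribution smooth.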
Citation
If you find the code useful for your research, please consider citing our ICLR paper:
```bibtex
@inproceedings{dockhorn2022scorebased,
    title={{Score-Based Generative Modeling with Critically-Damped Langevin Diffusion}},
    author={Tim Dockhorn and Arash Vahdat and Karsten Kreis},
    booktitle={International Conference on Learning Representations},
    year={2022},
}
```