Trajectory Consistency Distillation
Introduction
TCD, inspired by Consistency Models, is a novel distillation technique that distills knowledge from a pre-trained diffusion model into a few-step sampler. In this repository, we release the inference code and our model TCD-SDXL, which is distilled from SDXL Base 1.0. We provide the LoRA checkpoint in this 🔥repository.
⭐ TCD has the following advantages:
Flexible NFEs: For TCD, the NFEs can be varied at will (compared with SDXL Turbo) without adversely affecting the quality of the results (compared with LCM, which experiences a notable decline in quality at high NFEs).
Better than Teacher: TCD maintains superior generative quality at high NFEs, even exceeding the performance of DPM-Solver++(2S) with the original SDXL. It is worth noting that no additional discriminator or LPIPS supervision is included during training.
Freely Change the Detailing: During inference, the level of detail in the image can be modified simply by adjusting one hyper-parameter, gamma. This option does not require the introduction of any additional parameters.
Versatility: Integrated with LoRA technology, TCD can be directly applied to various models (including custom community models, styled LoRA, ControlNet, and IP-Adapter) that share the same backbone, as demonstrated in the Usage section.
Avoiding Mode Collapse: TCD achieves few-step generation without the need for adversarial training, thus circumventing mode collapse caused by the GAN objective. In contrast to the concurrent work SDXL-Lightning, which relies on Adversarial Diffusion Distillation, TCD can synthesize results that are more realistic and slightly more diverse, without the presence of "Janus" artifacts.
For more information, please refer to our paper Trajectory Consistency Distillation.
Citation
@misc{zheng2024trajectory,
title={Trajectory Consistency Distillation},
author={Jianbin Zheng and Minghui Hu and Zhongyi Fan and Chaoyue Wang and Changxing Ding and Dacheng Tao and Tat-Jen Cham},
year={2024},
eprint={2402.19159},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
Acknowledgments
This codebase heavily relies on the 🤗Diffusers library and LCM.