
Trajectory Consistency Distillation

Official model repository for the paper Trajectory Consistency Distillation. For more information, please check the GitHub Repo and Project Page.

You are also welcome to try the demo hosted on 🤗 Space.

Model Description:

TCD, inspired by Consistency Models, is a novel distillation technique that distills knowledge from a pre-trained diffusion model into a few-step sampler. In this repository, we release the inference code and our model TCD-SDXL, which is distilled from SDXL Base 1.0.
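
As a quick illustration of few-step sampling with the distilled model, here is a minimal sketch using 🤗 Diffusers. It assumes a diffusers version that ships TCDScheduler (v0.27+); the h1t/TCD-SDXL-LoRA repo id and the prompt are illustrative assumptions, so swap in the weights you actually use:

```python
# Minimal TCD-SDXL inference sketch. Assumes diffusers >= 0.27 (which ships
# TCDScheduler) and TCD LoRA weights published as "h1t/TCD-SDXL-LoRA" on the
# Hugging Face Hub; both the repo id and the prompt are assumptions.
import torch
from diffusers import StableDiffusionXLPipeline, TCDScheduler

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
).to("cuda")

# Swap in the TCD scheduler and attach the distilled LoRA.
pipe.scheduler = TCDScheduler.from_config(pipe.scheduler.config)
pipe.load_lora_weights("h1t/TCD-SDXL-LoRA")

image = pipe(
    prompt="a photo of an astronaut riding a horse on mars",
    num_inference_steps=4,   # few-step sampling enabled by distillation
    guidance_scale=0.0,      # distilled samplers are typically run without CFG
    eta=0.3,                 # diffusers exposes the paper's gamma as `eta`
    generator=torch.Generator("cuda").manual_seed(0),
).images[0]
image.save("tcd_sdxl_4step.png")
```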

✨ TCD has the following advantages:

  • Flexible NFEs: Unlike Turbo, the number of function evaluations (NFEs) for TCD can be varied at will without degrading result quality, whereas LCMs experience a notable decline in quality at high NFEs (see the first sketch after this list).

  • Better than Teacher: TCD maintains superior generative quality at high NFEs, even exceeding the performance of DPM-Solver++(2S) with the original SDXL. Notably, no additional discriminator or LPIPS supervision is used during training.

  • Freely Change the Detailing: During inference, the level of detail in the image can be modified by adjusting a single hyper-parameter, gamma, without introducing any additional parameters (see the first sketch after this list).

  • Versatility: Packaged with LoRA technology, TCD can be directly applied to various models that share the same backbone, including custom community models, style LoRAs, ControlNet, and IP-Adapter (see the second sketch after this list).

  • Avoiding Mode Collapse: TCD achieves few-step generation without adversarial training, circumventing the mode collapse caused by the GAN objective. In contrast to the concurrent SDXL-Lightning, which relies on Adversarial Diffusion Distillation, TCD synthesizes results that are more realistic and slightly more diverse, without “Janus” artifacts.
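
Continuing from the pipeline built above, here is a small sketch of the first three points: the same distilled model is sampled at several NFEs and gamma values with no retraining. The step counts, gamma values, and prompt are illustrative assumptions; judge the effect of gamma from the saved images:

```python
# Sweep NFEs and gamma on the pipeline built above; no retraining needed.
# The step counts, gamma values, and prompt are illustrative choices.
for steps in (2, 4, 8):
    for gamma in (0.0, 0.3, 0.6):
        image = pipe(
            prompt="a serene mountain lake at dawn",
            num_inference_steps=steps,  # flexible NFEs
            guidance_scale=0.0,
            eta=gamma,                  # gamma, exposed as `eta` in diffusers
            generator=torch.Generator("cuda").manual_seed(0),
        ).images[0]
        image.save(f"tcd_steps{steps}_gamma{gamma}.png")
```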
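
And a sketch of the versatility point, combining the TCD LoRA with a style LoRA through diffusers' multi-adapter API; the style repo id, adapter weights, and prompt are placeholders, not official choices:

```python
# Combine TCD with a style LoRA on the same SDXL backbone. The style repo id
# and adapter weights below are placeholders, not official recommendations.
pipe.unload_lora_weights()  # drop the LoRA loaded earlier before naming adapters
pipe.load_lora_weights("h1t/TCD-SDXL-LoRA", adapter_name="tcd")
pipe.load_lora_weights("some-user/sdxl-style-lora", adapter_name="style")
pipe.set_adapters(["tcd", "style"], adapter_weights=[1.0, 0.8])

image = pipe(
    prompt="a cat, in the loaded art style",
    num_inference_steps=4,
    guidance_scale=0.0,
    eta=0.3,
).images[0]
image.save("tcd_styled.png")
```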

For more information, please refer to our paper Trajectory Consistency Distillation.

BibTeX

@misc{2402.19159,
  author = {Jianbin Zheng and Minghui Hu and Zhongyi Fan and Chaoyue Wang and Changxing Ding and Dacheng Tao and Tat-Jen Cham},
  title  = {Trajectory Consistency Distillation: Improved Latent Consistency Distillation by Semi-Linear Consistency Function with Trajectory Mapping},
  year   = {2024},
  eprint = {arXiv:2402.19159},
}