chenxwh / distilgpt2-stable-diffusion-v2

Descriptive Stable Diffusion prompt generation using GPT2

  • Public
  • 571 runs

Readme

Weights from: https://huggingface.co/FredZhang7/distilgpt2-stable-diffusion-v2

Cog implementation: https://github.com/chenxwh/cog-themed-diffusion/tree/distilgpt2
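As a rough sketch of calling this Cog packaging through the Replicate Python client: the input field name (`prompt`) and the output shape are assumptions here, not confirmed by this page, so check the model's API tab for the actual schema.

```python
# Minimal sketch, assuming the model exposes a text input named "prompt"
# and returns the generated prompt text. Field names are hypothetical.
import replicate

output = replicate.run(
    "chenxwh/distilgpt2-stable-diffusion-v2",
    input={"prompt": "a cat sitting"},  # hypothetical input name
)
print(output)
```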

DistilGPT2 Stable Diffusion V2

This model was trained on 2,470,000 descriptive stable diffusion prompts, continuing from the FredZhang7/distilgpt2-stable-diffusion checkpoint for another 4,270,000 steps.

Compared to other GPT2-based prompt generation models, this one runs forward propagation 50% faster and uses 40% less disk space and RAM.
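For reference, the underlying Hugging Face weights can also be used directly with transformers; the sketch below shows one way to do that, with illustrative sampling parameters rather than the settings used by this Cog deployment.

```python
# Sketch: expand a short seed phrase into descriptive Stable Diffusion prompts.
# Sampling parameters are illustrative, not this deployment's defaults.
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("distilgpt2")
model = GPT2LMHeadModel.from_pretrained("FredZhang7/distilgpt2-stable-diffusion-v2")

seed = "a cat sitting"
input_ids = tokenizer(seed, return_tensors="pt").input_ids
outputs = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.9,
    top_k=8,
    max_length=80,
    num_return_sequences=5,
    repetition_penalty=1.2,
    pad_token_id=tokenizer.eos_token_id,  # GPT2 has no pad token by default
)
for out in outputs:
    print(tokenizer.decode(out, skip_special_tokens=True))
```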

Major improvements from v1:

- 25% more variations
- faster and more fluent prompt generation
- cleaned training data:
  - removed prompts that generate images with NSFW scores > 0.5
  - removed duplicates, including prompts that differ only by capitalization or punctuation
  - removed punctuation at random places
  - removed prompts shorter than 15 characters