laion-ai / laionide-v3

GLIDE finetuned on LAION5B, then more on curated datasets.

  • Public
  • 62K runs
  • GitHub
  • Paper
  • License



Run time and cost

This model runs on Nvidia T4 GPU hardware. Predictions typically complete within 4 minutes, though prediction time varies significantly with the inputs.


Laionide (version 3)

Direct comparison to OpenAI’s model using COCO captions

Shout out to for donating to LAION the compute needed to make this possible.

Files: -

Inference:

- replicate
- colab
- locally
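For the Replicate route, here is a minimal sketch using the official `replicate` Python client. The input keys accepted by this model (`prompt`, `batch_size`) are assumptions on my part; check the model's API tab for the exact schema and version hash.

```python
# Sketch only: input keys and the unpinned model identifier are
# assumptions; consult the Replicate model page for the real schema.

def build_input(prompt, batch_size=1):
    # Assemble the prediction payload for the text-to-image model.
    return {"prompt": prompt, "batch_size": batch_size}

if __name__ == "__main__":
    import replicate  # pip install replicate; needs REPLICATE_API_TOKEN set
    output = replicate.run(
        "laion-ai/laionide-v3",  # optionally pin a specific version hash
        input=build_input("an oil painting of a corgi, royalty free"),
    )
    print(output)
```

The actual network call is kept behind the `__main__` guard so the helper can be reused (e.g. for batching prompts) without triggering a prediction.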

Results:

- comparison to OpenAI (W&B report)

Notes:

- You can use to upscale the outputs from
- There are watermarks in some outputs. You can try to prompt-engineer this away, but it isn't always possible. Adding "royalty free" to the prompt seems to work well.

Training details:

- Finetuned for 9 epochs on a subset of CC12M (~1.5 million pairs), COCO (~100K pairs), Visual Genome (~100K pairs), and Open Images localized annotations (~800K pairs).
- Captions were replaced with the unconditional/empty token 20% of the time, per the GLIDE paper.
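The 20% empty-token rate is the standard classifier-free guidance training recipe from the GLIDE paper: some fraction of captions are dropped so the model also learns an unconditional distribution. A minimal sketch of that caption-dropping step (function name and structure are my own, not from the training code):

```python
import random

def maybe_drop_caption(caption, p_uncond=0.2, rng=random):
    # With probability p_uncond, train on the empty caption so the model
    # also learns p(image) alongside p(image | caption); at sampling time
    # the two predictions are blended (classifier-free guidance).
    return "" if rng.random() < p_uncond else caption
```

At sampling time, the guidance scale then controls how strongly the conditional prediction is weighted against the unconditional one.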