


GLIDE finetuned on LAION-5B, then further finetuned on curated datasets.

Laionide (version 3)

Direct comparison to OpenAI's model using COCO captions

Shout out to stability.ai for donating to LAION the compute needed to make this possible.

Checkpoint:

- laionide-v3-base.pt

Run it on:

- Replicate
- Colab
- locally

Evaluation:

- Comparison to OpenAI's GLIDE (W&B report)

- You can use laionide-v2-sr.pt to upscale the outputs from laionide-v3-base.pt.
- Some outputs contain watermarks. You can try to prompt-engineer this away, but it isn't always possible; appending "royalty free" to the prompt seems to work well.

Training details:
- Finetuned laionide-v2-base.pt for 9 epochs on a subset of CC12M (~1.5 million pairs), COCO (~100K pairs), Visual Genome (~100K pairs), and Open Images localized annotations (~800K pairs).
- Captions were replaced with the unconditional (empty) token 20% of the time, per the GLIDE paper, which enables classifier-free guidance at sampling time.
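The caption-dropout step above can be sketched as follows. This is a minimal illustrative sketch, not the actual training code: `EMPTY_TOKEN`, the function names, and the list-based guidance mix are all assumptions for demonstration.

```python
import random

# Illustrative stand-in for GLIDE's unconditional/empty caption token.
EMPTY_TOKEN = ""

def maybe_drop_caption(caption, p_drop=0.2, rng=random):
    """Replace the caption with the empty token with probability p_drop.

    Training with dropped captions teaches the model an unconditional
    mode, which is what makes classifier-free guidance possible later.
    """
    return EMPTY_TOKEN if rng.random() < p_drop else caption

def classifier_free_guidance(eps_cond, eps_uncond, scale):
    """Mix conditional and unconditional noise predictions:
    eps = eps_uncond + scale * (eps_cond - eps_uncond).
    Operates elementwise on plain lists for illustration.
    """
    return [u + scale * (c - u) for c, u in zip(eps_cond, eps_uncond)]
```

With `scale = 1.0` the mix reduces to the purely conditional prediction; larger scales push samples further toward the caption at some cost in diversity.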
