fmosele/synthiola
Synthiola is a synthetic being living in SDXL latent space: https://www.fabianmosele.com/synthiola
- Public
- 243 runs
- L40S hardware
Prediction
fmosele/synthiola:7e2989f9
ID: 3khauotb5pqigym2euqwatuyby
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
{ "width": 1024, "height": 1024, "prompt": "A photo of TOK female eating a burger sloppy", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }
Install Replicate’s Node.js client library:
npm install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import and set up the client:
import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
  {
    input: {
      width: 1024,
      height: 1024,
      prompt: "A photo of TOK female eating a burger sloppy",
      refine: "no_refiner",
      scheduler: "K_EULER",
      lora_scale: 1,
      num_outputs: 1,
      guidance_scale: 7.5,
      apply_watermark: true,
      high_noise_frac: 0.8,
      prompt_strength: 0.8,
      num_inference_steps: 50
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import the client:
import replicate
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
    input={
        "width": 1024,
        "height": 1024,
        "prompt": "A photo of TOK female eating a burger sloppy",
        "refine": "no_refiner",
        "scheduler": "K_EULER",
        "lora_scale": 1,
        "num_outputs": 1,
        "guidance_scale": 7.5,
        "apply_watermark": True,
        "high_noise_frac": 0.8,
        "prompt_strength": 0.8,
        "num_inference_steps": 50
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
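The output of this model is a list of image URLs (see the example output below). If you want the files on disk, here is a minimal sketch using only the standard library; it assumes `output` is the list of URL strings returned by the replicate.run call above (newer client versions may return file objects instead), and the filenames are arbitrary:
import urllib.request

# `output` is the list of image URLs returned by replicate.run above.
for i, url in enumerate(output):
    filename = f"out-{i}.png"
    urllib.request.urlretrieve(url, filename)
    print(f"saved {filename}")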
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{ "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da", "input": { "width": 1024, "height": 1024, "prompt": "A photo of TOK female eating a burger sloppy", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
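If you prefer to call the HTTP API from Python without the client library, here is a rough sketch of the same blocking request as the curl command above. The endpoint, headers, and body are taken from that command; the `requests` package is an assumption, not part of the examples on this page:
import os
import requests

# Same request as the curl example above: create a prediction and,
# because of the "Prefer: wait" header, block until it finishes.
resp = requests.post(
    "https://api.replicate.com/v1/predictions",
    headers={
        "Authorization": f"Bearer {os.environ['REPLICATE_API_TOKEN']}",
        "Content-Type": "application/json",
        "Prefer": "wait",
    },
    json={
        "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
        "input": {
            "width": 1024,
            "height": 1024,
            "prompt": "A photo of TOK female eating a burger sloppy",
            "num_inference_steps": 50,
        },
    },
)
prediction = resp.json()
print(prediction["status"], prediction.get("output"))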
You can run this model locally using Cog. First, install Cog:
brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da \
  -i 'width=1024' \
  -i 'height=1024' \
  -i 'prompt="A photo of TOK female eating a burger sloppy"' \
  -i 'refine="no_refiner"' \
  -i 'scheduler="K_EULER"' \
  -i 'lora_scale=1' \
  -i 'num_outputs=1' \
  -i 'guidance_scale=7.5' \
  -i 'apply_watermark=true' \
  -i 'high_noise_frac=0.8' \
  -i 'prompt_strength=0.8' \
  -i 'num_inference_steps=50'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{ "input": { "width": 1024, "height": 1024, "prompt": "A photo of TOK female eating a burger sloppy", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
{ "completed_at": "2023-08-15T12:56:14.924436Z", "created_at": "2023-08-15T12:55:59.223502Z", "data_removed": false, "error": null, "id": "3khauotb5pqigym2euqwatuyby", "input": { "width": 1024, "height": 1024, "prompt": "A photo of TOK female eating a burger sloppy", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }, "logs": "Using seed: 10635\nPrompt: A photo of <s0><s1> female eating a burger sloppy\ntxt2img mode\n 0%| | 0/50 [00:00<?, ?it/s]\n 2%|▏ | 1/50 [00:00<00:13, 3.71it/s]\n 4%|▍ | 2/50 [00:00<00:12, 3.70it/s]\n 6%|▌ | 3/50 [00:00<00:12, 3.69it/s]\n 8%|▊ | 4/50 [00:01<00:12, 3.67it/s]\n 10%|█ | 5/50 [00:01<00:12, 3.68it/s]\n 12%|█▏ | 6/50 [00:01<00:11, 3.69it/s]\n 14%|█▍ | 7/50 [00:01<00:11, 3.69it/s]\n 16%|█▌ | 8/50 [00:02<00:11, 3.69it/s]\n 18%|█▊ | 9/50 [00:02<00:11, 3.69it/s]\n 20%|██ | 10/50 [00:02<00:10, 3.69it/s]\n 22%|██▏ | 11/50 [00:02<00:10, 3.69it/s]\n 24%|██▍ | 12/50 [00:03<00:10, 3.69it/s]\n 26%|██▌ | 13/50 [00:03<00:10, 3.69it/s]\n 28%|██▊ | 14/50 [00:03<00:09, 3.69it/s]\n 30%|███ | 15/50 [00:04<00:09, 3.69it/s]\n 32%|███▏ | 16/50 [00:04<00:09, 3.69it/s]\n 34%|███▍ | 17/50 [00:04<00:08, 3.69it/s]\n 36%|███▌ | 18/50 [00:04<00:08, 3.69it/s]\n 38%|███▊ | 19/50 [00:05<00:08, 3.69it/s]\n 40%|████ | 20/50 [00:05<00:08, 3.69it/s]\n 42%|████▏ | 21/50 [00:05<00:07, 3.69it/s]\n 44%|████▍ | 22/50 [00:05<00:07, 3.69it/s]\n 46%|████▌ | 23/50 [00:06<00:07, 3.69it/s]\n 48%|████▊ | 24/50 [00:06<00:07, 3.68it/s]\n 50%|█████ | 25/50 [00:06<00:06, 3.69it/s]\n 52%|█████▏ | 26/50 [00:07<00:06, 3.68it/s]\n 54%|█████▍ | 27/50 [00:07<00:06, 3.68it/s]\n 56%|█████▌ | 28/50 [00:07<00:05, 3.68it/s]\n 58%|█████▊ | 29/50 [00:07<00:05, 3.68it/s]\n 60%|██████ | 30/50 [00:08<00:05, 3.68it/s]\n 62%|██████▏ | 31/50 [00:08<00:05, 3.68it/s]\n 64%|██████▍ | 32/50 [00:08<00:04, 3.68it/s]\n 66%|██████▌ | 33/50 [00:08<00:04, 3.68it/s]\n 68%|██████▊ | 34/50 [00:09<00:04, 3.68it/s]\n 70%|███████ | 35/50 [00:09<00:04, 3.68it/s]\n 72%|███████▏ | 36/50 [00:09<00:03, 3.68it/s]\n 74%|███████▍ | 37/50 [00:10<00:03, 3.68it/s]\n 76%|███████▌ | 38/50 [00:10<00:03, 3.68it/s]\n 78%|███████▊ | 39/50 [00:10<00:02, 3.68it/s]\n 80%|████████ | 40/50 [00:10<00:02, 3.68it/s]\n 82%|████████▏ | 41/50 [00:11<00:02, 3.68it/s]\n 84%|████████▍ | 42/50 [00:11<00:02, 3.68it/s]\n 86%|████████▌ | 43/50 [00:11<00:01, 3.68it/s]\n 88%|████████▊ | 44/50 [00:11<00:01, 3.68it/s]\n 90%|█████████ | 45/50 [00:12<00:01, 3.68it/s]\n 92%|█████████▏| 46/50 [00:12<00:01, 3.67it/s]\n 94%|█████████▍| 47/50 [00:12<00:00, 3.67it/s]\n 96%|█████████▌| 48/50 [00:13<00:00, 3.67it/s]\n 98%|█████████▊| 49/50 [00:13<00:00, 3.67it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.67it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.68it/s]", "metrics": { "predict_time": 15.697142, "total_time": 15.700934 }, "output": [ "https://replicate.delivery/pbxt/77czFskV3M58HBLS0jdBMNrCq9vwC4dosXvCe59wg8j2PJtIA/out-0.png" ], "started_at": "2023-08-15T12:55:59.227294Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/3khauotb5pqigym2euqwatuyby", "cancel": "https://api.replicate.com/v1/predictions/3khauotb5pqigym2euqwatuyby/cancel" }, "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da" }
Prediction
fmosele/synthiola:7e2989f9
ID: ahrcxktbqq42kusfkepptiydyi
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
{ "width": 1024, "height": 1024, "prompt": "A render of TOK from My Little Pony TV show", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }
Install Replicate’s Node.js client library:
npm install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import and set up the client:
import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
  {
    input: {
      width: 1024,
      height: 1024,
      prompt: "A render of TOK from My Little Pony TV show",
      refine: "no_refiner",
      scheduler: "K_EULER",
      lora_scale: 1,
      num_outputs: 1,
      guidance_scale: 7.5,
      apply_watermark: true,
      high_noise_frac: 0.8,
      prompt_strength: 0.8,
      num_inference_steps: 50
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import the client:
import replicate
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
    input={
        "width": 1024,
        "height": 1024,
        "prompt": "A render of TOK from My Little Pony TV show",
        "refine": "no_refiner",
        "scheduler": "K_EULER",
        "lora_scale": 1,
        "num_outputs": 1,
        "guidance_scale": 7.5,
        "apply_watermark": True,
        "high_noise_frac": 0.8,
        "prompt_strength": 0.8,
        "num_inference_steps": 50
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{ "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da", "input": { "width": 1024, "height": 1024, "prompt": "A render of TOK from My Little Pony TV show", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
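Long-running predictions may not finish within a single blocking request. Below is a minimal polling sketch over the HTTP API, assuming the prediction JSON exposes the `status` and `urls.get` fields shown in the example output on this page and that `succeeded`, `failed`, and `canceled` are the terminal statuses; the `requests` package is an assumption:
import os
import time
import requests

headers = {
    "Authorization": f"Bearer {os.environ['REPLICATE_API_TOKEN']}",
    "Content-Type": "application/json",
}

# Create the prediction without the "Prefer: wait" header ...
prediction = requests.post(
    "https://api.replicate.com/v1/predictions",
    headers=headers,
    json={
        "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
        "input": {"prompt": "A render of TOK from My Little Pony TV show"},
    },
).json()

# ... then poll its `get` URL until it reaches a terminal status.
while prediction["status"] not in ("succeeded", "failed", "canceled"):
    time.sleep(2)
    prediction = requests.get(prediction["urls"]["get"], headers=headers).json()

print(prediction["status"], prediction.get("output"))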
You can run this model locally using Cog. First, install Cog:
brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da \
  -i 'width=1024' \
  -i 'height=1024' \
  -i 'prompt="A render of TOK from My Little Pony TV show"' \
  -i 'refine="no_refiner"' \
  -i 'scheduler="K_EULER"' \
  -i 'lora_scale=1' \
  -i 'num_outputs=1' \
  -i 'guidance_scale=7.5' \
  -i 'apply_watermark=true' \
  -i 'high_noise_frac=0.8' \
  -i 'prompt_strength=0.8' \
  -i 'num_inference_steps=50'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{ "input": { "width": 1024, "height": 1024, "prompt": "A render of TOK from My Little Pony TV show", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
{ "completed_at": "2023-08-15T12:58:24.601620Z", "created_at": "2023-08-15T12:58:09.130742Z", "data_removed": false, "error": null, "id": "ahrcxktbqq42kusfkepptiydyi", "input": { "width": 1024, "height": 1024, "prompt": "A render of TOK from My Little Pony TV show", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }, "logs": "Using seed: 33499\nPrompt: A render of <s0><s1> from My Little Pony TV show\ntxt2img mode\n 0%| | 0/50 [00:00<?, ?it/s]\n 2%|▏ | 1/50 [00:00<00:13, 3.70it/s]\n 4%|▍ | 2/50 [00:00<00:13, 3.69it/s]\n 6%|▌ | 3/50 [00:00<00:12, 3.69it/s]\n 8%|▊ | 4/50 [00:01<00:12, 3.68it/s]\n 10%|█ | 5/50 [00:01<00:12, 3.69it/s]\n 12%|█▏ | 6/50 [00:01<00:11, 3.69it/s]\n 14%|█▍ | 7/50 [00:01<00:11, 3.69it/s]\n 16%|█▌ | 8/50 [00:02<00:11, 3.69it/s]\n 18%|█▊ | 9/50 [00:02<00:11, 3.69it/s]\n 20%|██ | 10/50 [00:02<00:10, 3.69it/s]\n 22%|██▏ | 11/50 [00:02<00:10, 3.69it/s]\n 24%|██▍ | 12/50 [00:03<00:10, 3.69it/s]\n 26%|██▌ | 13/50 [00:03<00:10, 3.69it/s]\n 28%|██▊ | 14/50 [00:03<00:09, 3.69it/s]\n 30%|███ | 15/50 [00:04<00:09, 3.68it/s]\n 32%|███▏ | 16/50 [00:04<00:09, 3.68it/s]\n 34%|███▍ | 17/50 [00:04<00:08, 3.68it/s]\n 36%|███▌ | 18/50 [00:04<00:08, 3.68it/s]\n 38%|███▊ | 19/50 [00:05<00:08, 3.68it/s]\n 40%|████ | 20/50 [00:05<00:08, 3.68it/s]\n 42%|████▏ | 21/50 [00:05<00:07, 3.68it/s]\n 44%|████▍ | 22/50 [00:05<00:07, 3.68it/s]\n 46%|████▌ | 23/50 [00:06<00:07, 3.68it/s]\n 48%|████▊ | 24/50 [00:06<00:07, 3.68it/s]\n 50%|█████ | 25/50 [00:06<00:06, 3.68it/s]\n 52%|█████▏ | 26/50 [00:07<00:06, 3.68it/s]\n 54%|█████▍ | 27/50 [00:07<00:06, 3.67it/s]\n 56%|█████▌ | 28/50 [00:07<00:05, 3.68it/s]\n 58%|█████▊ | 29/50 [00:07<00:05, 3.68it/s]\n 60%|██████ | 30/50 [00:08<00:05, 3.68it/s]\n 62%|██████▏ | 31/50 [00:08<00:05, 3.68it/s]\n 64%|██████▍ | 32/50 [00:08<00:04, 3.68it/s]\n 66%|██████▌ | 33/50 [00:08<00:04, 3.68it/s]\n 68%|██████▊ | 34/50 [00:09<00:04, 3.68it/s]\n 70%|███████ | 35/50 [00:09<00:04, 3.68it/s]\n 72%|███████▏ | 36/50 [00:09<00:03, 3.67it/s]\n 74%|███████▍ | 37/50 [00:10<00:03, 3.67it/s]\n 76%|███████▌ | 38/50 [00:10<00:03, 3.67it/s]\n 78%|███████▊ | 39/50 [00:10<00:02, 3.67it/s]\n 80%|████████ | 40/50 [00:10<00:02, 3.67it/s]\n 82%|████████▏ | 41/50 [00:11<00:02, 3.67it/s]\n 84%|████████▍ | 42/50 [00:11<00:02, 3.67it/s]\n 86%|████████▌ | 43/50 [00:11<00:01, 3.67it/s]\n 88%|████████▊ | 44/50 [00:11<00:01, 3.67it/s]\n 90%|█████████ | 45/50 [00:12<00:01, 3.67it/s]\n 92%|█████████▏| 46/50 [00:12<00:01, 3.67it/s]\n 94%|█████████▍| 47/50 [00:12<00:00, 3.67it/s]\n 96%|█████████▌| 48/50 [00:13<00:00, 3.67it/s]\n 98%|█████████▊| 49/50 [00:13<00:00, 3.67it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.67it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.68it/s]", "metrics": { "predict_time": 15.475982, "total_time": 15.470878 }, "output": [ "https://replicate.delivery/pbxt/aCifp2lXb92ady5jyAsTPxxSfUfGiadDYl3chf2tt5P9GKpFB/out-0.png" ], "started_at": "2023-08-15T12:58:09.125638Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/ahrcxktbqq42kusfkepptiydyi", "cancel": "https://api.replicate.com/v1/predictions/ahrcxktbqq42kusfkepptiydyi/cancel" }, "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da" }
Prediction
fmosele/synthiola:7e2989f9
ID: eof24ltbxzpvraxpgd76yky6xe
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
{ "width": 1288, "height": 1024, "prompt": "street fashion clothes inspired by TOK ", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 4, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }
Install Replicate’s Node.js client library:
npm install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import and set up the client:
import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
  {
    input: {
      width: 1288,
      height: 1024,
      prompt: "street fashion clothes inspired by TOK ",
      refine: "no_refiner",
      scheduler: "K_EULER",
      lora_scale: 1,
      num_outputs: 4,
      guidance_scale: 7.5,
      apply_watermark: true,
      high_noise_frac: 0.8,
      prompt_strength: 0.8,
      num_inference_steps: 50
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import the client:
import replicate
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
    input={
        "width": 1288,
        "height": 1024,
        "prompt": "street fashion clothes inspired by TOK ",
        "refine": "no_refiner",
        "scheduler": "K_EULER",
        "lora_scale": 1,
        "num_outputs": 4,
        "guidance_scale": 7.5,
        "apply_watermark": True,
        "high_noise_frac": 0.8,
        "prompt_strength": 0.8,
        "num_inference_steps": 50
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{ "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da", "input": { "width": 1288, "height": 1024, "prompt": "street fashion clothes inspired by TOK ", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 4, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:
brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da \
  -i 'width=1288' \
  -i 'height=1024' \
  -i 'prompt="street fashion clothes inspired by TOK "' \
  -i 'refine="no_refiner"' \
  -i 'scheduler="K_EULER"' \
  -i 'lora_scale=1' \
  -i 'num_outputs=4' \
  -i 'guidance_scale=7.5' \
  -i 'apply_watermark=true' \
  -i 'high_noise_frac=0.8' \
  -i 'prompt_strength=0.8' \
  -i 'num_inference_steps=50'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{ "input": { "width": 1288, "height": 1024, "prompt": "street fashion clothes inspired by TOK ", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 4, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
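The same local request can be made from Python. This is a quick sketch mirroring the curl call above; the `requests` package and the shape of the local server's response (assumed here to carry `status` and `output` fields like the prediction records shown on this page) are assumptions:
import requests

# POST the same payload as the curl example above to the local Cog server
# started by the docker run command.
resp = requests.post(
    "http://localhost:5000/predictions",
    json={
        "input": {
            "width": 1288,
            "height": 1024,
            "prompt": "street fashion clothes inspired by TOK ",
            "num_outputs": 4,
        }
    },
)
result = resp.json()
print(result.get("status"), result.get("output"))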
Output
{ "completed_at": "2023-08-15T13:07:29.301802Z", "created_at": "2023-08-15T13:06:16.506991Z", "data_removed": false, "error": null, "id": "eof24ltbxzpvraxpgd76yky6xe", "input": { "width": 1288, "height": 1024, "prompt": "street fashion clothes inspired by TOK ", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 4, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }, "logs": "Using seed: 51652\nPrompt: street fashion clothes inspired by <s0><s1>\ntxt2img mode\n 0%| | 0/50 [00:00<?, ?it/s]\n 2%|▏ | 1/50 [00:01<01:03, 1.29s/it]\n 4%|▍ | 2/50 [00:02<01:02, 1.30s/it]\n 6%|▌ | 3/50 [00:03<01:00, 1.30s/it]\n 8%|▊ | 4/50 [00:05<00:59, 1.30s/it]\n 10%|█ | 5/50 [00:06<00:58, 1.30s/it]\n 12%|█▏ | 6/50 [00:07<00:57, 1.30s/it]\n 14%|█▍ | 7/50 [00:09<00:55, 1.30s/it]\n 16%|█▌ | 8/50 [00:10<00:54, 1.29s/it]\n 18%|█▊ | 9/50 [00:11<00:53, 1.29s/it]\n 20%|██ | 10/50 [00:12<00:51, 1.29s/it]\n 22%|██▏ | 11/50 [00:14<00:50, 1.29s/it]\n 24%|██▍ | 12/50 [00:15<00:49, 1.29s/it]\n 26%|██▌ | 13/50 [00:16<00:47, 1.29s/it]\n 28%|██▊ | 14/50 [00:18<00:46, 1.29s/it]\n 30%|███ | 15/50 [00:19<00:45, 1.29s/it]\n 32%|███▏ | 16/50 [00:20<00:44, 1.30s/it]\n 34%|███▍ | 17/50 [00:22<00:42, 1.30s/it]\n 36%|███▌ | 18/50 [00:23<00:41, 1.30s/it]\n 38%|███▊ | 19/50 [00:24<00:40, 1.30s/it]\n 40%|████ | 20/50 [00:25<00:38, 1.30s/it]\n 42%|████▏ | 21/50 [00:27<00:37, 1.30s/it]\n 44%|████▍ | 22/50 [00:28<00:36, 1.30s/it]\n 46%|████▌ | 23/50 [00:29<00:35, 1.30s/it]\n 48%|████▊ | 24/50 [00:31<00:33, 1.30s/it]\n 50%|█████ | 25/50 [00:32<00:32, 1.30s/it]\n 52%|█████▏ | 26/50 [00:33<00:31, 1.30s/it]\n 54%|█████▍ | 27/50 [00:35<00:29, 1.30s/it]\n 56%|█████▌ | 28/50 [00:36<00:28, 1.31s/it]\n 58%|█████▊ | 29/50 [00:37<00:27, 1.31s/it]\n 60%|██████ | 30/50 [00:38<00:26, 1.31s/it]\n 62%|██████▏ | 31/50 [00:40<00:24, 1.30s/it]\n 64%|██████▍ | 32/50 [00:41<00:23, 1.30s/it]\n 66%|██████▌ | 33/50 [00:42<00:22, 1.30s/it]\n 68%|██████▊ | 34/50 [00:44<00:20, 1.30s/it]\n 70%|███████ | 35/50 [00:45<00:19, 1.30s/it]\n 72%|███████▏ | 36/50 [00:46<00:18, 1.30s/it]\n 74%|███████▍ | 37/50 [00:48<00:16, 1.31s/it]\n 76%|███████▌ | 38/50 [00:49<00:15, 1.31s/it]\n 78%|███████▊ | 39/50 [00:50<00:14, 1.31s/it]\n 80%|████████ | 40/50 [00:52<00:13, 1.31s/it]\n 82%|████████▏ | 41/50 [00:53<00:11, 1.31s/it]\n 84%|████████▍ | 42/50 [00:54<00:10, 1.31s/it]\n 86%|████████▌ | 43/50 [00:55<00:09, 1.31s/it]\n 88%|████████▊ | 44/50 [00:57<00:07, 1.31s/it]\n 90%|█████████ | 45/50 [00:58<00:06, 1.31s/it]\n 92%|█████████▏| 46/50 [00:59<00:05, 1.31s/it]\n 94%|█████████▍| 47/50 [01:01<00:03, 1.30s/it]\n 96%|█████████▌| 48/50 [01:02<00:02, 1.30s/it]\n 98%|█████████▊| 49/50 [01:03<00:01, 1.30s/it]\n100%|██████████| 50/50 [01:05<00:00, 1.30s/it]\n100%|██████████| 50/50 [01:05<00:00, 1.30s/it]", "metrics": { "predict_time": 72.788663, "total_time": 72.794811 }, "output": [ "https://replicate.delivery/pbxt/DVZLLAE9G1pNIpvBxbzAftdlF8YShBi3b5Gf6UGzUQ1OqSaRA/out-0.png", "https://replicate.delivery/pbxt/QengfLdxSoihNEMUZYMXXcl1mHzYak4IDxmR1WvK275PqSaRA/out-1.png", "https://replicate.delivery/pbxt/eX7LfqZytzvKepn7j58iudAWrTiO1eGbxlTki6yy0jQApKpFB/out-2.png", "https://replicate.delivery/pbxt/fKerGIVBasuKtEJJRswuCyz81twWuxfpz1qEKis0gp1gUl0iA/out-3.png" ], "started_at": "2023-08-15T13:06:16.513139Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/eof24ltbxzpvraxpgd76yky6xe", "cancel": 
"https://api.replicate.com/v1/predictions/eof24ltbxzpvraxpgd76yky6xe/cancel" }, "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da" }
Prediction
fmosele/synthiola:7e2989f9
ID: iredootbh5iij7zlpithmkpzxi
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
{ "width": 1024, "height": 1024, "prompt": "a render of TOK from Cocomelon video as TOK skiing", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }
Install Replicate’s Node.js client library:
npm install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import and set up the client:
import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
  {
    input: {
      width: 1024,
      height: 1024,
      prompt: "a render of TOK from Cocomelon video as TOK skiing",
      refine: "no_refiner",
      scheduler: "K_EULER",
      lora_scale: 1,
      num_outputs: 1,
      guidance_scale: 7.5,
      apply_watermark: true,
      high_noise_frac: 0.8,
      prompt_strength: 0.8,
      num_inference_steps: 50
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import the client:
import replicate
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "fmosele/synthiola:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
    input={
        "width": 1024,
        "height": 1024,
        "prompt": "a render of TOK from Cocomelon video as TOK skiing",
        "refine": "no_refiner",
        "scheduler": "K_EULER",
        "lora_scale": 1,
        "num_outputs": 1,
        "guidance_scale": 7.5,
        "apply_watermark": True,
        "high_noise_frac": 0.8,
        "prompt_strength": 0.8,
        "num_inference_steps": 50
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
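The Python client can also create a prediction without blocking and check on it later. A rough sketch, assuming the `replicate` package exposes `predictions.create`, `reload()`, and the `status`/`output` attributes as in recent releases, with `succeeded`, `failed`, and `canceled` as the terminal statuses:
import time
import replicate

# Create the prediction without waiting for it to finish.
prediction = replicate.predictions.create(
    version="7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da",
    input={"prompt": "a render of TOK from Cocomelon video as TOK skiing"},
)

# Refresh the prediction until it reaches a terminal state.
while prediction.status not in ("succeeded", "failed", "canceled"):
    time.sleep(2)
    prediction.reload()

print(prediction.status)
print(prediction.output)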
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run fmosele/synthiola using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{ "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da", "input": { "width": 1024, "height": 1024, "prompt": "a render of TOK from Cocomelon video as TOK skiing", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:
brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da \
  -i 'width=1024' \
  -i 'height=1024' \
  -i 'prompt="a render of TOK from Cocomelon video as TOK skiing"' \
  -i 'refine="no_refiner"' \
  -i 'scheduler="K_EULER"' \
  -i 'lora_scale=1' \
  -i 'num_outputs=1' \
  -i 'guidance_scale=7.5' \
  -i 'apply_watermark=true' \
  -i 'high_noise_frac=0.8' \
  -i 'prompt_strength=0.8' \
  -i 'num_inference_steps=50'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/fmosele/synthiola@sha256:7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{ "input": { "width": 1024, "height": 1024, "prompt": "a render of TOK from Cocomelon video as TOK skiing", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 } }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
{ "completed_at": "2023-08-15T14:05:29.717739Z", "created_at": "2023-08-15T14:05:14.020042Z", "data_removed": false, "error": null, "id": "iredootbh5iij7zlpithmkpzxi", "input": { "width": 1024, "height": 1024, "prompt": "a render of TOK from Cocomelon video as TOK skiing", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 1, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "prompt_strength": 0.8, "num_inference_steps": 50 }, "logs": "Using seed: 39418\nPrompt: a render of <s0><s1> from Cocomelon video as <s0><s1> skiing\ntxt2img mode\n 0%| | 0/50 [00:00<?, ?it/s]\n 2%|▏ | 1/50 [00:00<00:13, 3.69it/s]\n 4%|▍ | 2/50 [00:00<00:13, 3.68it/s]\n 6%|▌ | 3/50 [00:00<00:12, 3.68it/s]\n 8%|▊ | 4/50 [00:01<00:12, 3.66it/s]\n 10%|█ | 5/50 [00:01<00:12, 3.66it/s]\n 12%|█▏ | 6/50 [00:01<00:12, 3.66it/s]\n 14%|█▍ | 7/50 [00:01<00:11, 3.66it/s]\n 16%|█▌ | 8/50 [00:02<00:11, 3.66it/s]\n 18%|█▊ | 9/50 [00:02<00:11, 3.66it/s]\n 20%|██ | 10/50 [00:02<00:10, 3.66it/s]\n 22%|██▏ | 11/50 [00:03<00:10, 3.65it/s]\n 24%|██▍ | 12/50 [00:03<00:10, 3.65it/s]\n 26%|██▌ | 13/50 [00:03<00:10, 3.65it/s]\n 28%|██▊ | 14/50 [00:03<00:09, 3.65it/s]\n 30%|███ | 15/50 [00:04<00:09, 3.65it/s]\n 32%|███▏ | 16/50 [00:04<00:09, 3.65it/s]\n 34%|███▍ | 17/50 [00:04<00:09, 3.65it/s]\n 36%|███▌ | 18/50 [00:04<00:08, 3.65it/s]\n 38%|███▊ | 19/50 [00:05<00:08, 3.65it/s]\n 40%|████ | 20/50 [00:05<00:08, 3.65it/s]\n 42%|████▏ | 21/50 [00:05<00:07, 3.65it/s]\n 44%|████▍ | 22/50 [00:06<00:07, 3.64it/s]\n 46%|████▌ | 23/50 [00:06<00:07, 3.64it/s]\n 48%|████▊ | 24/50 [00:06<00:07, 3.65it/s]\n 50%|█████ | 25/50 [00:06<00:06, 3.65it/s]\n 52%|█████▏ | 26/50 [00:07<00:06, 3.64it/s]\n 54%|█████▍ | 27/50 [00:07<00:06, 3.65it/s]\n 56%|█████▌ | 28/50 [00:07<00:06, 3.65it/s]\n 58%|█████▊ | 29/50 [00:07<00:05, 3.64it/s]\n 60%|██████ | 30/50 [00:08<00:05, 3.64it/s]\n 62%|██████▏ | 31/50 [00:08<00:05, 3.65it/s]\n 64%|██████▍ | 32/50 [00:08<00:04, 3.64it/s]\n 66%|██████▌ | 33/50 [00:09<00:04, 3.64it/s]\n 68%|██████▊ | 34/50 [00:09<00:04, 3.64it/s]\n 70%|███████ | 35/50 [00:09<00:04, 3.64it/s]\n 72%|███████▏ | 36/50 [00:09<00:03, 3.64it/s]\n 74%|███████▍ | 37/50 [00:10<00:03, 3.64it/s]\n 76%|███████▌ | 38/50 [00:10<00:03, 3.64it/s]\n 78%|███████▊ | 39/50 [00:10<00:03, 3.64it/s]\n 80%|████████ | 40/50 [00:10<00:02, 3.64it/s]\n 82%|████████▏ | 41/50 [00:11<00:02, 3.64it/s]\n 84%|████████▍ | 42/50 [00:11<00:02, 3.64it/s]\n 86%|████████▌ | 43/50 [00:11<00:01, 3.64it/s]\n 88%|████████▊ | 44/50 [00:12<00:01, 3.64it/s]\n 90%|█████████ | 45/50 [00:12<00:01, 3.64it/s]\n 92%|█████████▏| 46/50 [00:12<00:01, 3.64it/s]\n 94%|█████████▍| 47/50 [00:12<00:00, 3.64it/s]\n 96%|█████████▌| 48/50 [00:13<00:00, 3.64it/s]\n 98%|█████████▊| 49/50 [00:13<00:00, 3.64it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.64it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.65it/s]", "metrics": { "predict_time": 15.750096, "total_time": 15.697697 }, "output": [ "https://replicate.delivery/pbxt/0fPx7jizWiwuXKBpEvclNyheqShVo2jylGdDMdNIdEDogTaRA/out-0.png" ], "started_at": "2023-08-15T14:05:13.967643Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/iredootbh5iij7zlpithmkpzxi", "cancel": "https://api.replicate.com/v1/predictions/iredootbh5iij7zlpithmkpzxi/cancel" }, "version": "7e2989f94572be82f355ee7cecb4bb1dfe1b7a403bc9a3bebc023692540027da" }
Want to make some of these yourself? Run this model.