linoytsaban / linoy_lora
- Public
- 172 runs
- Hardware: L40S
- SDXL fine-tune
Prediction
linoytsaban/linoy_lora:7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e
- ID: 3gcubdtb36p77ih54ynobypx7e
- Status: Succeeded
- Source: Web
- Hardware: A40 (Large)

Input
- seed: 2000
- width: 1024
- height: 1024
- prompt: a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse
- refine: no_refiner
- scheduler: K_EULER
- lora_scale: 0.8
- num_outputs: 1
- guidance_scale: 7.5
- apply_watermark: true
- high_noise_frac: 0.8
- negative_prompt: ""
- prompt_strength: 0.8
- num_inference_steps: 50

{
  "seed": 2000,
  "width": 1024,
  "height": 1024,
  "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
  "refine": "no_refiner",
  "scheduler": "K_EULER",
  "lora_scale": 0.8,
  "num_outputs": 1,
  "guidance_scale": 7.5,
  "apply_watermark": true,
  "high_noise_frac": 0.8,
  "negative_prompt": "",
  "prompt_strength": 0.8,
  "num_inference_steps": 50
}
Install Replicate’s Node.js client library:

npm install replicate

Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "linoytsaban/linoy_lora:7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e",
  {
    input: {
      seed: 2000,
      width: 1024,
      height: 1024,
      prompt: "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
      refine: "no_refiner",
      scheduler: "K_EULER",
      lora_scale: 0.8,
      num_outputs: 1,
      guidance_scale: 7.5,
      apply_watermark: true,
      high_noise_frac: 0.8,
      negative_prompt: "",
      prompt_strength: 0.8,
      num_inference_steps: 50
    }
  }
);

// To access the file URL:
console.log(output[0].url());
//=> "http://example.com"

// To write the file to disk (the output item streams, so use the promise-based API):
await fs.promises.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "linoytsaban/linoy_lora:7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e",
    input={
        "seed": 2000,
        "width": 1024,
        "height": 1024,
        "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
        "refine": "no_refiner",
        "scheduler": "K_EULER",
        "lora_scale": 0.8,
        "num_outputs": 1,
        "guidance_scale": 7.5,
        "apply_watermark": True,
        "high_noise_frac": 0.8,
        "negative_prompt": "",
        "prompt_strength": 0.8,
        "num_inference_steps": 50
    }
)

# To access the file URL:
print(output[0].url())
#=> "http://example.com"

# To write the file to disk:
with open("my-image.png", "wb") as file:
    file.write(output[0].read())
To learn more, take a look at the guide on getting started with Python.
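If you would rather not block inside replicate.run, the same request can be created as a prediction and polled until it reaches a terminal state. This is a minimal sketch, assuming the Python client's predictions.create and predictions.get methods and that predictions.create accepts the bare version hash:

import time

import replicate  # reads REPLICATE_API_TOKEN from the environment

# Create the prediction without waiting for it to finish.
prediction = replicate.predictions.create(
    version="7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e",
    input={
        "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
        "seed": 2000,
        "num_inference_steps": 50,
    },
)

# Poll until the prediction succeeds, fails, or is canceled.
while prediction.status not in ("succeeded", "failed", "canceled"):
    time.sleep(2)
    prediction = replicate.predictions.get(prediction.id)

print(prediction.status)  # e.g. "succeeded"
print(prediction.output)  # e.g. a list of output image URLs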
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "linoytsaban/linoy_lora:7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e",
    "input": {
      "seed": 2000,
      "width": 1024,
      "height": 1024,
      "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
      "refine": "no_refiner",
      "scheduler": "K_EULER",
      "lora_scale": 0.8,
      "num_outputs": 1,
      "guidance_scale": 7.5,
      "apply_watermark": true,
      "high_noise_frac": 0.8,
      "negative_prompt": "",
      "prompt_strength": 0.8,
      "num_inference_steps": 50
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
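The same HTTP call can be made from Python without the client library. This is a sketch using the third-party requests package, mirroring the cURL example above; the Prefer: wait header asks the API to hold the connection open until the prediction completes.

import os

import requests

resp = requests.post(
    "https://api.replicate.com/v1/predictions",
    headers={
        "Authorization": f"Bearer {os.environ['REPLICATE_API_TOKEN']}",
        "Content-Type": "application/json",
        "Prefer": "wait",  # wait for the prediction instead of returning immediately
    },
    json={
        "version": "linoytsaban/linoy_lora:7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e",
        "input": {
            "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
            "seed": 2000,
        },
    },
)
resp.raise_for_status()
prediction = resp.json()
print(prediction["status"], prediction.get("output"))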
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/linoytsaban/linoy_lora@sha256:7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e \
  -i 'seed=2000' \
  -i 'width=1024' \
  -i 'height=1024' \
  -i 'prompt="a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse"' \
  -i 'refine="no_refiner"' \
  -i 'scheduler="K_EULER"' \
  -i 'lora_scale=0.8' \
  -i 'num_outputs=1' \
  -i 'guidance_scale=7.5' \
  -i 'apply_watermark=true' \
  -i 'high_noise_frac=0.8' \
  -i 'negative_prompt=""' \
  -i 'prompt_strength=0.8' \
  -i 'num_inference_steps=50'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/linoytsaban/linoy_lora@sha256:7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "seed": 2000,
      "width": 1024,
      "height": 1024,
      "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
      "refine": "no_refiner",
      "scheduler": "K_EULER",
      "lora_scale": 0.8,
      "num_outputs": 1,
      "guidance_scale": 7.5,
      "apply_watermark": true,
      "high_noise_frac": 0.8,
      "negative_prompt": "",
      "prompt_strength": 0.8,
      "num_inference_steps": 50
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
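Once the container above is running, the local server can also be called from Python. This is a sketch using requests; note that, depending on the Cog version, image outputs from a local server may come back as base64 data URIs rather than URLs, which the loop below handles.

import base64

import requests

# The container started above serves predictions on port 5000.
resp = requests.post(
    "http://localhost:5000/predictions",
    json={
        "input": {
            "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse",
            "num_inference_steps": 50,
        }
    },
)
resp.raise_for_status()
result = resp.json()

# Decode any base64 data URIs in the output and write them to disk.
for i, item in enumerate(result.get("output") or []):
    if isinstance(item, str) and item.startswith("data:"):
        _, b64 = item.split(",", 1)
        with open(f"out-{i}.png", "wb") as f:
            f.write(base64.b64decode(b64))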
Output
{ "completed_at": "2023-09-19T09:14:12.397851Z", "created_at": "2023-09-19T09:13:56.905166Z", "data_removed": false, "error": null, "id": "3gcubdtb36p77ih54ynobypx7e", "input": { "seed": 2000, "width": 1024, "height": 1024, "prompt": "a hugging face emoji in the style of TOK, as an astronaut riding a rainbow horse", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 0.8, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "negative_prompt": "", "prompt_strength": 0.8, "num_inference_steps": 50 }, "logs": "Using seed: 2000\nPrompt: a hugging face emoji in the style of <s0><s1>, as an astronaut riding a rainbow horse\ntxt2img mode\n 0%| | 0/50 [00:00<?, ?it/s]\n 2%|▏ | 1/50 [00:00<00:13, 3.68it/s]\n 4%|▍ | 2/50 [00:00<00:13, 3.67it/s]\n 6%|▌ | 3/50 [00:00<00:12, 3.68it/s]\n 8%|▊ | 4/50 [00:01<00:12, 3.67it/s]\n 10%|█ | 5/50 [00:01<00:12, 3.66it/s]\n 12%|█▏ | 6/50 [00:01<00:12, 3.66it/s]\n 14%|█▍ | 7/50 [00:01<00:11, 3.66it/s]\n 16%|█▌ | 8/50 [00:02<00:11, 3.65it/s]\n 18%|█▊ | 9/50 [00:02<00:11, 3.65it/s]\n 20%|██ | 10/50 [00:02<00:10, 3.65it/s]\n 22%|██▏ | 11/50 [00:03<00:10, 3.65it/s]\n 24%|██▍ | 12/50 [00:03<00:10, 3.65it/s]\n 26%|██▌ | 13/50 [00:03<00:10, 3.65it/s]\n 28%|██▊ | 14/50 [00:03<00:09, 3.65it/s]\n 30%|███ | 15/50 [00:04<00:09, 3.65it/s]\n 32%|███▏ | 16/50 [00:04<00:09, 3.65it/s]\n 34%|███▍ | 17/50 [00:04<00:09, 3.65it/s]\n 36%|███▌ | 18/50 [00:04<00:08, 3.65it/s]\n 38%|███▊ | 19/50 [00:05<00:08, 3.65it/s]\n 40%|████ | 20/50 [00:05<00:08, 3.65it/s]\n 42%|████▏ | 21/50 [00:05<00:07, 3.65it/s]\n 44%|████▍ | 22/50 [00:06<00:07, 3.64it/s]\n 46%|████▌ | 23/50 [00:06<00:07, 3.65it/s]\n 48%|████▊ | 24/50 [00:06<00:07, 3.64it/s]\n 50%|█████ | 25/50 [00:06<00:06, 3.65it/s]\n 52%|█████▏ | 26/50 [00:07<00:06, 3.64it/s]\n 54%|█████▍ | 27/50 [00:07<00:06, 3.64it/s]\n 56%|█████▌ | 28/50 [00:07<00:06, 3.64it/s]\n 58%|█████▊ | 29/50 [00:07<00:05, 3.64it/s]\n 60%|██████ | 30/50 [00:08<00:05, 3.64it/s]\n 62%|██████▏ | 31/50 [00:08<00:05, 3.64it/s]\n 64%|██████▍ | 32/50 [00:08<00:04, 3.64it/s]\n 66%|██████▌ | 33/50 [00:09<00:04, 3.64it/s]\n 68%|██████▊ | 34/50 [00:09<00:04, 3.64it/s]\n 70%|███████ | 35/50 [00:09<00:04, 3.64it/s]\n 72%|███████▏ | 36/50 [00:09<00:03, 3.64it/s]\n 74%|███████▍ | 37/50 [00:10<00:03, 3.64it/s]\n 76%|███████▌ | 38/50 [00:10<00:03, 3.64it/s]\n 78%|███████▊ | 39/50 [00:10<00:03, 3.65it/s]\n 80%|████████ | 40/50 [00:10<00:02, 3.64it/s]\n 82%|████████▏ | 41/50 [00:11<00:02, 3.64it/s]\n 84%|████████▍ | 42/50 [00:11<00:02, 3.64it/s]\n 86%|████████▌ | 43/50 [00:11<00:01, 3.64it/s]\n 88%|████████▊ | 44/50 [00:12<00:01, 3.64it/s]\n 90%|█████████ | 45/50 [00:12<00:01, 3.64it/s]\n 92%|█████████▏| 46/50 [00:12<00:01, 3.64it/s]\n 94%|█████████▍| 47/50 [00:12<00:00, 3.64it/s]\n 96%|█████████▌| 48/50 [00:13<00:00, 3.64it/s]\n 98%|█████████▊| 49/50 [00:13<00:00, 3.64it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.64it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.65it/s]", "metrics": { "predict_time": 15.496015, "total_time": 15.492685 }, "output": [ "https://replicate.delivery/pbxt/AQffRvXcZlhqkkelglIZOf6BwVedmieWWF6YbrmEqOetxw4yIA/out-0.png" ], "started_at": "2023-09-19T09:13:56.901836Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/3gcubdtb36p77ih54ynobypx7e", "cancel": "https://api.replicate.com/v1/predictions/3gcubdtb36p77ih54ynobypx7e/cancel" }, "version": "7069cc4dbf4f8a5c46cffe5d697e7b88a1867f18d42a4d9b45eb79b1d12db39e" }
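The output field of the prediction record above is a list of URLs. A short sketch that fetches the first one with requests and writes it to disk (the local file name is arbitrary):

import requests

# e.g. taken from the record shown above, or fetched later from its "get" URL
output_url = "https://replicate.delivery/pbxt/AQffRvXcZlhqkkelglIZOf6BwVedmieWWF6YbrmEqOetxw4yIA/out-0.png"

image = requests.get(output_url, timeout=60)
image.raise_for_status()

with open("out-0.png", "wb") as f:
    f.write(image.content)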
Prediction
linoytsaban/linoy_lora:591c0cefa34c600f70a6dfff41a9e32991db21020f4ad41f65395046cbefd968
- ID: oagc42tbv6w6rjxfgzynpu3xci
- Status: Succeeded
- Source: Web
- Hardware: A40 (Large)

Input
- width: 1024
- height: 1024
- prompt: a hugging face emoji in the style of TOK, dressed as yoda
- refine: no_refiner
- scheduler: K_EULER
- lora_scale: 0.8
- num_outputs: 1
- guidance_scale: 7.5
- apply_watermark: true
- high_noise_frac: 0.8
- negative_prompt: ""
- prompt_strength: 0.8
- num_inference_steps: 50

{
  "width": 1024,
  "height": 1024,
  "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
  "refine": "no_refiner",
  "scheduler": "K_EULER",
  "lora_scale": 0.8,
  "num_outputs": 1,
  "guidance_scale": 7.5,
  "apply_watermark": true,
  "high_noise_frac": 0.8,
  "negative_prompt": "",
  "prompt_strength": 0.8,
  "num_inference_steps": 50
}
Install Replicate’s Node.js client library:

npm install replicate

Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "linoytsaban/linoy_lora:591c0cefa34c600f70a6dfff41a9e32991db21020f4ad41f65395046cbefd968",
  {
    input: {
      width: 1024,
      height: 1024,
      prompt: "a hugging face emoji in the style of TOK, dressed as yoda",
      refine: "no_refiner",
      scheduler: "K_EULER",
      lora_scale: 0.8,
      num_outputs: 1,
      guidance_scale: 7.5,
      apply_watermark: true,
      high_noise_frac: 0.8,
      negative_prompt: "",
      prompt_strength: 0.8,
      num_inference_steps: 50
    }
  }
);

// To access the file URL:
console.log(output[0].url());
//=> "http://example.com"

// To write the file to disk (the output item streams, so use the promise-based API):
await fs.promises.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "linoytsaban/linoy_lora:591c0cefa34c600f70a6dfff41a9e32991db21020f4ad41f65395046cbefd968",
    input={
        "width": 1024,
        "height": 1024,
        "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
        "refine": "no_refiner",
        "scheduler": "K_EULER",
        "lora_scale": 0.8,
        "num_outputs": 1,
        "guidance_scale": 7.5,
        "apply_watermark": True,
        "high_noise_frac": 0.8,
        "negative_prompt": "",
        "prompt_strength": 0.8,
        "num_inference_steps": 50
    }
)

# To access the file URL:
print(output[0].url())
#=> "http://example.com"

# To write the file to disk:
with open("my-image.png", "wb") as file:
    file.write(output[0].read())
To learn more, take a look at the guide on getting started with Python.
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "linoytsaban/linoy_lora:591c0cefa34c600f70a6dfff41a9e32991db21020f4ad41f65395046cbefd968",
    "input": {
      "width": 1024,
      "height": 1024,
      "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
      "refine": "no_refiner",
      "scheduler": "K_EULER",
      "lora_scale": 0.8,
      "num_outputs": 1,
      "guidance_scale": 7.5,
      "apply_watermark": true,
      "high_noise_frac": 0.8,
      "negative_prompt": "",
      "prompt_strength": 0.8,
      "num_inference_steps": 50
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/linoytsaban/linoy_lora@sha256:591c0cefa34c600f70a6dfff41a9e32991db21020f4ad41f65395046cbefd968 \
  -i 'width=1024' \
  -i 'height=1024' \
  -i 'prompt="a hugging face emoji in the style of TOK, dressed as yoda"' \
  -i 'refine="no_refiner"' \
  -i 'scheduler="K_EULER"' \
  -i 'lora_scale=0.8' \
  -i 'num_outputs=1' \
  -i 'guidance_scale=7.5' \
  -i 'apply_watermark=true' \
  -i 'high_noise_frac=0.8' \
  -i 'negative_prompt=""' \
  -i 'prompt_strength=0.8' \
  -i 'num_inference_steps=50'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/linoytsaban/linoy_lora@sha256:591c0cefa34c600f70a6dfff41a9e32991db21020f4ad41f65395046cbefd968
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "width": 1024,
      "height": 1024,
      "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
      "refine": "no_refiner",
      "scheduler": "K_EULER",
      "lora_scale": 0.8,
      "num_outputs": 1,
      "guidance_scale": 7.5,
      "apply_watermark": true,
      "high_noise_frac": 0.8,
      "negative_prompt": "",
      "prompt_strength": 0.8,
      "num_inference_steps": 50
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
{ "completed_at": "2023-09-16T13:34:34.510108Z", "created_at": "2023-09-16T13:34:18.894883Z", "data_removed": false, "error": null, "id": "oagc42tbv6w6rjxfgzynpu3xci", "input": { "width": 1024, "height": 1024, "prompt": "a hugging face emoji in the style of TOK, dressed as yoda", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 0.8, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "negative_prompt": "", "prompt_strength": 0.8, "num_inference_steps": 50 }, "logs": "Using seed: 43787\nPrompt: a hugging face emoji in the style of <s0><s1>, dressed as yoda\ntxt2img mode\n 0%| | 0/50 [00:00<?, ?it/s]\n 2%|▏ | 1/50 [00:00<00:13, 3.68it/s]\n 4%|▍ | 2/50 [00:00<00:13, 3.67it/s]\n 6%|▌ | 3/50 [00:00<00:12, 3.67it/s]\n 8%|▊ | 4/50 [00:01<00:12, 3.67it/s]\n 10%|█ | 5/50 [00:01<00:12, 3.67it/s]\n 12%|█▏ | 6/50 [00:01<00:11, 3.68it/s]\n 14%|█▍ | 7/50 [00:01<00:11, 3.67it/s]\n 16%|█▌ | 8/50 [00:02<00:11, 3.67it/s]\n 18%|█▊ | 9/50 [00:02<00:11, 3.67it/s]\n 20%|██ | 10/50 [00:02<00:10, 3.67it/s]\n 22%|██▏ | 11/50 [00:02<00:10, 3.67it/s]\n 24%|██▍ | 12/50 [00:03<00:10, 3.67it/s]\n 26%|██▌ | 13/50 [00:03<00:10, 3.67it/s]\n 28%|██▊ | 14/50 [00:03<00:09, 3.67it/s]\n 30%|███ | 15/50 [00:04<00:09, 3.67it/s]\n 32%|███▏ | 16/50 [00:04<00:09, 3.66it/s]\n 34%|███▍ | 17/50 [00:04<00:09, 3.66it/s]\n 36%|███▌ | 18/50 [00:04<00:08, 3.66it/s]\n 38%|███▊ | 19/50 [00:05<00:08, 3.66it/s]\n 40%|████ | 20/50 [00:05<00:08, 3.66it/s]\n 42%|████▏ | 21/50 [00:05<00:07, 3.66it/s]\n 44%|████▍ | 22/50 [00:06<00:07, 3.65it/s]\n 46%|████▌ | 23/50 [00:06<00:07, 3.65it/s]\n 48%|████▊ | 24/50 [00:06<00:07, 3.65it/s]\n 50%|█████ | 25/50 [00:06<00:06, 3.65it/s]\n 52%|█████▏ | 26/50 [00:07<00:06, 3.65it/s]\n 54%|█████▍ | 27/50 [00:07<00:06, 3.65it/s]\n 56%|█████▌ | 28/50 [00:07<00:06, 3.66it/s]\n 58%|█████▊ | 29/50 [00:07<00:05, 3.66it/s]\n 60%|██████ | 30/50 [00:08<00:05, 3.66it/s]\n 62%|██████▏ | 31/50 [00:08<00:05, 3.66it/s]\n 64%|██████▍ | 32/50 [00:08<00:04, 3.65it/s]\n 66%|██████▌ | 33/50 [00:09<00:04, 3.65it/s]\n 68%|██████▊ | 34/50 [00:09<00:04, 3.65it/s]\n 70%|███████ | 35/50 [00:09<00:04, 3.65it/s]\n 72%|███████▏ | 36/50 [00:09<00:03, 3.65it/s]\n 74%|███████▍ | 37/50 [00:10<00:03, 3.65it/s]\n 76%|███████▌ | 38/50 [00:10<00:03, 3.65it/s]\n 78%|███████▊ | 39/50 [00:10<00:03, 3.65it/s]\n 80%|████████ | 40/50 [00:10<00:02, 3.65it/s]\n 82%|████████▏ | 41/50 [00:11<00:02, 3.65it/s]\n 84%|████████▍ | 42/50 [00:11<00:02, 3.65it/s]\n 86%|████████▌ | 43/50 [00:11<00:01, 3.65it/s]\n 88%|████████▊ | 44/50 [00:12<00:01, 3.65it/s]\n 90%|█████████ | 45/50 [00:12<00:01, 3.65it/s]\n 92%|█████████▏| 46/50 [00:12<00:01, 3.65it/s]\n 94%|█████████▍| 47/50 [00:12<00:00, 3.65it/s]\n 96%|█████████▌| 48/50 [00:13<00:00, 3.65it/s]\n 98%|█████████▊| 49/50 [00:13<00:00, 3.65it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.65it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.66it/s]", "metrics": { "predict_time": 15.619638, "total_time": 15.615225 }, "output": [ "https://pbxt.replicate.delivery/wJf4lByhD10xCyqaAp2sgsJYW8Xw99sbgue5Fyvj176pD2kRA/out-0.png" ], "started_at": "2023-09-16T13:34:18.890470Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/oagc42tbv6w6rjxfgzynpu3xci", "cancel": "https://api.replicate.com/v1/predictions/oagc42tbv6w6rjxfgzynpu3xci/cancel" }, "version": "591c0cefa34c600f70a6dfff41a9e32991db21020f4ad41f65395046cbefd968" }
Prediction
linoytsaban/linoy_lora:f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845
- ID: i7al6xtbxreznpn2b2cxf2jeda
- Status: Succeeded
- Source: Web
- Hardware: A40 (Large)

Input
- width: 1024
- height: 1024
- prompt: a hugging face emoji in the style of TOK, dressed as yoda
- refine: no_refiner
- scheduler: K_EULER
- lora_scale: 0.7
- num_outputs: 1
- guidance_scale: 7.5
- apply_watermark: true
- high_noise_frac: 0.8
- negative_prompt: ""
- prompt_strength: 0.8
- num_inference_steps: 50

{
  "width": 1024,
  "height": 1024,
  "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
  "refine": "no_refiner",
  "scheduler": "K_EULER",
  "lora_scale": 0.7,
  "num_outputs": 1,
  "guidance_scale": 7.5,
  "apply_watermark": true,
  "high_noise_frac": 0.8,
  "negative_prompt": "",
  "prompt_strength": 0.8,
  "num_inference_steps": 50
}
Install Replicate’s Node.js client library:

npm install replicate

Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "linoytsaban/linoy_lora:f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845",
  {
    input: {
      width: 1024,
      height: 1024,
      prompt: "a hugging face emoji in the style of TOK, dressed as yoda",
      refine: "no_refiner",
      scheduler: "K_EULER",
      lora_scale: 0.7,
      num_outputs: 1,
      guidance_scale: 7.5,
      apply_watermark: true,
      high_noise_frac: 0.8,
      negative_prompt: "",
      prompt_strength: 0.8,
      num_inference_steps: 50
    }
  }
);

// To access the file URL:
console.log(output[0].url());
//=> "http://example.com"

// To write the file to disk (the output item streams, so use the promise-based API):
await fs.promises.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "linoytsaban/linoy_lora:f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845",
    input={
        "width": 1024,
        "height": 1024,
        "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
        "refine": "no_refiner",
        "scheduler": "K_EULER",
        "lora_scale": 0.7,
        "num_outputs": 1,
        "guidance_scale": 7.5,
        "apply_watermark": True,
        "high_noise_frac": 0.8,
        "negative_prompt": "",
        "prompt_strength": 0.8,
        "num_inference_steps": 50
    }
)

# To access the file URL:
print(output[0].url())
#=> "http://example.com"

# To write the file to disk:
with open("my-image.png", "wb") as file:
    file.write(output[0].read())
To learn more, take a look at the guide on getting started with Python.
Run linoytsaban/linoy_lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "linoytsaban/linoy_lora:f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845",
    "input": {
      "width": 1024,
      "height": 1024,
      "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
      "refine": "no_refiner",
      "scheduler": "K_EULER",
      "lora_scale": 0.7,
      "num_outputs": 1,
      "guidance_scale": 7.5,
      "apply_watermark": true,
      "high_noise_frac": 0.8,
      "negative_prompt": "",
      "prompt_strength": 0.8,
      "num_inference_steps": 50
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/linoytsaban/linoy_lora@sha256:f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845 \
  -i 'width=1024' \
  -i 'height=1024' \
  -i 'prompt="a hugging face emoji in the style of TOK, dressed as yoda"' \
  -i 'refine="no_refiner"' \
  -i 'scheduler="K_EULER"' \
  -i 'lora_scale=0.7' \
  -i 'num_outputs=1' \
  -i 'guidance_scale=7.5' \
  -i 'apply_watermark=true' \
  -i 'high_noise_frac=0.8' \
  -i 'negative_prompt=""' \
  -i 'prompt_strength=0.8' \
  -i 'num_inference_steps=50'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/linoytsaban/linoy_lora@sha256:f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "width": 1024,
      "height": 1024,
      "prompt": "a hugging face emoji in the style of TOK, dressed as yoda",
      "refine": "no_refiner",
      "scheduler": "K_EULER",
      "lora_scale": 0.7,
      "num_outputs": 1,
      "guidance_scale": 7.5,
      "apply_watermark": true,
      "high_noise_frac": 0.8,
      "negative_prompt": "",
      "prompt_strength": 0.8,
      "num_inference_steps": 50
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
{ "completed_at": "2023-09-21T08:48:11.905697Z", "created_at": "2023-09-21T08:47:57.114152Z", "data_removed": false, "error": null, "id": "i7al6xtbxreznpn2b2cxf2jeda", "input": { "width": 1024, "height": 1024, "prompt": "a hugging face emoji in the style of TOK, dressed as yoda", "refine": "no_refiner", "scheduler": "K_EULER", "lora_scale": 0.7, "num_outputs": 1, "guidance_scale": 7.5, "apply_watermark": true, "high_noise_frac": 0.8, "negative_prompt": "", "prompt_strength": 0.8, "num_inference_steps": 50 }, "logs": "Using seed: 40778\nPrompt: a hugging face emoji in the style of <s0><s1>, dressed as yoda\ntxt2img mode\n 0%| | 0/50 [00:00<?, ?it/s]\n 2%|▏ | 1/50 [00:00<00:13, 3.71it/s]\n 4%|▍ | 2/50 [00:00<00:12, 3.70it/s]\n 6%|▌ | 3/50 [00:00<00:12, 3.70it/s]\n 8%|▊ | 4/50 [00:01<00:12, 3.70it/s]\n 10%|█ | 5/50 [00:01<00:12, 3.70it/s]\n 12%|█▏ | 6/50 [00:01<00:11, 3.70it/s]\n 14%|█▍ | 7/50 [00:01<00:11, 3.71it/s]\n 16%|█▌ | 8/50 [00:02<00:11, 3.70it/s]\n 18%|█▊ | 9/50 [00:02<00:11, 3.70it/s]\n 20%|██ | 10/50 [00:02<00:10, 3.70it/s]\n 22%|██▏ | 11/50 [00:02<00:10, 3.70it/s]\n 24%|██▍ | 12/50 [00:03<00:10, 3.70it/s]\n 26%|██▌ | 13/50 [00:03<00:10, 3.70it/s]\n 28%|██▊ | 14/50 [00:03<00:09, 3.70it/s]\n 30%|███ | 15/50 [00:04<00:09, 3.70it/s]\n 32%|███▏ | 16/50 [00:04<00:09, 3.69it/s]\n 34%|███▍ | 17/50 [00:04<00:08, 3.69it/s]\n 36%|███▌ | 18/50 [00:04<00:08, 3.69it/s]\n 38%|███▊ | 19/50 [00:05<00:08, 3.69it/s]\n 40%|████ | 20/50 [00:05<00:08, 3.69it/s]\n 42%|████▏ | 21/50 [00:05<00:07, 3.68it/s]\n 44%|████▍ | 22/50 [00:05<00:07, 3.68it/s]\n 46%|████▌ | 23/50 [00:06<00:07, 3.68it/s]\n 48%|████▊ | 24/50 [00:06<00:07, 3.68it/s]\n 50%|█████ | 25/50 [00:06<00:06, 3.68it/s]\n 52%|█████▏ | 26/50 [00:07<00:06, 3.68it/s]\n 54%|█████▍ | 27/50 [00:07<00:06, 3.68it/s]\n 56%|█████▌ | 28/50 [00:07<00:05, 3.68it/s]\n 58%|█████▊ | 29/50 [00:07<00:05, 3.68it/s]\n 60%|██████ | 30/50 [00:08<00:05, 3.68it/s]\n 62%|██████▏ | 31/50 [00:08<00:05, 3.68it/s]\n 64%|██████▍ | 32/50 [00:08<00:04, 3.68it/s]\n 66%|██████▌ | 33/50 [00:08<00:04, 3.68it/s]\n 68%|██████▊ | 34/50 [00:09<00:04, 3.68it/s]\n 70%|███████ | 35/50 [00:09<00:04, 3.68it/s]\n 72%|███████▏ | 36/50 [00:09<00:03, 3.68it/s]\n 74%|███████▍ | 37/50 [00:10<00:03, 3.68it/s]\n 76%|███████▌ | 38/50 [00:10<00:03, 3.68it/s]\n 78%|███████▊ | 39/50 [00:10<00:02, 3.67it/s]\n 80%|████████ | 40/50 [00:10<00:02, 3.68it/s]\n 82%|████████▏ | 41/50 [00:11<00:02, 3.68it/s]\n 84%|████████▍ | 42/50 [00:11<00:02, 3.68it/s]\n 86%|████████▌ | 43/50 [00:11<00:01, 3.68it/s]\n 88%|████████▊ | 44/50 [00:11<00:01, 3.67it/s]\n 90%|█████████ | 45/50 [00:12<00:01, 3.67it/s]\n 92%|█████████▏| 46/50 [00:12<00:01, 3.67it/s]\n 94%|█████████▍| 47/50 [00:12<00:00, 3.67it/s]\n 96%|█████████▌| 48/50 [00:13<00:00, 3.67it/s]\n 98%|█████████▊| 49/50 [00:13<00:00, 3.67it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.67it/s]\n100%|██████████| 50/50 [00:13<00:00, 3.68it/s]", "metrics": { "predict_time": 14.858614, "total_time": 14.791545 }, "output": [ "https://replicate.delivery/pbxt/mJItrAYYfkyTPiXTFlBJyqZqz5c1lHq5xTmhwgNTBNvlqNzIA/out-0.png" ], "started_at": "2023-09-21T08:47:57.047083Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/i7al6xtbxreznpn2b2cxf2jeda", "cancel": "https://api.replicate.com/v1/predictions/i7al6xtbxreznpn2b2cxf2jeda/cancel" }, "version": "f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845" }
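The second and third predictions run the same "dressed as yoda" prompt with lora_scale set to 0.8 and 0.7 respectively; lowering the scale reduces how strongly the fine-tuned LoRA weights influence the base SDXL model. A sketch that sweeps a few values for a side-by-side comparison, reusing the replicate.run call shown above (the 0.5 and 1.0 values and the file names are just for illustration):

import replicate

VERSION = "linoytsaban/linoy_lora:f1d12abe0463bff6435c7b7a4739ed565e2a28cf56f0c4a78ed01f8670727845"
PROMPT = "a hugging face emoji in the style of TOK, dressed as yoda"

# Run the same prompt at several LoRA scales to compare how strongly the
# trained style is applied.
for scale in (0.5, 0.7, 0.8, 1.0):
    output = replicate.run(
        VERSION,
        input={
            "prompt": PROMPT,
            "lora_scale": scale,
            "num_inference_steps": 50,
        },
    )
    filename = f"yoda-lora-{scale}.png"
    with open(filename, "wb") as f:
        f.write(output[0].read())
    print(f"lora_scale={scale} -> {filename}")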
Want to make some of these yourself? Run this model.