sandrop8 / flux1-schnell-multi-lora
Adapted from lucataco/flux-dev-multi-lora (https://replicate.com/lucataco/flux-dev-multi-lora) to bring multi-LoRA support to FLUX.1 [schnell].
- Public
- 2.5K runs
- Hardware: A100 (80GB)
Prediction
sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68
ID: f7rqmrn7q9rj00cjyc9sf4f0wr
Status: Succeeded
Source: Web
Hardware: A100 (80GB)
Input
- seed: 43
- prompt: a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y
- hf_loras: [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ]
- lora_scales: [ 0.8, 0.9 ]
- num_outputs: 1
- aspect_ratio: 1:1
- output_format: webp
- guidance_scale: 0
- output_quality: 100
- prompt_strength: 0.8
- num_inference_steps: 4
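The scale at index i in lora_scales weights the LoRA at the same index in hf_loras, so the two lists must line up. A minimal pre-flight check you could run before creating a prediction (a hypothetical helper, not part of the model):

```python
def validate_lora_inputs(hf_loras, lora_scales):
    """Check that LoRA URLs and scales pair up one-to-one.

    Assumes (as in the example above) that each URL points directly at a
    .safetensors weight file; this is a client-side sketch, not the
    model's own validation.
    """
    if len(hf_loras) != len(lora_scales):
        raise ValueError(
            f"{len(hf_loras)} LoRA URLs but {len(lora_scales)} scales"
        )
    for url in hf_loras:
        if not url.endswith(".safetensors"):
            raise ValueError(f"expected a .safetensors URL, got: {url}")


validate_lora_inputs(
    [
        "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
        "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors",
    ],
    [0.8, 0.9],
)
```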
{
  "seed": 43,
  "prompt": "a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y",
  "hf_loras": [
    "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
    "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"
  ],
  "lora_scales": [0.8, 0.9],
  "num_outputs": 1,
  "aspect_ratio": "1:1",
  "output_format": "webp",
  "guidance_scale": 0,
  "output_quality": 100,
  "prompt_strength": 0.8,
  "num_inference_steps": 4
}
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs/promises";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});

Run sandrop8/flux1-schnell-multi-lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.

const output = await replicate.run(
  "sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68",
  {
    input: {
      seed: 43,
      prompt: "a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y",
      hf_loras: [
        "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
        "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"
      ],
      lora_scales: [0.8, 0.9],
      num_outputs: 1,
      aspect_ratio: "1:1",
      output_format: "webp",
      guidance_scale: 0,
      output_quality: 100,
      prompt_strength: 0.8,
      num_inference_steps: 4
    }
  }
);

// To access the file URL:
console.log(output[0].url());
//=> "http://example.com"

// To write the file to disk (the model outputs webp):
await fs.writeFile("my-image.webp", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run sandrop8/flux1-schnell-multi-lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68",
    input={
        "seed": 43,
        "prompt": "a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y",
        "hf_loras": [
            "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
            "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors",
        ],
        "lora_scales": [0.8, 0.9],
        "num_outputs": 1,
        "aspect_ratio": "1:1",
        "output_format": "webp",
        "guidance_scale": 0,
        "output_quality": 100,
        "prompt_strength": 0.8,
        "num_inference_steps": 4,
    },
)
print(output)
To learn more, take a look at the guide on getting started with Python.
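The Python snippet above only prints the returned output list. A hedged sketch for persisting results to disk, assuming each item is either a plain delivery URL string or a file-like object exposing read() (which of the two you get depends on your replicate client version):

```python
from pathlib import Path
from urllib.request import urlopen


def save_outputs(output, prefix="out", ext="webp"):
    """Write each prediction result to disk and return the paths written.

    A client-side sketch: items are assumed to be URL strings or objects
    with a read() method; adjust to whatever your replicate client returns.
    """
    paths = []
    for i, item in enumerate(output):
        path = Path(f"{prefix}-{i}.{ext}")
        if isinstance(item, str):
            # Plain delivery URL: download the bytes.
            with urlopen(item) as resp:
                path.write_bytes(resp.read())
        else:
            # File-like object: read it directly.
            path.write_bytes(item.read())
        paths.append(path)
    return paths
```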
Run sandrop8/flux1-schnell-multi-lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68",
    "input": {
      "seed": 43,
      "prompt": "a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y",
      "hf_loras": [
        "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
        "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"
      ],
      "lora_scales": [0.8, 0.9],
      "num_outputs": 1,
      "aspect_ratio": "1:1",
      "output_format": "webp",
      "guidance_scale": 0,
      "output_quality": 100,
      "prompt_strength": 0.8,
      "num_inference_steps": 4
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
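For clarity, here is the same request body assembled with the Python standard library; this only builds the JSON payload, since actually POSTing it to https://api.replicate.com/v1/predictions needs a real REPLICATE_API_TOKEN. The "Prefer: wait" header in the curl command asks the API to hold the connection open until the prediction finishes rather than returning immediately.

```python
import json

# Same payload as the curl command's -d argument (abbreviated to the
# fields that differ from the model defaults in this example).
body = {
    "version": "sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68",
    "input": {
        "seed": 43,
        "hf_loras": [
            "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
            "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors",
        ],
        "lora_scales": [0.8, 0.9],
        "num_inference_steps": 4,
    },
}

# This string is what goes in the POST data.
payload = json.dumps(body)
```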
Output
{ "completed_at": "2024-11-03T17:30:10.925995Z", "created_at": "2024-11-03T17:30:04.602000Z", "data_removed": false, "error": null, "id": "f7rqmrn7q9rj00cjyc9sf4f0wr", "input": { "seed": 43, "prompt": "a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }, "logs": "Using seed: 43\nPrompt: a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y\ntxt2img mode\nDownloading LoRA weights from - HF URL: https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors\nHuggingFace slug from URL: Octree/flux-schnell-lora, weight name: flux-schnell-lora.safetensors\nLoading LoRA took: 0.68 seconds\nDownloading LoRA weights from - HF URL: https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors\nHuggingFace slug from URL: hugovntr/flux-schnell-realism, weight name: schnell-realism_v1.safetensors\nUnsuppored keys for ai-toolkit: dict_keys(['lora_te1_text_model_encoder_layers_0_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_0_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_k_proj.alpha', 
'lora_te1_text_model_encoder_layers_0_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_0_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_0_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_0_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_10_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_10_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_v_proj.lora_down.weight', 
'lora_te1_text_model_encoder_layers_10_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_11_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_11_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_1_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_1_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_1_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_out_proj.alpha', 
'lora_te1_text_model_encoder_layers_1_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_1_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_1_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_2_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_2_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_3_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_mlp_fc1.lora_up.weight', 
'lora_te1_text_model_encoder_layers_3_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_3_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_3_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_3_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_3_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_3_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_4_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_4_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_4_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_4_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_4_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_4_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_4_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_4_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_4_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_4_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_4_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_4_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_4_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_4_self_attn_q_proj.lora_down.weight', 
'lora_te1_text_model_encoder_layers_4_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_4_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_4_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_4_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_5_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_5_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_5_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_5_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_5_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_5_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_5_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_5_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_5_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_5_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_5_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_6_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_6_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_6_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_6_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_6_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_6_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_6_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_6_self_attn_k_proj.lora_down.weight', 
'lora_te1_text_model_encoder_layers_6_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_6_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_6_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_6_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_6_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_6_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_6_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_6_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_6_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_6_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_7_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_7_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_7_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_7_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_7_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_7_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_7_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_7_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_7_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_7_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_7_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_8_mlp_fc1.alpha', 
'lora_te1_text_model_encoder_layers_8_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_8_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_8_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_8_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_8_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_8_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_8_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_8_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_8_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_8_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_9_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_9_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_9_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_9_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_9_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_9_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_9_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_9_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_9_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_9_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_9_self_attn_out_proj.lora_up.weight', 
'lora_te1_text_model_encoder_layers_9_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_9_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_9_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_9_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_9_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_9_self_attn_v_proj.lora_up.weight'])\nLoading LoRA took: 2.27 seconds\n 0%| | 0/4 [00:00<?, ?it/s]\n 25%|██▌ | 1/4 [00:00<00:01, 1.85it/s]\n 50%|█████ | 2/4 [00:00<00:00, 2.45it/s]\n 75%|███████▌ | 3/4 [00:01<00:00, 2.26it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.18it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.19it/s]", "metrics": { "predict_time": 6.315119646, "total_time": 6.323995 }, "output": [ "https://replicate.delivery/yhqm/YPC4ZAAnzxamDtTFLBsOrXUsfYDfBib2bnzVNtd3KMwiUWtTA/out-0.webp" ], "started_at": "2024-11-03T17:30:04.610875Z", "status": "succeeded", "urls": { "stream": "https://stream.replicate.com/v1/files/qoxq-wyjnobksnzmjuo6ymvhcuykdid3xkhg4iqksrxqiky3qp5mcflxq", "get": "https://api.replicate.com/v1/predictions/f7rqmrn7q9rj00cjyc9sf4f0wr", "cancel": "https://api.replicate.com/v1/predictions/f7rqmrn7q9rj00cjyc9sf4f0wr/cancel" }, "version": "1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68" }
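The prediction logs show the model deriving a Hugging Face slug and weight file name from each LoRA URL ("HuggingFace slug from URL: Octree/flux-schnell-lora, weight name: flux-schnell-lora.safetensors"). A minimal reconstruction of that parsing (my sketch, not the model's actual code):

```python
from urllib.parse import urlparse


def split_hf_lora_url(url):
    """Split a huggingface.co .../resolve/<revision>/<file> URL into
    an (owner/repo slug, weight file name) pair.

    Reconstructed from the behaviour visible in the prediction logs;
    the model's real implementation may differ.
    """
    parts = urlparse(url).path.strip("/").split("/")
    # e.g. ["Octree", "flux-schnell-lora", "resolve", "main",
    #       "flux-schnell-lora.safetensors"]
    if len(parts) < 5 or parts[2] != "resolve":
        raise ValueError(f"not a huggingface resolve URL: {url}")
    slug = "/".join(parts[:2])
    weight_name = "/".join(parts[4:])
    return slug, weight_name
```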
Generated inUsing seed: 43 Prompt: a beautiful scandinavian girl posing in the mountains, the sun is shining, the grass is green and flowers are everywhere, she is wearing a patterned colorful top. be4u7y txt2img mode Downloading LoRA weights from - HF URL: https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors HuggingFace slug from URL: Octree/flux-schnell-lora, weight name: flux-schnell-lora.safetensors Loading LoRA took: 0.68 seconds Downloading LoRA weights from - HF URL: https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors HuggingFace slug from URL: hugovntr/flux-schnell-realism, weight name: schnell-realism_v1.safetensors Unsuppored keys for ai-toolkit: dict_keys(['lora_te1_text_model_encoder_layers_0_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_0_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_0_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_0_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_0_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_0_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_0_self_attn_v_proj.lora_up.weight', 
'lora_te1_text_model_encoder_layers_10_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_10_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_10_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_10_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_10_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_11_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_11_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_out_proj.lora_down.weight', 
'lora_te1_text_model_encoder_layers_11_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_11_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_11_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_1_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_1_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_1_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_1_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_1_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_1_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_1_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_2_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc2.alpha', 
'lora_te1_text_model_encoder_layers_2_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_q_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_v_proj.alpha', 'lora_te1_text_model_encoder_layers_2_self_attn_v_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_2_self_attn_v_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_mlp_fc1.alpha', 'lora_te1_text_model_encoder_layers_3_mlp_fc1.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_mlp_fc1.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_mlp_fc2.alpha', 'lora_te1_text_model_encoder_layers_3_mlp_fc2.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_mlp_fc2.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_k_proj.alpha', 'lora_te1_text_model_encoder_layers_3_self_attn_k_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_k_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_out_proj.alpha', 'lora_te1_text_model_encoder_layers_3_self_attn_out_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_out_proj.lora_up.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_q_proj.alpha', 'lora_te1_text_model_encoder_layers_3_self_attn_q_proj.lora_down.weight', 'lora_te1_text_model_encoder_layers_3_self_attn_q_proj.lora_up.weight', 
(truncated list of LoRA state-dict keys for text-encoder layers 3 through 9: mlp_fc1, mlp_fc2, and the self_attn k/q/v/out projections, each contributing .alpha, .lora_down.weight, and .lora_up.weight tensors)])
Loading LoRA took: 2.27 seconds
0%| | 0/4 [00:00<?, ?it/s]
25%|██▌ | 1/4 [00:00<00:01, 1.85it/s]
50%|█████ | 2/4 [00:00<00:00, 2.45it/s]
75%|███████▌ | 3/4 [00:01<00:00, 2.26it/s]
100%|██████████| 4/4 [00:01<00:00, 2.18it/s]
100%|██████████| 4/4 [00:01<00:00, 2.19it/s]
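The log above enumerates each LoRA layer as a (lora_down.weight, lora_up.weight, alpha) triplet. As a rough illustration, assuming the common kohya-style merge rule rather than this model's exact loading code, each lora_scales entry scales the low-rank update added to the base weight, W' = W + scale * (alpha / r) * (up @ down):

```python
# Pure-Python sketch of merging one LoRA triplet into a base weight.
# The merge rule and the tiny shapes are illustrative assumptions, not
# taken from this model's source.

def matmul(a, b):
    """Naive matrix multiply for small illustrative matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def merge_lora(W, lora_up, lora_down, alpha, scale):
    r = len(lora_down)                  # LoRA rank = rows of lora_down
    delta = matmul(lora_up, lora_down)  # (d_out x r) @ (r x d_in)
    coeff = scale * alpha / r
    return [[W[i][j] + coeff * delta[i][j]
             for j in range(len(W[0]))] for i in range(len(W))]

# Tiny example: d_out = d_in = 2, rank r = 1, as for one *_proj layer.
W = [[1.0, 0.0], [0.0, 1.0]]
lora_up = [[1.0], [2.0]]       # *.lora_up.weight
lora_down = [[3.0, 4.0]]       # *.lora_down.weight
W2 = merge_lora(W, lora_up, lora_down, alpha=1.0, scale=0.5)
print(W2)  # [[2.5, 2.0], [3.0, 5.0]]
```

With scale set to 0 the update vanishes and the base weight is returned unchanged, which matches the intuition behind the lora_scales input.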
Prediction
sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68
ID: 8c5fh2gd7xrj20cjycab4592sr · Status: Succeeded · Source: Web · Hardware: A100 (80GB)
Input
- seed: 43
- prompt: A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y
- hf_loras: [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ]
- lora_scales: [ 0.8, 0.9 ]
- num_outputs: 1
- aspect_ratio: 1:1
- output_format: webp
- guidance_scale: 0
- output_quality: 100
- prompt_strength: 0.8
- num_inference_steps: 4
{ "seed": 43, "prompt": "A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run sandrop8/flux1-schnell-multi-lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68",
  {
    input: {
      seed: 43,
      prompt: "A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y",
      hf_loras: [
        "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
        "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"
      ],
      lora_scales: [0.8, 0.9],
      num_outputs: 1,
      aspect_ratio: "1:1",
      output_format: "webp",
      guidance_scale: 0,
      output_quality: 100,
      prompt_strength: 0.8,
      num_inference_steps: 4
    }
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk (the promise-based API accepts the
// stream-like output object; callback-style fs.writeFile does not):
await fs.promises.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate
Import the client:

import replicate
Run sandrop8/flux1-schnell-multi-lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68",
    input={
        "seed": 43,
        "prompt": "A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y",
        "hf_loras": [
            "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors",
            "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"
        ],
        "lora_scales": [0.8, 0.9],
        "num_outputs": 1,
        "aspect_ratio": "1:1",
        "output_format": "webp",
        "guidance_scale": 0,
        "output_quality": 100,
        "prompt_strength": 0.8,
        "num_inference_steps": 4
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
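The Python example above passes a flat input dict. A small helper (hypothetical, not part of the replicate client library) can assemble and sanity-check that dict, in particular that every hf_loras URL has a matching lora_scales entry, since the examples on this page pair them by index:

```python
def build_input(prompt, hf_loras, lora_scales, seed=None, **extra):
    """Assemble an input dict for replicate.run, checking that every LoRA
    URL in hf_loras has a matching strength in lora_scales. Hypothetical
    helper; the field names mirror the examples on this page."""
    if len(hf_loras) != len(lora_scales):
        raise ValueError(
            f"got {len(hf_loras)} LoRA URLs but {len(lora_scales)} scales")
    inp = {
        "prompt": prompt,
        "hf_loras": list(hf_loras),
        "lora_scales": list(lora_scales),
        # Defaults mirroring the examples above: schnell runs 4 steps
        # with guidance disabled.
        "num_inference_steps": 4,
        "guidance_scale": 0,
        **extra,
    }
    if seed is not None:
        inp["seed"] = seed
    return inp

inp = build_input(
    "a beautiful scandinavian girl posing in the mountains. be4u7y",
    hf_loras=["https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors"],
    lora_scales=[0.8],
    seed=43,
)
# output = replicate.run("sandrop8/flux1-schnell-multi-lora:<version>", input=inp)
```

Catching a length mismatch before the API call avoids paying for a prediction that would otherwise run with unintended scales.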
Run sandrop8/flux1-schnell-multi-lora using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68",
    "input": {
      "seed": 43,
      "prompt": "A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y",
      "hf_loras": ["https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors","https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors"],
      "lora_scales": [0.8,0.9],
      "num_outputs": 1,
      "aspect_ratio": "1:1",
      "output_format": "webp",
      "guidance_scale": 0,
      "output_quality": 100,
      "prompt_strength": 0.8,
      "num_inference_steps": 4
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
Output
{ "completed_at": "2024-11-03T17:30:32.977574Z", "created_at": "2024-11-03T17:30:30.591000Z", "data_removed": false, "error": null, "id": "8c5fh2gd7xrj20cjycab4592sr", "input": { "seed": 43, "prompt": "A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }, "logs": "Using seed: 43\nPrompt: A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y\ntxt2img mode\n 0%| | 0/4 [00:00<?, ?it/s]\n 25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]\n 50%|█████ | 2/4 [00:00<00:00, 2.60it/s]\n 75%|███████▌ | 3/4 [00:01<00:00, 2.33it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.22it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.27it/s]", "metrics": { "predict_time": 2.377082176, "total_time": 2.386574 }, "output": [ "https://replicate.delivery/yhqm/5mAzO9LxLfVYC6NiEfnvIg5lC4vAHAhiJf3yAlZf5RZhTZ1OB/out-0.webp" ], "started_at": "2024-11-03T17:30:30.600492Z", "status": "succeeded", "urls": { "stream": "https://stream.replicate.com/v1/files/qoxq-6itx2jwrkwwdvtfajcb2byhrpyebkliym7xvaunodjolildtabgq", "get": "https://api.replicate.com/v1/predictions/8c5fh2gd7xrj20cjycab4592sr", "cancel": "https://api.replicate.com/v1/predictions/8c5fh2gd7xrj20cjycab4592sr/cancel" }, "version": "1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68" }
Generated in
Using seed: 43
Prompt: A handsome Scandinavian man standing by a tranquil lake, with pine trees and mountains in the background. His blond hair catches the light, and he has a soft smile. Dressed in a cozy flannel shirt, he exudes a rugged yet calm presence. be4u7y
txt2img mode
0%| | 0/4 [00:00<?, ?it/s]
25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]
50%|█████ | 2/4 [00:00<00:00, 2.60it/s]
75%|███████▌ | 3/4 [00:01<00:00, 2.33it/s]
100%|██████████| 4/4 [00:01<00:00, 2.22it/s]
100%|██████████| 4/4 [00:01<00:00, 2.27it/s]
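The Output JSON above has a stable shape (status, output, metrics, error). A minimal sketch, using illustrative data in that shape, of pulling out the image URLs and predict time from such a prediction record:

```python
# Prediction record shaped like the Output JSON above; the URL value here
# is an illustrative placeholder, not a real delivery URL.
prediction = {
    "status": "succeeded",
    "output": ["https://replicate.delivery/example/out-0.webp"],
    "metrics": {"predict_time": 2.377082176, "total_time": 2.386574},
    "error": None,
}

def summarize(pred):
    """Raise on a failed prediction, otherwise return the useful bits."""
    if pred.get("error"):
        raise RuntimeError(pred["error"])
    return {
        "status": pred["status"],
        "images": list(pred.get("output") or []),
        "predict_time_s": pred.get("metrics", {}).get("predict_time"),
    }

s = summarize(prediction)
print(s["status"], len(s["images"]), round(s["predict_time_s"], 2))
# succeeded 1 2.38
```

The same function works on the JSON returned by the HTTP predictions endpoint, since that is exactly the record the web page renders.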
Prediction
sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68
ID: pxsnpmjn7drj60cjycaa9yqhjc · Status: Succeeded · Source: Web · Hardware: A100 (80GB)
Input
- seed: 43
- prompt: An attractive Brazilian man on the beach at dawn, with gentle waves behind him. His relaxed posture and toned physique are highlighted by the early morning light, and he wears a simple tank top with a surfboard under his arm. be4u7y
- hf_loras: [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ]
- lora_scales: [ 0.8, 0.9 ]
- num_outputs: 1
- aspect_ratio: 1:1
- output_format: webp
- guidance_scale: 0
- output_quality: 100
- prompt_strength: 0.8
- num_inference_steps: 4
{ "seed": 43, "prompt": "An attractive Brazilian man on the beach at dawn, with gentle waves behind him. His relaxed posture and toned physique are highlighted by the early morning light, and he wears a simple tank top with a surfboard under his arm. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }
Output
{ "completed_at": "2024-11-03T17:30:51.394383Z", "created_at": "2024-11-03T17:30:49.019000Z", "data_removed": false, "error": null, "id": "pxsnpmjn7drj60cjycaa9yqhjc", "input": { "seed": 43, "prompt": "An attractive Brazilian man on the beach at dawn, with gentle waves behind him. His relaxed posture and toned physique are highlighted by the early morning light, and he wears a simple tank top with a surfboard under his arm. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }, "logs": "Using seed: 43\nPrompt: An attractive Brazilian man on the beach at dawn, with gentle waves behind him. His relaxed posture and toned physique are highlighted by the early morning light, and he wears a simple tank top with a surfboard under his arm. be4u7y\ntxt2img mode\n 0%| | 0/4 [00:00<?, ?it/s]\n 25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]\n 50%|█████ | 2/4 [00:00<00:00, 2.60it/s]\n 75%|███████▌ | 3/4 [00:01<00:00, 2.32it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.22it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.26it/s]", "metrics": { "predict_time": 2.366679003, "total_time": 2.375383 }, "output": [ "https://replicate.delivery/yhqm/4qESNWgng4IyCtLZ8WebqmtZ4gcXTgCGeoBf57K3HzOWqsanA/out-0.webp" ], "started_at": "2024-11-03T17:30:49.027704Z", "status": "succeeded", "urls": { "stream": "https://stream.replicate.com/v1/files/qoxq-5xn7w7z2eo6rckkaesj4gm5soyon2nemdup6fp52s6kcl7qtsm5a", "get": "https://api.replicate.com/v1/predictions/pxsnpmjn7drj60cjycaa9yqhjc", "cancel": "https://api.replicate.com/v1/predictions/pxsnpmjn7drj60cjycaa9yqhjc/cancel" }, "version": "1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68" }
Generated in
Using seed: 43
Prompt: An attractive Brazilian man on the beach at dawn, with gentle waves behind him. His relaxed posture and toned physique are highlighted by the early morning light, and he wears a simple tank top with a surfboard under his arm. be4u7y
txt2img mode
0%| | 0/4 [00:00<?, ?it/s]
25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]
50%|█████ | 2/4 [00:00<00:00, 2.60it/s]
75%|███████▌ | 3/4 [00:01<00:00, 2.32it/s]
100%|██████████| 4/4 [00:01<00:00, 2.22it/s]
100%|██████████| 4/4 [00:01<00:00, 2.26it/s]
Prediction
sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68
ID: 2f884xqvs5rj40cjycabd573j8 · Status: Succeeded · Source: Web · Hardware: A100 (80GB)
Input
- seed: 434
- prompt: A lively Australian woman laughing under a bright blue sky, standing on a sandy beach. Her sun-kissed skin and windswept hair convey a sense of freedom, and she’s wearing a floral tank top and beaded jewelry. be4u7y
- hf_loras: [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ]
- lora_scales: [ 0.8, 0.9 ]
- num_outputs: 1
- aspect_ratio: 1:1
- output_format: webp
- guidance_scale: 0
- output_quality: 100
- prompt_strength: 0.8
- num_inference_steps: 4
{ "seed": 434, "prompt": "A lively Australian woman laughing under a bright blue sky, standing on a sandy beach. Her sun-kissed skin and windswept hair convey a sense of freedom, and she’s wearing a floral tank top and beaded jewelry. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }
Output
{ "completed_at": "2024-11-03T17:31:34.008664Z", "created_at": "2024-11-03T17:31:31.657000Z", "data_removed": false, "error": null, "id": "2f884xqvs5rj40cjycabd573j8", "input": { "seed": 434, "prompt": "A lively Australian woman laughing under a bright blue sky, standing on a sandy beach. Her sun-kissed skin and windswept hair convey a sense of freedom, and she’s wearing a floral tank top and beaded jewelry. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }, "logs": "Using seed: 434\nPrompt: A lively Australian woman laughing under a bright blue sky, standing on a sandy beach. Her sun-kissed skin and windswept hair convey a sense of freedom, and she’s wearing a floral tank top and beaded jewelry. be4u7y\ntxt2img mode\n 0%| | 0/4 [00:00<?, ?it/s]\n 25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]\n 50%|█████ | 2/4 [00:00<00:00, 2.61it/s]\n 75%|███████▌ | 3/4 [00:01<00:00, 2.33it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.22it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.27it/s]", "metrics": { "predict_time": 2.342853166, "total_time": 2.351664 }, "output": [ "https://replicate.delivery/yhqm/qToKUmwTqI6ACFz6JWV3ctmo4eDGLd13ltk3ic27rME7Kr2JA/out-0.webp" ], "started_at": "2024-11-03T17:31:31.665811Z", "status": "succeeded", "urls": { "stream": "https://stream.replicate.com/v1/files/qoxq-r3xsouzekr2bmzywcqsenwehw4hfy6mf7cfyix355v7hrrhmfi5q", "get": "https://api.replicate.com/v1/predictions/2f884xqvs5rj40cjycabd573j8", "cancel": "https://api.replicate.com/v1/predictions/2f884xqvs5rj40cjycabd573j8/cancel" }, "version": "1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68" }
Generated in
Using seed: 434
Prompt: A lively Australian woman laughing under a bright blue sky, standing on a sandy beach. Her sun-kissed skin and windswept hair convey a sense of freedom, and she’s wearing a floral tank top and beaded jewelry. be4u7y
txt2img mode
0%| | 0/4 [00:00<?, ?it/s]
25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]
50%|█████ | 2/4 [00:00<00:00, 2.61it/s]
75%|███████▌ | 3/4 [00:01<00:00, 2.33it/s]
100%|██████████| 4/4 [00:01<00:00, 2.22it/s]
100%|██████████| 4/4 [00:01<00:00, 2.27it/s]
Prediction
sandrop8/flux1-schnell-multi-lora:1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68
ID: nj6x3vv075rj40cjycatsptzsw · Status: Succeeded · Source: Web · Hardware: A100 (80GB)
Input
- seed: 434
- prompt: An elegant Japanese woman in a serene cherry blossom garden, delicate petals floating around her as she gazes at the camera with a gentle smile. She wears a traditional silk kimono with intricate floral patterns, her dark hair styled into a classic bun.. be4u7y
- hf_loras: [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ]
- lora_scales: [ 0.8, 0.9 ]
- num_outputs: 1
- aspect_ratio: 1:1
- output_format: webp
- guidance_scale: 0
- output_quality: 100
- prompt_strength: 0.8
- num_inference_steps: 4
{ "seed": 434, "prompt": "An elegant Japanese woman in a serene cherry blossom garden, delicate petals floating around her as she gazes at the camera with a gentle smile. She wears a traditional silk kimono with intricate floral patterns, her dark hair styled into a classic bun.. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }
Output
{ "completed_at": "2024-11-03T17:31:59.749642Z", "created_at": "2024-11-03T17:31:57.369000Z", "data_removed": false, "error": null, "id": "nj6x3vv075rj40cjycatsptzsw", "input": { "seed": 434, "prompt": "An elegant Japanese woman in a serene cherry blossom garden, delicate petals floating around her as she gazes at the camera with a gentle smile. She wears a traditional silk kimono with intricate floral patterns, her dark hair styled into a classic bun.. be4u7y", "hf_loras": [ "https://huggingface.co/Octree/flux-schnell-lora/resolve/main/flux-schnell-lora.safetensors", "https://huggingface.co/hugovntr/flux-schnell-realism/resolve/main/schnell-realism_v1.safetensors" ], "lora_scales": [ 0.8, 0.9 ], "num_outputs": 1, "aspect_ratio": "1:1", "output_format": "webp", "guidance_scale": 0, "output_quality": 100, "prompt_strength": 0.8, "num_inference_steps": 4 }, "logs": "Using seed: 434\nPrompt: An elegant Japanese woman in a serene cherry blossom garden, delicate petals floating around her as she gazes at the camera with a gentle smile. She wears a traditional silk kimono with intricate floral patterns, her dark hair styled into a classic bun.. be4u7y\ntxt2img mode\n 0%| | 0/4 [00:00<?, ?it/s]\n 25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]\n 50%|█████ | 2/4 [00:00<00:00, 2.60it/s]\n 75%|███████▌ | 3/4 [00:01<00:00, 2.33it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.22it/s]\n100%|██████████| 4/4 [00:01<00:00, 2.27it/s]", "metrics": { "predict_time": 2.372318244, "total_time": 2.380642 }, "output": [ "https://replicate.delivery/yhqm/SXRrnl11m95wNlfDfKGcjHzIaVCMfesvB9a4FWW2BZe5xyqdC/out-0.webp" ], "started_at": "2024-11-03T17:31:57.377324Z", "status": "succeeded", "urls": { "stream": "https://stream.replicate.com/v1/files/qoxq-733hqt4akktikcdigyy2hqambtejm5lno2vxpd6ei2pkgrxhw4ga", "get": "https://api.replicate.com/v1/predictions/nj6x3vv075rj40cjycatsptzsw", "cancel": "https://api.replicate.com/v1/predictions/nj6x3vv075rj40cjycatsptzsw/cancel" }, "version": "1af36f429c0be8b88f7f8adfe6e720d9f1ab3b72ab62c78e768d08e91e4c7b68" }
Generated in
Using seed: 434
Prompt: An elegant Japanese woman in a serene cherry blossom garden, delicate petals floating around her as she gazes at the camera with a gentle smile. She wears a traditional silk kimono with intricate floral patterns, her dark hair styled into a classic bun.. be4u7y
txt2img mode
0%| | 0/4 [00:00<?, ?it/s]
25%|██▌ | 1/4 [00:00<00:01, 2.08it/s]
50%|█████ | 2/4 [00:00<00:00, 2.60it/s]
75%|███████▌ | 3/4 [00:01<00:00, 2.33it/s]
100%|██████████| 4/4 [00:01<00:00, 2.22it/s]
100%|██████████| 4/4 [00:01<00:00, 2.27it/s]
Want to make some of these yourself?
Run this model