emaph / outpaint-controlnet-union
Outpaint an image using ControlNet Union for SDXL.
- Public
- 6.8K runs
- L40S
Prediction
emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5
ID: qnb2qfn9kdrgm0cgz1nsmyk5rg
Status: Succeeded
Source: Web
Hardware: A40
Input
- top: 0
- left: 400
- image: https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png
- right: 400
- bottom: 0
- prompt: in the woods, autumn
- output_format: webp
- output_quality: 80
- negative_prompt: ugly
{
  "top": 0,
  "left": 400,
  "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
  "right": 400,
  "bottom": 0,
  "prompt": "in the woods, autumn",
  "output_format": "webp",
  "output_quality": 80,
  "negative_prompt": "ugly"
}
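The top/left/right/bottom values pad the canvas on each edge before outpainting, so the output resolution grows accordingly. A minimal sketch of that arithmetic (the helper name is hypothetical, and the model may additionally round to its working resolution):

```python
def outpainted_size(width, height, top, left, right, bottom):
    """Canvas size after padding each edge (hypothetical helper).

    The model fills the padded region; the original pixels stay in
    place, offset by (left, top) inside the new canvas.
    """
    return (width + left + right, height + top + bottom)

# With this request's padding (left=right=400, top=bottom=0), an
# illustrative 1024x1024 source would become 1824x1024.
print(outpainted_size(1024, 1024, 0, 400, 400, 0))
```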
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5",
  {
    input: {
      top: 0,
      left: 400,
      image: "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
      right: 400,
      bottom: 0,
      prompt: "in the woods, autumn",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: "ugly"
    }
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk:
fs.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5",
    input={
        "top": 0,
        "left": 400,
        "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
        "right": 400,
        "bottom": 0,
        "prompt": "in the woods, autumn",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": "ugly"
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
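Since each run costs GPU time, it can be worth sanity-checking an input dict locally before submitting. A sketch under the assumption that padding must be non-negative and output_quality ranges 0-100; the validator itself is hypothetical, not part of the client library:

```python
def validate_input(inp):
    """Hypothetical pre-flight check for this model's input dict."""
    for edge in ("top", "left", "right", "bottom"):
        if inp.get(edge, 0) < 0:
            raise ValueError(f"{edge} must be >= 0")
    if not 0 <= inp.get("output_quality", 80) <= 100:
        raise ValueError("output_quality must be between 0 and 100")
    if "image" not in inp or "prompt" not in inp:
        raise ValueError("image and prompt are required")
    return inp
```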
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5",
    "input": {
      "top": 0,
      "left": 400,
      "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
      "right": 400,
      "bottom": 0,
      "prompt": "in the woods, autumn",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "ugly"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
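The HTTP API returns a prediction object like the one under Output below. A sketch of pulling the result URL out of such a response, using only field names that appear in the example (`status`, `output`):

```python
import json

def first_output_url(prediction_json):
    """Return the first output file URL from a prediction response,
    or None if the prediction hasn't succeeded (yet)."""
    pred = json.loads(prediction_json)
    if pred.get("status") != "succeeded":
        return None
    outputs = pred.get("output") or []
    return outputs[0] if outputs else None
```

With `Prefer: wait` the response of the create call itself can be fed straight to this helper; otherwise, poll the `urls.get` endpoint until the status leaves the in-progress states.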
Output
{ "completed_at": "2024-07-28T08:19:23.962157Z", "created_at": "2024-07-28T08:18:43.739000Z", "data_removed": false, "error": null, "id": "qnb2qfn9kdrgm0cgz1nsmyk5rg", "input": { "top": 0, "left": 400, "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png", "right": 400, "bottom": 0, "prompt": "in the woods, autumn", "output_format": "webp", "output_quality": 80, "negative_prompt": "ugly" }, "logs": "Random seed set to: 3166590663\nChecking inputs\nDownloading https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png to /tmp/inputs/Comfy-UI-temp-gltth-00003-1.png\n✅ /tmp/inputs/Comfy-UI-temp-gltth-00003-1.png\n====================================\nChecking weights\n⏳ Downloading juggernautXL_v8Rundiffusion.safetensors to ComfyUI/models/checkpoints\n✅ juggernautXL_v8Rundiffusion.safetensors downloaded to ComfyUI/models/checkpoints in 5.22s, size: 6776.19MB\n✅ diffusion_pytorch_model_promax.safetensors exists in ComfyUI/models/controlnet\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 10, title: Pad Image for Outpainting, class type: ImagePadForOutpaint\nExecuting node 48, title: InvertMask, class type: InvertMask\nExecuting node 47, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 49, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 15, 
title: Apply ControlNet (Advanced), class type: ControlNetApplyAdvanced\nExecuting node 39, title: VAE Encode (for Inpainting), class type: VAEEncodeForInpaint\nRequested to load AutoencoderKL\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\nRequested to load ControlNet\nLoading 1 new model\n 0%| | 0/21 [00:00<?, ?it/s]\n 5%|▍ | 1/21 [00:00<00:08, 2.24it/s]\n 10%|▉ | 2/21 [00:00<00:07, 2.53it/s]\n 14%|█▍ | 3/21 [00:01<00:06, 2.64it/s]\n 19%|█▉ | 4/21 [00:01<00:06, 2.70it/s]\n 24%|██▍ | 5/21 [00:01<00:05, 2.73it/s]\n 29%|██▊ | 6/21 [00:02<00:05, 2.76it/s]\n 33%|███▎ | 7/21 [00:02<00:05, 2.78it/s]\n 38%|███▊ | 8/21 [00:02<00:04, 2.79it/s]\n 43%|████▎ | 9/21 [00:03<00:04, 2.79it/s]\n 48%|████▊ | 10/21 [00:03<00:03, 2.80it/s]\n 52%|█████▏ | 11/21 [00:04<00:03, 2.80it/s]\n 57%|█████▋ | 12/21 [00:04<00:03, 2.80it/s]\n 62%|██████▏ | 13/21 [00:04<00:02, 2.81it/s]\n 67%|██████▋ | 14/21 [00:05<00:02, 2.81it/s]\n 71%|███████▏ | 15/21 [00:05<00:02, 2.80it/s]\n 76%|███████▌ | 16/21 [00:05<00:01, 2.81it/s]\n 81%|████████ | 17/21 [00:06<00:01, 2.81it/s]\n 86%|████████▌ | 18/21 [00:06<00:01, 2.80it/s]\n 90%|█████████ | 19/21 [00:06<00:00, 2.81it/s]\n 95%|█████████▌| 20/21 [00:07<00:00, 2.80it/s]\n100%|██████████| 21/21 [00:07<00:00, 2.80it/s]\n100%|██████████| 21/21 [00:07<00:00, 2.77it/s]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 60, title: Save Image, class type: SaveImage\nPrompt executed in 15.05 seconds\noutputs: {'60': {'images': [{'filename': 'outpainted_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\noutpainted_00001_.png", "metrics": { "predict_time": 22.787967612, "total_time": 40.223157 }, "output": [ "https://replicate.delivery/pbxt/U1Z0eEaAKOyOIyHqM27y7couVFPvmB7gJOGnwwbReT5LE7MTA/outpainted_00001_.webp" ], "started_at": "2024-07-28T08:19:01.174190Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/qnb2qfn9kdrgm0cgz1nsmyk5rg", 
"cancel": "https://api.replicate.com/v1/predictions/qnb2qfn9kdrgm0cgz1nsmyk5rg/cancel" }, "version": "4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5" }
Prediction
emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5
ID: sbx0zj2vmnrgm0cgz5pt274q5g
Status: Succeeded
Source: Web
Hardware: A40
Input
- top: 0
- left: 400
- image: https://i.postimg.cc/vmY5BZP3/2024-04-16-21-30-59-9956.png
- right: 400
- bottom: 0
- prompt: ice dragon, wings wide open
- output_format: webp
- output_quality: 80
- negative_prompt: ugly
{
  "top": 0,
  "left": 400,
  "image": "https://i.postimg.cc/vmY5BZP3/2024-04-16-21-30-59-9956.png",
  "right": 400,
  "bottom": 0,
  "prompt": "ice dragon, wings wide open",
  "output_format": "webp",
  "output_quality": 80,
  "negative_prompt": "ugly"
}
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5",
  {
    input: {
      top: 0,
      left: 400,
      image: "https://i.postimg.cc/vmY5BZP3/2024-04-16-21-30-59-9956.png",
      right: 400,
      bottom: 0,
      prompt: "ice dragon, wings wide open",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: "ugly"
    }
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk:
fs.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5",
    input={
        "top": 0,
        "left": 400,
        "image": "https://i.postimg.cc/vmY5BZP3/2024-04-16-21-30-59-9956.png",
        "right": 400,
        "bottom": 0,
        "prompt": "ice dragon, wings wide open",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": "ugly"
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/outpaint-controlnet-union:4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5",
    "input": {
      "top": 0,
      "left": 400,
      "image": "https://i.postimg.cc/vmY5BZP3/2024-04-16-21-30-59-9956.png",
      "right": 400,
      "bottom": 0,
      "prompt": "ice dragon, wings wide open",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "ugly"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
Output
{ "completed_at": "2024-07-28T13:02:19.473227Z", "created_at": "2024-07-28T13:00:12.069000Z", "data_removed": false, "error": null, "id": "sbx0zj2vmnrgm0cgz5pt274q5g", "input": { "top": 0, "left": 400, "image": "https://i.postimg.cc/vmY5BZP3/2024-04-16-21-30-59-9956.png", "right": 400, "bottom": 0, "prompt": "ice dragon, wings wide open", "output_format": "webp", "output_quality": 80, "negative_prompt": "ugly" }, "logs": "Random seed set to: 2036446667\nChecking inputs\nDownloading https://i.postimg.cc/vmY5BZP3/2024-04-16-21-30-59-9956.png to /tmp/inputs/2024-04-16-21-30-59-9956.png\n✅ /tmp/inputs/2024-04-16-21-30-59-9956.png\n====================================\nChecking weights\n✅ diffusion_pytorch_model_promax.safetensors exists in ComfyUI/models/controlnet\n⏳ Downloading juggernautXL_v8Rundiffusion.safetensors to ComfyUI/models/checkpoints\n✅ juggernautXL_v8Rundiffusion.safetensors downloaded to ComfyUI/models/checkpoints in 16.15s, size: 6776.19MB\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 10, title: Pad Image for Outpainting, class type: ImagePadForOutpaint\nExecuting node 48, title: InvertMask, class type: InvertMask\nExecuting node 47, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 49, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 15, title: 
Apply ControlNet (Advanced), class type: ControlNetApplyAdvanced\nExecuting node 39, title: VAE Encode (for Inpainting), class type: VAEEncodeForInpaint\nRequested to load AutoencoderKL\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\nRequested to load ControlNet\nLoading 1 new model\n 0%| | 0/21 [00:00<?, ?it/s]\n 5%|▍ | 1/21 [00:00<00:08, 2.48it/s]\n 10%|▉ | 2/21 [00:00<00:06, 2.80it/s]\n 14%|█▍ | 3/21 [00:01<00:06, 2.93it/s]\n 19%|█▉ | 4/21 [00:01<00:05, 3.00it/s]\n 24%|██▍ | 5/21 [00:01<00:05, 3.03it/s]\n 29%|██▊ | 6/21 [00:02<00:04, 3.05it/s]\n 33%|███▎ | 7/21 [00:02<00:04, 3.07it/s]\n 38%|███▊ | 8/21 [00:02<00:04, 3.08it/s]\n 43%|████▎ | 9/21 [00:02<00:03, 3.08it/s]\n 48%|████▊ | 10/21 [00:03<00:03, 3.09it/s]\n 52%|█████▏ | 11/21 [00:03<00:03, 3.10it/s]\n 57%|█████▋ | 12/21 [00:03<00:02, 3.10it/s]\n 62%|██████▏ | 13/21 [00:04<00:02, 3.11it/s]\n 67%|██████▋ | 14/21 [00:04<00:02, 3.11it/s]\n 71%|███████▏ | 15/21 [00:04<00:01, 3.11it/s]\n 76%|███████▌ | 16/21 [00:05<00:01, 3.11it/s]\n 81%|████████ | 17/21 [00:05<00:01, 3.11it/s]\n 86%|████████▌ | 18/21 [00:05<00:00, 3.11it/s]\n 90%|█████████ | 19/21 [00:06<00:00, 3.11it/s]\n 95%|█████████▌| 20/21 [00:06<00:00, 3.11it/s]\n100%|██████████| 21/21 [00:06<00:00, 3.11it/s]\n100%|██████████| 21/21 [00:06<00:00, 3.07it/s]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 60, title: Save Image, class type: SaveImage\nPrompt executed in 13.26 seconds\noutputs: {'60': {'images': [{'filename': 'outpainted_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\noutpainted_00001_.png", "metrics": { "predict_time": 31.679283715, "total_time": 127.404227 }, "output": [ "https://replicate.delivery/pbxt/krjseJnglBxuaqpAv6G7j7J8xZbhzDHs6d2Gf0MQym2aNfZmA/outpainted_00001_.webp" ], "started_at": "2024-07-28T13:01:47.793943Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/sbx0zj2vmnrgm0cgz5pt274q5g", "cancel": 
"https://api.replicate.com/v1/predictions/sbx0zj2vmnrgm0cgz5pt274q5g/cancel" }, "version": "4d11bfa652f267e27f480471dd8d435cd92c2b99750644a0b38303d13946aae5" }
Prediction
emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0
ID: ze068ybv79rge0cgyjxt65qq6r
Status: Succeeded
Source: Web
Hardware: T4
Input
- top: 150
- left: 150
- image: https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png
- right: 150
- bottom: 150
- prompt: woods
- output_format: webp
- output_quality: 80
- negative_prompt: ugly
{
  "top": 150,
  "left": 150,
  "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
  "right": 150,
  "bottom": 150,
  "prompt": "woods",
  "output_format": "webp",
  "output_quality": 80,
  "negative_prompt": "ugly"
}
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0",
  {
    input: {
      top: 150,
      left: 150,
      image: "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
      right: 150,
      bottom: 150,
      prompt: "woods",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: "ugly"
    }
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk:
fs.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0",
    input={
        "top": 150,
        "left": 150,
        "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
        "right": 150,
        "bottom": 150,
        "prompt": "woods",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": "ugly"
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0",
    "input": {
      "top": 150,
      "left": 150,
      "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png",
      "right": 150,
      "bottom": 150,
      "prompt": "woods",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "ugly"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
Output
{ "completed_at": "2024-07-27T15:14:24.228057Z", "created_at": "2024-07-27T15:07:25.882000Z", "data_removed": false, "error": null, "id": "ze068ybv79rge0cgyjxt65qq6r", "input": { "top": 150, "left": 150, "image": "https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png", "right": 150, "bottom": 150, "prompt": "woods", "output_format": "webp", "output_quality": 80, "negative_prompt": "ugly" }, "logs": "Random seed set to: 1550023963\nChecking inputs\nDownloading https://i.postimg.cc/MGbGzzvG/Comfy-UI-temp-gltth-00003-1.png to /tmp/inputs/Comfy-UI-temp-gltth-00003-1.png\n✅ /tmp/inputs/Comfy-UI-temp-gltth-00003-1.png\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 10, title: Pad Image for Outpainting, class type: ImagePadForOutpaint\nExecuting node 48, title: InvertMask, class type: InvertMask\nExecuting node 47, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 49, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 15, title: Apply ControlNet (Advanced), class type: ControlNetApplyAdvanced\nExecuting node 39, title: VAE Encode (for Inpainting), class type: VAEEncodeForInpaint\nRequested to load AutoencoderKL\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\nRequested to load ControlNet\nLoading 1 new model\n 0%| | 0/26 [00:00<?, 
?it/s]\n 4%|▍ | 1/26 [00:01<00:38, 1.56s/it]\n 8%|▊ | 2/26 [00:02<00:35, 1.48s/it]\n 12%|█▏ | 3/26 [00:04<00:33, 1.45s/it]\n 15%|█▌ | 4/26 [00:05<00:31, 1.44s/it]\n 19%|█▉ | 5/26 [00:07<00:30, 1.44s/it]\n 23%|██▎ | 6/26 [00:08<00:28, 1.44s/it]\n 27%|██▋ | 7/26 [00:10<00:27, 1.44s/it]\n 31%|███ | 8/26 [00:11<00:25, 1.44s/it]\n 35%|███▍ | 9/26 [00:13<00:24, 1.44s/it]\n 38%|███▊ | 10/26 [00:14<00:23, 1.44s/it]\n 42%|████▏ | 11/26 [00:15<00:21, 1.44s/it]\n 46%|████▌ | 12/26 [00:17<00:20, 1.44s/it]\n 50%|█████ | 13/26 [00:18<00:18, 1.45s/it]\n 54%|█████▍ | 14/26 [00:20<00:17, 1.45s/it]\n 58%|█████▊ | 15/26 [00:21<00:15, 1.45s/it]\n 62%|██████▏ | 16/26 [00:23<00:14, 1.45s/it]\n 65%|██████▌ | 17/26 [00:24<00:13, 1.46s/it]\n 69%|██████▉ | 18/26 [00:26<00:11, 1.46s/it]\n 73%|███████▎ | 19/26 [00:27<00:10, 1.45s/it]\n 77%|███████▋ | 20/26 [00:28<00:08, 1.45s/it]\n 81%|████████ | 21/26 [00:30<00:07, 1.46s/it]\n 85%|████████▍ | 22/26 [00:31<00:05, 1.46s/it]\n 88%|████████▊ | 23/26 [00:33<00:04, 1.47s/it]\n 92%|█████████▏| 24/26 [00:34<00:02, 1.47s/it]\n 96%|█████████▌| 25/26 [00:36<00:01, 1.47s/it]\n100%|██████████| 26/26 [00:37<00:00, 1.47s/it]\n100%|██████████| 26/26 [00:37<00:00, 1.45s/it]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 58, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 65, title: Save Image, class type: SaveImage\nPrompt executed in 59.14 seconds\noutputs: {'65': {'images': [{'filename': 'outpainted_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\noutpainted_00001_.png", "metrics": { "predict_time": 60.561539476, "total_time": 418.346057 }, "output": [ "https://replicate.delivery/czjl/iKXnwhEzD17iCppChlfGrrwtFdfEeTReCVrNieh0tRHFaglZC/outpainted_00001_.webp" ], "started_at": "2024-07-27T15:13:23.666518Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/ze068ybv79rge0cgyjxt65qq6r", "cancel": 
"https://api.replicate.com/v1/predictions/ze068ybv79rge0cgyjxt65qq6r/cancel" }, "version": "3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0" }
Prediction
emaph/outpaint-controlnet-union:377564d35153c66f8629d9540480813685d114f0552e9a3c9ffe5dd315091e68
ID: 0gk4cbbtq5rgj0cgza0rqaxtm8
Status: Succeeded
Source: Web
Hardware: A40
Input
- cfg: 4
- top: 0
- left: 400
- right: 400
- steps: 20
- bottom: 0
- prompt: japanese village, relaxing, anime drawing
- output_format: webp
- output_quality: 80
- negative_prompt: ugly
{
  "cfg": 4,
  "top": 0,
  "left": 400,
  "image": "https://replicate.delivery/pbxt/LLeEjiDlyLmiJLR2yh7rsiijobX9cWaTOvLTJfwqmcHFivr3/00008-81734364.png",
  "right": 400,
  "steps": 20,
  "bottom": 0,
  "prompt": "japanese village, relaxing, anime drawing",
  "output_format": "webp",
  "output_quality": 80,
  "negative_prompt": "ugly"
}
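This model version also exposes sampler settings (cfg, steps) alongside the padding inputs. When scripting many runs, one convenient pattern is layering per-request overrides over a shared base dict; a sketch where the base values are simply taken from this example, not documented defaults:

```python
# Shared settings reused across requests (values from this example).
BASE_INPUT = {
    "output_format": "webp",
    "output_quality": 80,
    "negative_prompt": "ugly",
}

def build_input(**overrides):
    """Merge per-request settings over the shared base (hypothetical helper)."""
    return {**BASE_INPUT, **overrides}

request = build_input(
    cfg=4, steps=20, top=0, bottom=0, left=400, right=400,
    prompt="japanese village, relaxing, anime drawing",
)
```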
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/outpaint-controlnet-union:377564d35153c66f8629d9540480813685d114f0552e9a3c9ffe5dd315091e68",
  {
    input: {
      cfg: 4,
      top: 0,
      left: 400,
      image: "https://replicate.delivery/pbxt/LLeEjiDlyLmiJLR2yh7rsiijobX9cWaTOvLTJfwqmcHFivr3/00008-81734364.png",
      right: 400,
      steps: 20,
      bottom: 0,
      prompt: "japanese village, relaxing, anime drawing",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: "ugly"
    }
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk:
fs.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Import the client:
import replicate
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/outpaint-controlnet-union:377564d35153c66f8629d9540480813685d114f0552e9a3c9ffe5dd315091e68",
    input={
        "cfg": 4,
        "top": 0,
        "left": 400,
        "image": "https://replicate.delivery/pbxt/LLeEjiDlyLmiJLR2yh7rsiijobX9cWaTOvLTJfwqmcHFivr3/00008-81734364.png",
        "right": 400,
        "steps": 20,
        "bottom": 0,
        "prompt": "japanese village, relaxing, anime drawing",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": "ugly"
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/outpaint-controlnet-union:377564d35153c66f8629d9540480813685d114f0552e9a3c9ffe5dd315091e68",
    "input": {
      "cfg": 4,
      "top": 0,
      "left": 400,
      "image": "https://replicate.delivery/pbxt/LLeEjiDlyLmiJLR2yh7rsiijobX9cWaTOvLTJfwqmcHFivr3/00008-81734364.png",
      "right": 400,
      "steps": 20,
      "bottom": 0,
      "prompt": "japanese village, relaxing, anime drawing",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "ugly"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
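The cURL call above posts a JSON body with a version and an input object to the predictions endpoint. A sketch of assembling that body programmatically (build_prediction_payload is a made-up helper, not part of any client library):

```python
import json


def build_prediction_payload(version: str, **inputs) -> str:
    """Serialize a request body for POST /v1/predictions."""
    return json.dumps({"version": version, "input": inputs})


body = build_prediction_payload(
    "emaph/outpaint-controlnet-union:377564d35153c66f8629d9540480813685d114f0552e9a3c9ffe5dd315091e68",
    cfg=4,
    left=400,
    right=400,
    prompt="japanese village, relaxing, anime drawing",
)
print(json.loads(body)["input"]["prompt"])
# → japanese village, relaxing, anime drawing
```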
Output
{ "completed_at": "2024-07-28T18:03:31.351739Z", "created_at": "2024-07-28T18:01:47.961000Z", "data_removed": false, "error": null, "id": "0gk4cbbtq5rgj0cgza0rqaxtm8", "input": { "cfg": 4, "top": 0, "left": 400, "image": "https://replicate.delivery/pbxt/LLeEjiDlyLmiJLR2yh7rsiijobX9cWaTOvLTJfwqmcHFivr3/00008-81734364.png", "right": 400, "steps": 20, "bottom": 0, "prompt": "japanese village, relaxing, anime drawing", "output_format": "webp", "output_quality": 80, "negative_prompt": "ugly" }, "logs": "Random seed set to: 577323934\nChecking inputs\n✅ /tmp/inputs/image.png\n====================================\nChecking weights\n⏳ Downloading juggernautXL_v8Rundiffusion.safetensors to ComfyUI/models/checkpoints\n✅ juggernautXL_v8Rundiffusion.safetensors downloaded to ComfyUI/models/checkpoints in 3.90s, size: 6776.19MB\n✅ diffusion_pytorch_model_promax.safetensors exists in ComfyUI/models/controlnet\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 10, title: Pad Image for Outpainting, class type: ImagePadForOutpaint\nExecuting node 48, title: InvertMask, class type: InvertMask\nExecuting node 47, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 49, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 15, title: Apply ControlNet (Advanced), class type: 
ControlNetApplyAdvanced\nExecuting node 39, title: VAE Encode (for Inpainting), class type: VAEEncodeForInpaint\nRequested to load AutoencoderKL\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\nRequested to load ControlNet\nLoading 1 new model\n 0%| | 0/20 [00:00<?, ?it/s]\n 5%|▌ | 1/20 [00:00<00:06, 3.14it/s]\n 10%|█ | 2/20 [00:00<00:04, 3.69it/s]\n 15%|█▌ | 3/20 [00:00<00:04, 3.94it/s]\n 20%|██ | 4/20 [00:01<00:03, 4.06it/s]\n 25%|██▌ | 5/20 [00:01<00:03, 4.11it/s]\n 30%|███ | 6/20 [00:01<00:03, 4.16it/s]\n 35%|███▌ | 7/20 [00:01<00:03, 4.19it/s]\n 40%|████ | 8/20 [00:01<00:02, 4.20it/s]\n 45%|████▌ | 9/20 [00:02<00:02, 4.22it/s]\n 50%|█████ | 10/20 [00:02<00:02, 4.23it/s]\n 55%|█████▌ | 11/20 [00:02<00:02, 4.25it/s]\n 60%|██████ | 12/20 [00:02<00:01, 4.26it/s]\n 65%|██████▌ | 13/20 [00:03<00:01, 4.26it/s]\n 70%|███████ | 14/20 [00:03<00:01, 4.26it/s]\n 75%|███████▌ | 15/20 [00:03<00:01, 4.27it/s]\n 80%|████████ | 16/20 [00:03<00:00, 4.27it/s]\n 85%|████████▌ | 17/20 [00:04<00:00, 4.27it/s]\n 90%|█████████ | 18/20 [00:04<00:00, 4.27it/s]\n 95%|█████████▌| 19/20 [00:04<00:00, 4.27it/s]\n100%|██████████| 20/20 [00:04<00:00, 4.27it/s]\n100%|██████████| 20/20 [00:04<00:00, 4.19it/s]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 60, title: Save Image, class type: SaveImage\nPrompt executed in 10.48 seconds\noutputs: {'60': {'images': [{'filename': 'outpainted_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\noutpainted_00001_.png", "metrics": { "predict_time": 16.165886814, "total_time": 103.390739 }, "output": [ "https://replicate.delivery/pbxt/dFQ3cBfGWNztViJlIeOfrfQiPjPFfDRjx4qCGIbGlX8Se5QzE/outpainted_00001_.webp" ], "started_at": "2024-07-28T18:03:15.185852Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/0gk4cbbtq5rgj0cgza0rqaxtm8", "cancel": "https://api.replicate.com/v1/predictions/0gk4cbbtq5rgj0cgza0rqaxtm8/cancel" }, 
"version": "377564d35153c66f8629d9540480813685d114f0552e9a3c9ffe5dd315091e68" }
Prediction
emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0
ID: 6fvgec3bx9rge0cgyq2vpehgam · Status: Succeeded · Source: Web · Hardware: T4
Input
- top: 150
- left: 150
- image: https://i.postimg.cc/PrR5JK6c/Comfy-UI-temp-xdxpz-00001.png
- right: 150
- bottom: 150
- prompt: grass field, sunset, clouds
- output_format: webp
- output_quality: 80
- negative_prompt: ugly
{ "top": 150, "left": 150, "image": "https://i.postimg.cc/PrR5JK6c/Comfy-UI-temp-xdxpz-00001.png", "right": 150, "bottom": 150, "prompt": "grass field, sunset, clouds", "output_format": "webp", "output_quality": 80, "negative_prompt": "ugly" }
Install Replicate’s Node.js client library:
npm install replicate
Import and set up the client:
import Replicate from "replicate";
const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0",
  {
    input: {
      top: 150,
      left: 150,
      image: "https://i.postimg.cc/PrR5JK6c/Comfy-UI-temp-xdxpz-00001.png",
      right: 150,
      bottom: 150,
      prompt: "grass field, sunset, clouds",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: "ugly"
    }
  }
);

// To access the file URL:
console.log(output[0].url());
//=> "http://example.com"

// To write the file to disk (needs: import fs from "node:fs/promises"):
await fs.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:
pip install replicate
Import the client:
import replicate
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0",
    input={
        "top": 150,
        "left": 150,
        "image": "https://i.postimg.cc/PrR5JK6c/Comfy-UI-temp-xdxpz-00001.png",
        "right": 150,
        "bottom": 150,
        "prompt": "grass field, sunset, clouds",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": "ugly"
    }
)
print(output)
To learn more, take a look at the guide on getting started with Python.
Run emaph/outpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/outpaint-controlnet-union:3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0",
    "input": {
      "top": 150,
      "left": 150,
      "image": "https://i.postimg.cc/PrR5JK6c/Comfy-UI-temp-xdxpz-00001.png",
      "right": 150,
      "bottom": 150,
      "prompt": "grass field, sunset, clouds",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "ugly"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
Output
{ "completed_at": "2024-07-27T20:03:17.987029Z", "created_at": "2024-07-27T19:57:54.538000Z", "data_removed": false, "error": null, "id": "6fvgec3bx9rge0cgyq2vpehgam", "input": { "top": 150, "left": 150, "image": "https://i.postimg.cc/PrR5JK6c/Comfy-UI-temp-xdxpz-00001.png", "right": 150, "bottom": 150, "prompt": "grass field, sunset, clouds", "output_format": "webp", "output_quality": 80, "negative_prompt": "ugly" }, "logs": "Random seed set to: 3062084987\nChecking inputs\nDownloading https://i.postimg.cc/PrR5JK6c/Comfy-UI-temp-xdxpz-00001.png to /tmp/inputs/Comfy-UI-temp-xdxpz-00001.png\n✅ /tmp/inputs/Comfy-UI-temp-xdxpz-00001.png\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 10, title: Pad Image for Outpainting, class type: ImagePadForOutpaint\nExecuting node 48, title: InvertMask, class type: InvertMask\nExecuting node 47, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 49, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 15, title: Apply ControlNet (Advanced), class type: ControlNetApplyAdvanced\nExecuting node 39, title: VAE Encode (for Inpainting), class type: VAEEncodeForInpaint\nRequested to load AutoencoderKL\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\nRequested to load ControlNet\nLoading 1 new model\n 0%| | 
0/26 [00:00<?, ?it/s]\n 4%|▍ | 1/26 [00:01<00:37, 1.48s/it]\n 8%|▊ | 2/26 [00:02<00:34, 1.43s/it]\n 12%|█▏ | 3/26 [00:04<00:32, 1.41s/it]\n 15%|█▌ | 4/26 [00:05<00:30, 1.40s/it]\n 19%|█▉ | 5/26 [00:07<00:29, 1.40s/it]\n 23%|██▎ | 6/26 [00:08<00:27, 1.40s/it]\n 27%|██▋ | 7/26 [00:09<00:26, 1.40s/it]\n 31%|███ | 8/26 [00:11<00:25, 1.40s/it]\n 35%|███▍ | 9/26 [00:12<00:23, 1.41s/it]\n 38%|███▊ | 10/26 [00:14<00:22, 1.40s/it]\n 42%|████▏ | 11/26 [00:15<00:21, 1.41s/it]\n 46%|████▌ | 12/26 [00:16<00:19, 1.41s/it]\n 50%|█████ | 13/26 [00:18<00:18, 1.41s/it]\n 54%|█████▍ | 14/26 [00:19<00:16, 1.41s/it]\n 58%|█████▊ | 15/26 [00:21<00:15, 1.42s/it]\n 62%|██████▏ | 16/26 [00:22<00:14, 1.42s/it]\n 65%|██████▌ | 17/26 [00:23<00:12, 1.42s/it]\n 69%|██████▉ | 18/26 [00:25<00:11, 1.43s/it]\n 73%|███████▎ | 19/26 [00:26<00:10, 1.43s/it]\n 77%|███████▋ | 20/26 [00:28<00:08, 1.44s/it]\n 81%|████████ | 21/26 [00:29<00:07, 1.44s/it]\n 85%|████████▍ | 22/26 [00:31<00:05, 1.45s/it]\n 88%|████████▊ | 23/26 [00:32<00:04, 1.45s/it]\n 92%|█████████▏| 24/26 [00:34<00:02, 1.46s/it]\n 96%|█████████▌| 25/26 [00:35<00:01, 1.46s/it]\n100%|██████████| 26/26 [00:37<00:00, 1.47s/it]\n100%|██████████| 26/26 [00:37<00:00, 1.43s/it]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 58, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 65, title: Save Image, class type: SaveImage\nPrompt executed in 53.09 seconds\noutputs: {'65': {'images': [{'filename': 'outpainted_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\noutpainted_00001_.png", "metrics": { "predict_time": 54.461539476, "total_time": 323.449029 }, "output": [ "https://replicate.delivery/czjl/cBsfmDDMoEzOdy4bF517fnm4XxVKns8vNR4dJx3ODv8FSwMTA/outpainted_00001_.webp" ], "started_at": "2024-07-27T20:02:23.525489Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/6fvgec3bx9rge0cgyq2vpehgam", "cancel": 
"https://api.replicate.com/v1/predictions/6fvgec3bx9rge0cgyq2vpehgam/cancel" }, "version": "3947eb8cde7f5953837f7f25b5b8ae7bb4e0c65ba38d1662d516c537cd6785e0" }