emaph / inpaint-controlnet-union
Inpaint a selected area of an image using ControlNet Union for SDXL.
Prediction
emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198
ID: 2f9ncrtbw1rgg0cgzamratmkr8
Status: Succeeded
Source: Web
Hardware: A40
Input
{
  "cfg": 4,
  "mask": "https://replicate.delivery/pbxt/LLetvzB8jNw1kXnmMlneRpJjzPCUdnWN9FVDGVnetf8ALv1t/inpaint-mask-real.png",
  "image": "https://replicate.delivery/pbxt/LLetwiUV4Xhhxk0Goa7MZ4gWPON2yOLHwj5RNH4Bc6uTQ3jS/inpaint-full-real.png",
  "steps": 20,
  "prompt": "a dolphin from the ocean waves, sunset",
  "output_format": "webp",
  "output_quality": 80,
  "negative_prompt": ""
}
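The page does not spell out the mask semantics, but by the usual inpainting convention white pixels mark the region to repaint and black pixels are preserved. A minimal sketch of that selection rule (illustration only, not the model's code):

```python
# Hypothetical sketch of binary inpaint-mask semantics: pixels at or above
# the threshold (white) are repainted, pixels below it (black) are preserved.
def select_inpaint_pixels(mask, threshold=128):
    """Return (row, col) coordinates whose mask value meets the threshold."""
    return [(r, c)
            for r, row in enumerate(mask)
            for c, v in enumerate(row)
            if v >= threshold]

# A tiny 3x3 mask: only three white pixels get repainted.
mask = [
    [0,   0, 255],
    [0, 255, 255],
    [0,   0,   0],
]
print(select_inpaint_pixels(mask))  # -> [(0, 2), (1, 1), (1, 2)]
```

Anti-aliased mask edges produce intermediate values, which is why a threshold rather than an exact equality check is used here.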
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run emaph/inpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198",
  {
    input: {
      cfg: 4,
      mask: "https://replicate.delivery/pbxt/LLetvzB8jNw1kXnmMlneRpJjzPCUdnWN9FVDGVnetf8ALv1t/inpaint-mask-real.png",
      image: "https://replicate.delivery/pbxt/LLetwiUV4Xhhxk0Goa7MZ4gWPON2yOLHwj5RNH4Bc6uTQ3jS/inpaint-full-real.png",
      steps: 20,
      prompt: "a dolphin from the ocean waves, sunset",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: ""
    }
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk:
fs.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate
Import the client:

import replicate
Run emaph/inpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198",
    input={
        "cfg": 4,
        "mask": "https://replicate.delivery/pbxt/LLetvzB8jNw1kXnmMlneRpJjzPCUdnWN9FVDGVnetf8ALv1t/inpaint-mask-real.png",
        "image": "https://replicate.delivery/pbxt/LLetwiUV4Xhhxk0Goa7MZ4gWPON2yOLHwj5RNH4Bc6uTQ3jS/inpaint-full-real.png",
        "steps": 20,
        "prompt": "a dolphin from the ocean waves, sunset",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": ""
    }
)

# To access the file URL:
print(output[0].url())  #=> "http://example.com"

# To write the file to disk:
with open("my-image.png", "wb") as file:
    file.write(output[0].read())
To learn more, take a look at the guide on getting started with Python.
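Before calling `replicate.run`, it can help to assemble and validate the input dict locally so a malformed request fails fast instead of spending a prediction. A minimal sketch; the required/optional split and the defaults below are assumptions inferred from the example inputs on this page, not the model's published schema:

```python
# Hypothetical pre-flight helper: REQUIRED and DEFAULTS are inferred from the
# example inputs shown above, not taken from the model's schema.
REQUIRED = {"image", "mask", "prompt"}
DEFAULTS = {
    "cfg": 4,
    "steps": 20,
    "output_format": "webp",
    "output_quality": 80,
    "negative_prompt": "",
}

def build_input(**kwargs):
    """Merge caller-supplied inputs over the defaults, failing fast if a
    required field is missing."""
    missing = REQUIRED - kwargs.keys()
    if missing:
        raise ValueError(f"missing required inputs: {sorted(missing)}")
    return {**DEFAULTS, **kwargs}

inp = build_input(
    image="https://example.com/full.png",  # placeholder URL
    mask="https://example.com/mask.png",   # placeholder URL
    prompt="a dolphin from the ocean waves, sunset",
)
print(inp["steps"])  # -> 20
```

The resulting dict can then be passed directly as the `input=` argument to `replicate.run`.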
Run emaph/inpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198",
    "input": {
      "cfg": 4,
      "mask": "https://replicate.delivery/pbxt/LLetvzB8jNw1kXnmMlneRpJjzPCUdnWN9FVDGVnetf8ALv1t/inpaint-mask-real.png",
      "image": "https://replicate.delivery/pbxt/LLetwiUV4Xhhxk0Goa7MZ4gWPON2yOLHwj5RNH4Bc6uTQ3jS/inpaint-full-real.png",
      "steps": 20,
      "prompt": "a dolphin from the ocean waves, sunset",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": ""
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
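When calling the HTTP API directly, the response is a prediction object shaped like the JSON shown under "Output". A small sketch of extracting the delivery URL from that response (the URL here is a placeholder, not a real output):

```python
import json

# Pull the first output file URL out of a prediction response. Succeeded
# predictions carry a list of delivery URLs under the "output" key.
def first_output_url(prediction):
    status = prediction.get("status")
    if status != "succeeded":
        raise RuntimeError(f"prediction not finished: {status}")
    return prediction["output"][0]

# Placeholder response body standing in for the real API reply.
raw = '{"status": "succeeded", "output": ["https://example.com/inpainted_00001_.webp"]}'
print(first_output_url(json.loads(raw)))  # -> https://example.com/inpainted_00001_.webp
```

Without the `Prefer: wait` header the initial response may still be in the `starting` or `processing` state, in which case the prediction's `urls.get` endpoint should be polled until it settles.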
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/emaph/inpaint-controlnet-union@sha256:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198 \
  -i 'cfg=4' \
  -i 'mask="https://replicate.delivery/pbxt/LLetvzB8jNw1kXnmMlneRpJjzPCUdnWN9FVDGVnetf8ALv1t/inpaint-mask-real.png"' \
  -i 'image="https://replicate.delivery/pbxt/LLetwiUV4Xhhxk0Goa7MZ4gWPON2yOLHwj5RNH4Bc6uTQ3jS/inpaint-full-real.png"' \
  -i 'steps=20' \
  -i 'prompt="a dolphin from the ocean waves, sunset"' \
  -i 'output_format="webp"' \
  -i 'output_quality=80' \
  -i 'negative_prompt=""'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/emaph/inpaint-controlnet-union@sha256:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "cfg": 4,
      "mask": "https://replicate.delivery/pbxt/LLetvzB8jNw1kXnmMlneRpJjzPCUdnWN9FVDGVnetf8ALv1t/inpaint-mask-real.png",
      "image": "https://replicate.delivery/pbxt/LLetwiUV4Xhhxk0Goa7MZ4gWPON2yOLHwj5RNH4Bc6uTQ3jS/inpaint-full-real.png",
      "steps": 20,
      "prompt": "a dolphin from the ocean waves, sunset",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": ""
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
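As the curl example above suggests, the local Cog server takes the inputs nested under an `input` key, with no `version` field, since the container already pins one model version. A small sketch of building that request body programmatically:

```python
import json

# Build the JSON body the local Cog server expects at
# POST http://localhost:5000/predictions (shape assumed from the
# curl example above): inputs nested under "input", no "version" key.
def cog_request_body(**inputs):
    return json.dumps({"input": inputs}).encode("utf-8")

body = cog_request_body(
    prompt="a dolphin from the ocean waves, sunset",
    steps=20,
    cfg=4,
)
print(json.loads(body)["input"]["steps"])  # -> 20
```

The resulting bytes can be posted with any HTTP client (for example `urllib.request` with a `Content-Type: application/json` header) once the Docker container is running.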
Output
{ "completed_at": "2024-07-28T18:47:07.656359Z", "created_at": "2024-07-28T18:45:17.408000Z", "data_removed": false, "error": null, "id": "2f9ncrtbw1rgg0cgzamratmkr8", "input": { "cfg": 4, "mask": "https://replicate.delivery/pbxt/LLetvzB8jNw1kXnmMlneRpJjzPCUdnWN9FVDGVnetf8ALv1t/inpaint-mask-real.png", "image": "https://replicate.delivery/pbxt/LLetwiUV4Xhhxk0Goa7MZ4gWPON2yOLHwj5RNH4Bc6uTQ3jS/inpaint-full-real.png", "steps": 20, "prompt": "a dolphin from the ocean waves, sunset", "output_format": "webp", "output_quality": 80, "negative_prompt": "" }, "logs": "Random seed set to: 1242870280\nChecking inputs\n✅ /tmp/inputs/image.png\n✅ /tmp/inputs/mask.png\n====================================\nChecking weights\n✅ diffusion_pytorch_model_promax.safetensors exists in ComfyUI/models/controlnet\n✅ juggernautXL_v8Rundiffusion.safetensors exists in ComfyUI/models/checkpoints\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 71, title: Load Image Mask, class type: LoadImage\nExecuting node 50, title: InvertMask, class type: InvertMask\nExecuting node 48, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 54, title: ImageCompositeMasked, class type: ImageCompositeMasked\nRequested to load AutoencoderKL\nExecuting node 15, title: Apply ControlNet (Advanced), class type: ControlNetApplyAdvanced\nLoading 1 
new model\nExecuting node 51, title: VAE Encode, class type: VAEEncode\nRequested to load ControlNet\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\n 0%| | 0/20 [00:00<?, ?it/s]\n 5%|▌ | 1/20 [00:00<00:06, 2.92it/s]\n 10%|█ | 2/20 [00:00<00:05, 3.32it/s]\n 15%|█▌ | 3/20 [00:00<00:04, 3.49it/s]\n 20%|██ | 4/20 [00:01<00:04, 3.58it/s]\n 25%|██▌ | 5/20 [00:01<00:04, 3.63it/s]\n 30%|███ | 6/20 [00:01<00:03, 3.65it/s]\n 35%|███▌ | 7/20 [00:01<00:03, 3.67it/s]\n 40%|████ | 8/20 [00:02<00:03, 3.68it/s]\n 45%|████▌ | 9/20 [00:02<00:02, 3.69it/s]\n 50%|█████ | 10/20 [00:02<00:02, 3.69it/s]\n 55%|█████▌ | 11/20 [00:03<00:02, 3.70it/s]\n 60%|██████ | 12/20 [00:03<00:02, 3.70it/s]\n 65%|██████▌ | 13/20 [00:03<00:01, 3.70it/s]\n 70%|███████ | 14/20 [00:03<00:01, 3.70it/s]\n 75%|███████▌ | 15/20 [00:04<00:01, 3.70it/s]\n 80%|████████ | 16/20 [00:04<00:01, 3.70it/s]\n 85%|████████▌ | 17/20 [00:04<00:00, 3.70it/s]\n 90%|█████████ | 18/20 [00:04<00:00, 3.70it/s]\n 95%|█████████▌| 19/20 [00:05<00:00, 3.69it/s]\n100%|██████████| 20/20 [00:05<00:00, 3.69it/s]\n100%|██████████| 20/20 [00:05<00:00, 3.65it/s]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 74, title: Save Image, class type: SaveImage\nPrompt executed in 9.33 seconds\noutputs: {'74': {'images': [{'filename': 'inpainted_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\ninpainted_00001_.png", "metrics": { "predict_time": 11.740362831, "total_time": 110.248359 }, "output": [ "https://replicate.delivery/pbxt/3dhj4lQa3OanF9G1pliXsqU2jdzyHmMoEbHkwKpgE3lKERzE/inpainted_00001_.webp" ], "started_at": "2024-07-28T18:46:55.915996Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/2f9ncrtbw1rgg0cgzamratmkr8", "cancel": "https://api.replicate.com/v1/predictions/2f9ncrtbw1rgg0cgzamratmkr8/cancel" }, "version": "ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198" }
Prediction
emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198
ID: j0n5szsptnrgg0cgzp08t98sar
Status: Succeeded
Source: Web
Hardware: A40
Input
{
  "cfg": 4,
  "mask": "https://replicate.delivery/pbxt/LLr2HCNTBnKtFrEhBR2uLadsTnkHuGxDvHvZAWeTiznYobFc/header-mask.png",
  "image": "https://replicate.delivery/pbxt/LLr2HPn4G84jrgoHfLGJpRzHnVTgqXNtshyqsSUth9ng8SyF/2024-03-29_12-46-01_4057.png",
  "steps": 20,
  "prompt": "portrait of a man wearing a suit with a red tie, colorful background",
  "output_format": "webp",
  "output_quality": 80,
  "negative_prompt": "ugly, deformed"
}
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run emaph/inpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198",
  {
    input: {
      cfg: 4,
      mask: "https://replicate.delivery/pbxt/LLr2HCNTBnKtFrEhBR2uLadsTnkHuGxDvHvZAWeTiznYobFc/header-mask.png",
      image: "https://replicate.delivery/pbxt/LLr2HPn4G84jrgoHfLGJpRzHnVTgqXNtshyqsSUth9ng8SyF/2024-03-29_12-46-01_4057.png",
      steps: 20,
      prompt: "portrait of a man wearing a suit with a red tie, colorful background",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: "ugly, deformed"
    }
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk:
fs.writeFile("my-image.png", output[0]);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate
Import the client:

import replicate
Run emaph/inpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198",
    input={
        "cfg": 4,
        "mask": "https://replicate.delivery/pbxt/LLr2HCNTBnKtFrEhBR2uLadsTnkHuGxDvHvZAWeTiznYobFc/header-mask.png",
        "image": "https://replicate.delivery/pbxt/LLr2HPn4G84jrgoHfLGJpRzHnVTgqXNtshyqsSUth9ng8SyF/2024-03-29_12-46-01_4057.png",
        "steps": 20,
        "prompt": "portrait of a man wearing a suit with a red tie, colorful background",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": "ugly, deformed"
    }
)

# To access the file URL:
print(output[0].url())  #=> "http://example.com"

# To write the file to disk:
with open("my-image.png", "wb") as file:
    file.write(output[0].read())
To learn more, take a look at the guide on getting started with Python.
Run emaph/inpaint-controlnet-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/inpaint-controlnet-union:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198",
    "input": {
      "cfg": 4,
      "mask": "https://replicate.delivery/pbxt/LLr2HCNTBnKtFrEhBR2uLadsTnkHuGxDvHvZAWeTiznYobFc/header-mask.png",
      "image": "https://replicate.delivery/pbxt/LLr2HPn4G84jrgoHfLGJpRzHnVTgqXNtshyqsSUth9ng8SyF/2024-03-29_12-46-01_4057.png",
      "steps": 20,
      "prompt": "portrait of a man wearing a suit with a red tie, colorful background",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "ugly, deformed"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/emaph/inpaint-controlnet-union@sha256:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198 \
  -i 'cfg=4' \
  -i 'mask="https://replicate.delivery/pbxt/LLr2HCNTBnKtFrEhBR2uLadsTnkHuGxDvHvZAWeTiznYobFc/header-mask.png"' \
  -i 'image="https://replicate.delivery/pbxt/LLr2HPn4G84jrgoHfLGJpRzHnVTgqXNtshyqsSUth9ng8SyF/2024-03-29_12-46-01_4057.png"' \
  -i 'steps=20' \
  -i 'prompt="portrait of a man wearing a suit with a red tie, colorful background"' \
  -i 'output_format="webp"' \
  -i 'output_quality=80' \
  -i 'negative_prompt="ugly, deformed"'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/emaph/inpaint-controlnet-union@sha256:ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "cfg": 4,
      "mask": "https://replicate.delivery/pbxt/LLr2HCNTBnKtFrEhBR2uLadsTnkHuGxDvHvZAWeTiznYobFc/header-mask.png",
      "image": "https://replicate.delivery/pbxt/LLr2HPn4G84jrgoHfLGJpRzHnVTgqXNtshyqsSUth9ng8SyF/2024-03-29_12-46-01_4057.png",
      "steps": 20,
      "prompt": "portrait of a man wearing a suit with a red tie, colorful background",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "ugly, deformed"
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
{ "completed_at": "2024-07-29T08:01:05.890662Z", "created_at": "2024-07-29T07:59:16.693000Z", "data_removed": false, "error": null, "id": "j0n5szsptnrgg0cgzp08t98sar", "input": { "cfg": 4, "mask": "https://replicate.delivery/pbxt/LLr2HCNTBnKtFrEhBR2uLadsTnkHuGxDvHvZAWeTiznYobFc/header-mask.png", "image": "https://replicate.delivery/pbxt/LLr2HPn4G84jrgoHfLGJpRzHnVTgqXNtshyqsSUth9ng8SyF/2024-03-29_12-46-01_4057.png", "steps": 20, "prompt": "portrait of a man wearing a suit with a red tie, colorful background", "output_format": "webp", "output_quality": 80, "negative_prompt": "ugly, deformed" }, "logs": "Random seed set to: 951636176\nChecking inputs\n✅ /tmp/inputs/image.png\n✅ /tmp/inputs/mask.png\n====================================\nChecking weights\n✅ juggernautXL_v8Rundiffusion.safetensors exists in ComfyUI/models/checkpoints\n✅ diffusion_pytorch_model_promax.safetensors exists in ComfyUI/models/controlnet\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 71, title: Load Image Mask, class type: LoadImage\nExecuting node 50, title: InvertMask, class type: InvertMask\nExecuting node 48, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 54, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 15, title: Apply ControlNet (Advanced), class type: 
ControlNetApplyAdvanced\nExecuting node 51, title: VAE Encode, class type: VAEEncode\nRequested to load AutoencoderKL\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\nRequested to load ControlNet\nLoading 1 new model\n 0%| | 0/20 [00:00<?, ?it/s]\n 5%|▌ | 1/20 [00:00<00:06, 2.79it/s]\n 10%|█ | 2/20 [00:00<00:05, 3.20it/s]\n 15%|█▌ | 3/20 [00:00<00:05, 3.39it/s]\n 20%|██ | 4/20 [00:01<00:04, 3.49it/s]\n 25%|██▌ | 5/20 [00:01<00:04, 3.55it/s]\n 30%|███ | 6/20 [00:01<00:03, 3.58it/s]\n 35%|███▌ | 7/20 [00:02<00:03, 3.60it/s]\n 40%|████ | 8/20 [00:02<00:03, 3.61it/s]\n 45%|████▌ | 9/20 [00:02<00:03, 3.62it/s]\n 50%|█████ | 10/20 [00:02<00:02, 3.63it/s]\n 55%|█████▌ | 11/20 [00:03<00:02, 3.64it/s]\n 60%|██████ | 12/20 [00:03<00:02, 3.64it/s]\n 65%|██████▌ | 13/20 [00:03<00:01, 3.64it/s]\n 70%|███████ | 14/20 [00:03<00:01, 3.64it/s]\n 75%|███████▌ | 15/20 [00:04<00:01, 3.64it/s]\n 80%|████████ | 16/20 [00:04<00:01, 3.50it/s]\n 85%|████████▌ | 17/20 [00:04<00:00, 3.54it/s]\n 90%|█████████ | 18/20 [00:05<00:00, 3.57it/s]\n 95%|█████████▌| 19/20 [00:05<00:00, 3.59it/s]\n100%|██████████| 20/20 [00:05<00:00, 3.60it/s]\n100%|██████████| 20/20 [00:05<00:00, 3.56it/s]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 74, title: Save Image, class type: SaveImage\nPrompt executed in 11.09 seconds\noutputs: {'74': {'images': [{'filename': 'inpainted_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\ninpainted_00001_.png", "metrics": { "predict_time": 13.635629639, "total_time": 109.197662 }, "output": [ "https://replicate.delivery/pbxt/uFoE1nACwpJqKBzihkZFyLUMIWrfGu4AsouuZmmkUotg8nmJA/inpainted_00001_.webp" ], "started_at": "2024-07-29T08:00:52.255032Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/j0n5szsptnrgg0cgzp08t98sar", "cancel": "https://api.replicate.com/v1/predictions/j0n5szsptnrgg0cgzp08t98sar/cancel" }, "version": 
"ac1ab4b5a10920a8abc96c6b4fdc6cf09cc9b051256f8234c235c3c0efb63198" }