emaph / in-and-outpaint-union
Combines inpainting and outpainting for image editing.
- Public
- 301 runs
- Hardware: L40S

Prediction

emaph/in-and-outpaint-union:3a23360cf7a5675e7f812b477f8c6b0e02266ddc23dbebb3acfa9abd46042f69
ID: 739bc641dsrgj0cgzc7s09k8ar
Status: Succeeded
Source: Web
Hardware: A40
Total duration:
Created:

Input
- cfg: 4
- top: 150
- left: 300
- right: 300
- steps: 20
- bottom: 150
- prompt: a man wearing a pink shirt, in the middle of a foggy new york, skyscrapers on the side
- output_format: webp
- output_quality: 80
- negative_prompt: text, watermark
{ "cfg": 4, "top": 150, "left": 300, "mask": "https://replicate.delivery/pbxt/LLgbikVFw1CPcptesJFntZIbdDTdJgT96TgSkmInPPiAwQAp/clipspace-mask-36011.png", "image": "https://replicate.delivery/pbxt/LLgbi97wIWxt0nUNa0wu9lZbSb0UjRnkX7Xijy6NNmZaTEdL/inoutin.png", "right": 300, "steps": 20, "bottom": 150, "prompt": "a man wearing a pink shirt, in the middle of a foggy new york, skyscrapers on the side", "output_format": "webp", "output_quality": 80, "negative_prompt": "text, watermark" }
Install Replicate’s Node.js client library:

npm install replicate

Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
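Every snippet on this page reads your API token from the REPLICATE_API_TOKEN environment variable, so export it in your shell first (the value below is a placeholder):

export REPLICATE_API_TOKEN=<paste-your-token>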
Run emaph/in-and-outpaint-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "emaph/in-and-outpaint-union:3a23360cf7a5675e7f812b477f8c6b0e02266ddc23dbebb3acfa9abd46042f69",
  {
    input: {
      cfg: 4,
      top: 150,
      left: 300,
      mask: "https://replicate.delivery/pbxt/LLgbikVFw1CPcptesJFntZIbdDTdJgT96TgSkmInPPiAwQAp/clipspace-mask-36011.png",
      image: "https://replicate.delivery/pbxt/LLgbi97wIWxt0nUNa0wu9lZbSb0UjRnkX7Xijy6NNmZaTEdL/inoutin.png",
      right: 300,
      steps: 20,
      bottom: 150,
      prompt: "a man wearing a pink shirt, in the middle of a foggy new york, skyscrapers on the side",
      output_format: "webp",
      output_quality: 80,
      negative_prompt: "text, watermark",
    },
  }
);

// To access the file URL:
console.log(output[0].url()); //=> "http://example.com"

// To write the file to disk. The callback-style fs.writeFile requires a
// callback and does not accept streams, so use the promise-based API,
// which accepts the stream-like output object:
await fs.promises.writeFile("my-image.png", output[0]);
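If you'd rather not block inside replicate.run, here is a sketch using the client's predictions API (assuming replicate.predictions.create and replicate.wait behave as in recent versions of the Node.js client):

// Sketch: create the prediction without blocking, then wait for it to finish.
// Assumes the predictions API of recent replicate-js versions.
const prediction = await replicate.predictions.create({
  version: "3a23360cf7a5675e7f812b477f8c6b0e02266ddc23dbebb3acfa9abd46042f69",
  input: {
    prompt: "a man wearing a pink shirt, in the middle of a foggy new york, skyscrapers on the side",
    // ...same remaining inputs as in the replicate.run example above
  },
});
const finished = await replicate.wait(prediction);
console.log(finished.status, finished.output);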
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Import the client:

import replicate
Run emaph/in-and-outpaint-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "emaph/in-and-outpaint-union:3a23360cf7a5675e7f812b477f8c6b0e02266ddc23dbebb3acfa9abd46042f69",
    input={
        "cfg": 4,
        "top": 150,
        "left": 300,
        "mask": "https://replicate.delivery/pbxt/LLgbikVFw1CPcptesJFntZIbdDTdJgT96TgSkmInPPiAwQAp/clipspace-mask-36011.png",
        "image": "https://replicate.delivery/pbxt/LLgbi97wIWxt0nUNa0wu9lZbSb0UjRnkX7Xijy6NNmZaTEdL/inoutin.png",
        "right": 300,
        "steps": 20,
        "bottom": 150,
        "prompt": "a man wearing a pink shirt, in the middle of a foggy new york, skyscrapers on the side",
        "output_format": "webp",
        "output_quality": 80,
        "negative_prompt": "text, watermark",
    },
)
print(output)
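For this model, output is a list containing a single image URL (see the Output section below). A minimal sketch for saving the image locally, assuming the output items stringify to their URLs (the local filename is arbitrary):

import urllib.request

# `output` is a list with one image URL for this model (see Output below).
# str() covers both plain URL strings and client versions that wrap outputs
# in file objects whose string form is the URL.
url = str(output[0])
urllib.request.urlretrieve(url, "edited_00001_.webp")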
To learn more, take a look at the guide on getting started with Python.
Run emaph/in-and-outpaint-union using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "emaph/in-and-outpaint-union:3a23360cf7a5675e7f812b477f8c6b0e02266ddc23dbebb3acfa9abd46042f69",
    "input": {
      "cfg": 4,
      "top": 150,
      "left": 300,
      "mask": "https://replicate.delivery/pbxt/LLgbikVFw1CPcptesJFntZIbdDTdJgT96TgSkmInPPiAwQAp/clipspace-mask-36011.png",
      "image": "https://replicate.delivery/pbxt/LLgbi97wIWxt0nUNa0wu9lZbSb0UjRnkX7Xijy6NNmZaTEdL/inoutin.png",
      "right": 300,
      "steps": 20,
      "bottom": 150,
      "prompt": "a man wearing a pink shirt, in the middle of a foggy new york, skyscrapers on the side",
      "output_format": "webp",
      "output_quality": 80,
      "negative_prompt": "text, watermark"
    }
  }' \
  https://api.replicate.com/v1/predictions
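The Prefer: wait header holds the connection open until the prediction completes (or the wait times out). Either way, you can fetch the prediction's current state by its id, using the urls.get endpoint shown in the output below:

curl -s \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  https://api.replicate.com/v1/predictions/739bc641dsrgj0cgzc7s09k8ar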
To learn more, take a look at Replicate’s HTTP API reference docs.
Output
{ "completed_at": "2024-07-28T20:39:55.149341Z", "created_at": "2024-07-28T20:36:55.790000Z", "data_removed": false, "error": null, "id": "739bc641dsrgj0cgzc7s09k8ar", "input": { "cfg": 4, "top": 150, "left": 300, "mask": "https://replicate.delivery/pbxt/LLgbikVFw1CPcptesJFntZIbdDTdJgT96TgSkmInPPiAwQAp/clipspace-mask-36011.png", "image": "https://replicate.delivery/pbxt/LLgbi97wIWxt0nUNa0wu9lZbSb0UjRnkX7Xijy6NNmZaTEdL/inoutin.png", "right": 300, "steps": 20, "bottom": 150, "prompt": "a man wearing a pink shirt, in the middle of a foggy new york, skyscrapers on the side", "output_format": "webp", "output_quality": 80, "negative_prompt": "text, watermark" }, "logs": "Random seed set to: 286931823\nChecking inputs\n✅ /tmp/inputs/image.png\n✅ /tmp/inputs/mask.png\n====================================\nChecking weights\n✅ diffusion_pytorch_model_promax.safetensors exists in ComfyUI/models/controlnet\n✅ juggernautXL_v8Rundiffusion.safetensors exists in ComfyUI/models/checkpoints\n====================================\nRunning workflow\ngot prompt\nExecuting node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple\nmodel_type EPS\nUsing pytorch attention in VAE\nUsing pytorch attention in VAE\nloaded straight to GPU\nRequested to load SDXL\nLoading 1 new model\nExecuting node 6, title: Positive, class type: CLIPTextEncode\nRequested to load SDXLClipModel\nLoading 1 new model\nExecuting node 7, title: Negetive, class type: CLIPTextEncode\nExecuting node 16, title: Load ControlNet Model, class type: ControlNetLoader\nExecuting node 17, title: SetUnionControlNetType, class type: SetUnionControlNetType\nExecuting node 11, title: Load Image, class type: LoadImage\nExecuting node 71, title: Load Image Mask, class type: LoadImage\nExecuting node 50, title: InvertMask, class type: InvertMask\nExecuting node 48, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 54, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 75, title: Pad Image for Outpainting, class type: ImagePadForOutpaint\nExecuting node 78, title: InvertMask, class type: InvertMask\nExecuting node 77, title: Convert Mask to Image, class type: MaskToImage\nExecuting node 79, title: ImageCompositeMasked, class type: ImageCompositeMasked\nExecuting node 15, title: Apply ControlNet (Advanced), class type: ControlNetApplyAdvanced\nExecuting node 51, title: VAE Encode, class type: VAEEncode\nRequested to load AutoencoderKL\nLoading 1 new model\nRequested to load ControlNet\nLoading 1 new model\nExecuting node 3, title: KSampler, class type: KSampler\n 0%| | 0/20 [00:00<?, ?it/s]\n 5%|▌ | 1/20 [00:00<00:12, 1.57it/s]\n 10%|█ | 2/20 [00:01<00:10, 1.68it/s]\n 15%|█▌ | 3/20 [00:01<00:09, 1.72it/s]\n 20%|██ | 4/20 [00:02<00:09, 1.74it/s]\n 25%|██▌ | 5/20 [00:02<00:08, 1.75it/s]\n 30%|███ | 6/20 [00:03<00:07, 1.75it/s]\n 35%|███▌ | 7/20 [00:04<00:07, 1.76it/s]\n 40%|████ | 8/20 [00:04<00:06, 1.76it/s]\n 45%|████▌ | 9/20 [00:05<00:06, 1.76it/s]\n 50%|█████ | 10/20 [00:05<00:05, 1.76it/s]\n 55%|█████▌ | 11/20 [00:06<00:05, 1.76it/s]\n 60%|██████ | 12/20 [00:06<00:04, 1.76it/s]\n 65%|██████▌ | 13/20 [00:07<00:03, 1.77it/s]\n 70%|███████ | 14/20 [00:07<00:03, 1.77it/s]\n 75%|███████▌ | 15/20 [00:08<00:02, 1.77it/s]\n 80%|████████ | 16/20 [00:09<00:02, 1.58it/s]\n 85%|████████▌ | 17/20 [00:09<00:01, 1.62it/s]\n 90%|█████████ | 18/20 [00:10<00:01, 1.46it/s]\n 95%|█████████▌| 19/20 [00:11<00:00, 1.41it/s]\n100%|██████████| 20/20 [00:12<00:00, 1.50it/s]\n100%|██████████| 20/20 [00:12<00:00, 
1.65it/s]\nExecuting node 8, title: VAE Decode, class type: VAEDecode\nExecuting node 74, title: Save Image, class type: SaveImage\nPrompt executed in 16.87 seconds\noutputs: {'74': {'images': [{'filename': 'edited_00001_.png', 'subfolder': '', 'type': 'output'}]}}\n====================================\nedited_00001_.png", "metrics": { "predict_time": 20.844022075, "total_time": 179.359341 }, "output": [ "https://replicate.delivery/pbxt/1UGdvhFDzDZKNl6pb4Jbzaq98cRvACBesCLlH24TLIIN9imJA/edited_00001_.webp" ], "started_at": "2024-07-28T20:39:34.305319Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/739bc641dsrgj0cgzc7s09k8ar", "cancel": "https://api.replicate.com/v1/predictions/739bc641dsrgj0cgzc7s09k8ar/cancel" }, "version": "3a23360cf7a5675e7f812b477f8c6b0e02266ddc23dbebb3acfa9abd46042f69" }