omniedgeio / virtual-dressing
- Public
- 472 runs
- Hardware: A100 (80GB)

Prediction
omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338
- ID: pnlitubbsijhrk2eloitp5p4si
- Status: Succeeded
- Source: Web
- Hardware: A100 (40GB)

Input
{ "seed": 0, "steps": 30, "model_image": "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg", "garment_image": "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg", "guidance_scale": 3 }
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run omniedgeio/virtual-dressing using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338",
  {
    input: {
      seed: 0,
      steps: 30,
      model_image: "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg",
      garment_image: "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg",
      guidance_scale: 3
    }
  }
);

// To access the file URL:
console.log(output.url());
//=> "http://example.com"

// To write the file to disk:
fs.writeFile("my-image.png", output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate
Import the client:

import replicate
Run omniedgeio/virtual-dressing using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338",
    input={
        "seed": 0,
        "steps": 30,
        "model_image": "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg",
        "garment_image": "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg",
        "guidance_scale": 3
    }
)
print(output)
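This model returns a single image URL (see the prediction output below). As a minimal sketch of saving the result locally — assuming replicate.run() returns the URL as a plain string, as in the example output here (newer client versions may instead return a file object) — you can download it with the standard library:

import urllib.request

# Assumed to be the URL string returned by replicate.run() above;
# shown here using the output of the first prediction on this page.
output_url = "https://replicate.delivery/pbxt/wlxnTmA9zMYDLB3vYKeuLSkFtMMwADiNxWLA9JLf6wEgwjekA/tmphdt92l5m.jpg"

# Download the generated try-on image to disk.
urllib.request.urlretrieve(output_url, "virtual-dressing-result.jpg")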
To learn more, take a look at the guide on getting started with Python.
Run omniedgeio/virtual-dressing using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338",
    "input": {
      "seed": 0,
      "steps": 30,
      "model_image": "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg",
      "garment_image": "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg",
      "guidance_scale": 3
    }
  }' \
  https://api.replicate.com/v1/predictions
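The "Prefer: wait" header asks the API to hold the request open until the prediction finishes. If you would rather create the prediction and poll for the result yourself, here is a rough Python sketch of the same flow against the HTTP API — it assumes the third-party requests package is installed, and the payload simply mirrors the curl example above:

import os
import time

import requests  # assumption: the `requests` package is installed

API_TOKEN = os.environ["REPLICATE_API_TOKEN"]
HEADERS = {"Authorization": f"Bearer {API_TOKEN}", "Content-Type": "application/json"}

# Create the prediction (same payload as the curl example above).
resp = requests.post(
    "https://api.replicate.com/v1/predictions",
    headers=HEADERS,
    json={
        "version": "omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338",
        "input": {
            "seed": 0,
            "steps": 30,
            "model_image": "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg",
            "garment_image": "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg",
            "guidance_scale": 3,
        },
    },
)
resp.raise_for_status()
prediction = resp.json()

# Poll the prediction's "get" URL until it reaches a terminal state.
while prediction["status"] not in ("succeeded", "failed", "canceled"):
    time.sleep(2)
    prediction = requests.get(prediction["urls"]["get"], headers=HEADERS).json()

print(prediction["status"], prediction.get("output"))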
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/omniedgeio/virtual-dressing@sha256:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338 \
  -i 'seed=0' \
  -i 'steps=30' \
  -i 'model_image="https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg"' \
  -i 'garment_image="https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg"' \
  -i 'guidance_scale=3'
To learn more, take a look at the Cog documentation.
Alternatively, start the model as a local HTTP server with Docker, then send it prediction requests:
docker run -d -p 5000:5000 --gpus=all r8.im/omniedgeio/virtual-dressing@sha256:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "seed": 0,
      "steps": 30,
      "model_image": "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg",
      "garment_image": "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg",
      "guidance_scale": 3
    }
  }' \
  http://localhost:5000/predictions
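If you prefer calling the local container from a script rather than curl, here is a minimal Python sketch of the same request. It assumes the requests package is installed; the server responds with JSON whose output field holds the generated image (possibly as a data URI rather than an HTTPS URL), so inspect it before saving.

import requests  # assumption: the `requests` package is installed

# Same request the curl example above sends to the container started with `docker run`.
resp = requests.post(
    "http://localhost:5000/predictions",
    json={
        "input": {
            "seed": 0,
            "steps": 30,
            "model_image": "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg",
            "garment_image": "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg",
            "guidance_scale": 3,
        }
    },
)
resp.raise_for_status()
result = resp.json()

# Print the status and the start of the output value (URL or data URI).
print(result.get("status"), str(result.get("output"))[:80])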
To learn more, take a look at the Cog documentation.
Output
{ "completed_at": "2024-03-09T16:11:12.545285Z", "created_at": "2024-03-09T16:06:42.585097Z", "data_removed": false, "error": null, "id": "pnlitubbsijhrk2eloitp5p4si", "input": { "seed": 0, "steps": 30, "model_image": "https://replicate.delivery/pbxt/KXZRucHM1bbj49bWhUsZRw9GD4DYpgbHuuWyknmIybMMrHWG/ltfp4dlj.jpeg", "garment_image": "https://replicate.delivery/pbxt/KXZRuw8l44Q7wIML5dhP5o47l2BEpENFQ1VpDejfUlk3nWCh/IMG_1210.jpg", "guidance_scale": 3 }, "logs": "Model parse in 0.67 seconds.\n0: 640x480 1 person, 153.6ms\nSpeed: 8.1ms preprocess, 153.6ms inference, 508.6ms postprocess per image at shape (1, 3, 640, 480)\nOpen pose in 1.25 seconds.\nInitial seed: 0\n 0%| | 0/30 [00:00<?, ?it/s]\n 3%|▎ | 1/30 [00:00<00:05, 5.79it/s]\n 7%|▋ | 2/30 [00:00<00:04, 6.32it/s]\n 10%|█ | 3/30 [00:00<00:04, 6.00it/s]\n 13%|█▎ | 4/30 [00:00<00:04, 6.30it/s]\n 17%|█▋ | 5/30 [00:00<00:03, 6.48it/s]\n 20%|██ | 6/30 [00:00<00:03, 6.59it/s]\n 23%|██▎ | 7/30 [00:01<00:03, 6.67it/s]\n 27%|██▋ | 8/30 [00:01<00:03, 6.71it/s]\n 30%|███ | 9/30 [00:01<00:03, 6.76it/s]\n 33%|███▎ | 10/30 [00:01<00:02, 6.77it/s]\n 37%|███▋ | 11/30 [00:01<00:02, 6.77it/s]\n 40%|████ | 12/30 [00:01<00:02, 6.79it/s]\n 43%|████▎ | 13/30 [00:01<00:02, 6.80it/s]\n 47%|████▋ | 14/30 [00:02<00:02, 6.79it/s]\n 50%|█████ | 15/30 [00:02<00:02, 6.79it/s]\n 53%|█████▎ | 16/30 [00:02<00:02, 6.79it/s]\n 57%|█████▋ | 17/30 [00:02<00:01, 6.79it/s]\n 60%|██████ | 18/30 [00:02<00:01, 6.79it/s]\n 63%|██████▎ | 19/30 [00:02<00:01, 6.81it/s]\n 67%|██████▋ | 20/30 [00:02<00:01, 6.81it/s]\n 70%|███████ | 21/30 [00:03<00:01, 6.82it/s]\n 73%|███████▎ | 22/30 [00:03<00:01, 6.85it/s]\n 77%|███████▋ | 23/30 [00:03<00:01, 6.83it/s]\n 80%|████████ | 24/30 [00:03<00:00, 6.83it/s]\n 83%|████████▎ | 25/30 [00:03<00:00, 6.83it/s]\n 87%|████████▋ | 26/30 [00:03<00:00, 6.83it/s]\n 90%|█████████ | 27/30 [00:04<00:00, 6.83it/s]\n 93%|█████████▎| 28/30 [00:04<00:00, 6.83it/s]\n 97%|█████████▋| 29/30 [00:04<00:00, 6.83it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.82it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.73it/s]", "metrics": { "predict_time": 8.157579, "total_time": 269.960188 }, "output": "https://replicate.delivery/pbxt/wlxnTmA9zMYDLB3vYKeuLSkFtMMwADiNxWLA9JLf6wEgwjekA/tmphdt92l5m.jpg", "started_at": "2024-03-09T16:11:04.387706Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/pnlitubbsijhrk2eloitp5p4si", "cancel": "https://api.replicate.com/v1/predictions/pnlitubbsijhrk2eloitp5p4si/cancel" }, "version": "d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338" }
Prediction
omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338
- ID: jsd3t4bbhf65nvrtjsbatnlv6m
- Status: Succeeded
- Source: API
- Hardware: A100 (40GB)

Input
{ "seed": 0, "steps": 30, "model_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/93d6cb35-7dcf-442f-9edc-191354e3ee78/ltx8m5u5.jpeg", "garment_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/e0857e72-e2c7-47e2-b6de-f0e6372e7abf/ltx8maza.jpeg", "guidance_scale": 3 }
The Node.js, Python, HTTP API, and Cog instructions are the same as for the first prediction above; only the model_image and garment_image inputs differ.
Output
{ "completed_at": "2024-03-18T17:50:15.708267Z", "created_at": "2024-03-18T17:46:40.114938Z", "data_removed": false, "error": null, "id": "jsd3t4bbhf65nvrtjsbatnlv6m", "input": { "seed": 0, "steps": 30, "model_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/93d6cb35-7dcf-442f-9edc-191354e3ee78/ltx8m5u5.jpeg", "garment_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/e0857e72-e2c7-47e2-b6de-f0e6372e7abf/ltx8maza.jpeg", "guidance_scale": 3 }, "logs": "Model parse in 0.29 seconds.\n0: 640x480 8 persons, 11.8ms\nSpeed: 2.2ms preprocess, 11.8ms inference, 2.4ms postprocess per image at shape (1, 3, 640, 480)\nOpen pose in 0.02 seconds.\nInitial seed: 0\n 0%| | 0/30 [00:00<?, ?it/s]\n 3%|▎ | 1/30 [00:00<00:04, 6.43it/s]\n 7%|▋ | 2/30 [00:00<00:04, 6.71it/s]\n 10%|█ | 3/30 [00:00<00:03, 6.77it/s]\n 13%|█▎ | 4/30 [00:00<00:03, 6.81it/s]\n 17%|█▋ | 5/30 [00:00<00:03, 6.82it/s]\n 20%|██ | 6/30 [00:00<00:03, 6.83it/s]\n 23%|██▎ | 7/30 [00:01<00:03, 6.84it/s]\n 27%|██▋ | 8/30 [00:01<00:03, 6.85it/s]\n 30%|███ | 9/30 [00:01<00:03, 6.85it/s]\n 33%|███▎ | 10/30 [00:01<00:02, 6.85it/s]\n 37%|███▋ | 11/30 [00:01<00:02, 6.85it/s]\n 40%|████ | 12/30 [00:01<00:02, 6.85it/s]\n 43%|████▎ | 13/30 [00:01<00:02, 6.85it/s]\n 47%|████▋ | 14/30 [00:02<00:02, 6.85it/s]\n 50%|█████ | 15/30 [00:02<00:02, 6.85it/s]\n 53%|█████▎ | 16/30 [00:02<00:02, 6.85it/s]\n 57%|█████▋ | 17/30 [00:02<00:01, 6.85it/s]\n 60%|██████ | 18/30 [00:02<00:01, 6.86it/s]\n 63%|██████▎ | 19/30 [00:02<00:01, 6.85it/s]\n 67%|██████▋ | 20/30 [00:02<00:01, 6.83it/s]\n 70%|███████ | 21/30 [00:03<00:01, 6.82it/s]\n 73%|███████▎ | 22/30 [00:03<00:01, 6.83it/s]\n 77%|███████▋ | 23/30 [00:03<00:01, 6.83it/s]\n 80%|████████ | 24/30 [00:03<00:00, 6.83it/s]\n 83%|████████▎ | 25/30 [00:03<00:00, 6.84it/s]\n 87%|████████▋ | 26/30 [00:03<00:00, 6.84it/s]\n 90%|█████████ | 27/30 [00:03<00:00, 6.84it/s]\n 93%|█████████▎| 28/30 [00:04<00:00, 6.84it/s]\n 97%|█████████▋| 29/30 [00:04<00:00, 6.84it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.84it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.83it/s]", "metrics": { "predict_time": 9.290892, "total_time": 215.593329 }, "output": "https://replicate.delivery/pbxt/Bv33JBfb3STcW6d0qA5j6kfWd4frQ2aZATKqCcvCiStuGGDlA/tmp1xpdx4m6.jpg", "started_at": "2024-03-18T17:50:06.417375Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/jsd3t4bbhf65nvrtjsbatnlv6m", "cancel": "https://api.replicate.com/v1/predictions/jsd3t4bbhf65nvrtjsbatnlv6m/cancel" }, "version": "d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338" }
Prediction
omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338
- ID: oaxvx6jbrmxomjundnjfv6pium
- Status: Succeeded
- Source: API
- Hardware: A100 (40GB)

Input
{ "seed": 0, "steps": 30, "model_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/26320697-e468-4aae-866c-9e5606711056/ltx7mxlf.jpeg", "garment_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/3ce50a62-343c-4a5e-bd6a-9a254e8d7d6b/ltx7me31.jpeg", "guidance_scale": 3 }
The Node.js, Python, HTTP API, and Cog instructions are the same as for the first prediction above; only the model_image and garment_image inputs differ.
Output
{ "completed_at": "2024-03-18T17:24:41.739687Z", "created_at": "2024-03-18T17:20:11.427846Z", "data_removed": false, "error": null, "id": "oaxvx6jbrmxomjundnjfv6pium", "input": { "seed": 0, "steps": 30, "model_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/26320697-e468-4aae-866c-9e5606711056/ltx7mxlf.jpeg", "garment_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/3ce50a62-343c-4a5e-bd6a-9a254e8d7d6b/ltx7me31.jpeg", "guidance_scale": 3 }, "logs": "Model parse in 0.68 seconds.\n0: 640x480 5 persons, 190.9ms\nSpeed: 8.7ms preprocess, 190.9ms inference, 574.1ms postprocess per image at shape (1, 3, 640, 480)\nOpen pose in 1.49 seconds.\nInitial seed: 0\n 0%| | 0/30 [00:00<?, ?it/s]\n 3%|▎ | 1/30 [00:00<00:05, 4.86it/s]\n 7%|▋ | 2/30 [00:00<00:05, 5.35it/s]\n 10%|█ | 3/30 [00:00<00:05, 5.39it/s]\n 13%|█▎ | 4/30 [00:00<00:04, 5.80it/s]\n 17%|█▋ | 5/30 [00:00<00:04, 6.10it/s]\n 20%|██ | 6/30 [00:01<00:03, 6.30it/s]\n 23%|██▎ | 7/30 [00:01<00:03, 6.35it/s]\n 27%|██▋ | 8/30 [00:01<00:03, 6.46it/s]\n 30%|███ | 9/30 [00:01<00:03, 6.51it/s]\n 33%|███▎ | 10/30 [00:01<00:03, 6.52it/s]\n 37%|███▋ | 11/30 [00:01<00:02, 6.60it/s]\n 40%|████ | 12/30 [00:01<00:02, 6.66it/s]\n 43%|████▎ | 13/30 [00:02<00:02, 6.61it/s]\n 47%|████▋ | 14/30 [00:02<00:02, 6.63it/s]\n 50%|█████ | 15/30 [00:02<00:02, 6.68it/s]\n 53%|█████▎ | 16/30 [00:02<00:02, 6.68it/s]\n 57%|█████▋ | 17/30 [00:02<00:01, 6.70it/s]\n 60%|██████ | 18/30 [00:02<00:01, 6.73it/s]\n 63%|██████▎ | 19/30 [00:02<00:01, 6.76it/s]\n 67%|██████▋ | 20/30 [00:03<00:01, 6.72it/s]\n 70%|███████ | 21/30 [00:03<00:01, 6.74it/s]\n 73%|███████▎ | 22/30 [00:03<00:01, 6.76it/s]\n 77%|███████▋ | 23/30 [00:03<00:01, 6.71it/s]\n 80%|████████ | 24/30 [00:03<00:00, 6.73it/s]\n 83%|████████▎ | 25/30 [00:03<00:00, 6.67it/s]\n 87%|████████▋ | 26/30 [00:04<00:00, 6.63it/s]\n 90%|█████████ | 27/30 [00:04<00:00, 6.51it/s]\n 93%|█████████▎| 28/30 [00:04<00:00, 6.56it/s]\n 97%|█████████▋| 29/30 [00:04<00:00, 6.61it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.67it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.50it/s]", "metrics": { "predict_time": 10.566532, "total_time": 270.311841 }, "output": "https://replicate.delivery/pbxt/DOARzIY3jj6QG5X38YhZBpJQ7AN0YjO9UxRbewfdjQqZrihSA/tmp71a4h3ie.jpg", "started_at": "2024-03-18T17:24:31.173155Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/oaxvx6jbrmxomjundnjfv6pium", "cancel": "https://api.replicate.com/v1/predictions/oaxvx6jbrmxomjundnjfv6pium/cancel" }, "version": "d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338" }
Prediction
omniedgeio/virtual-dressing:d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338
- ID: 6ftkw7bbu4ezeb3j3mz55omwdu
- Status: Succeeded
- Source: API
- Hardware: A100 (40GB)

Input
{ "seed": 0, "steps": 30, "model_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/90be166d-58bf-4a25-b208-5bb11f6bb60c/ltx6xa1m.jpeg", "garment_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/31e5ae8a-0e35-4a65-acf4-04920f454552/ltx6xbqp.jpeg", "guidance_scale": 3 }
The Node.js, Python, HTTP API, and Cog instructions are the same as for the first prediction above; only the model_image and garment_image inputs differ.
Output
{ "completed_at": "2024-03-18T17:02:54.753625Z", "created_at": "2024-03-18T16:59:37.715689Z", "data_removed": false, "error": null, "id": "6ftkw7bbu4ezeb3j3mz55omwdu", "input": { "seed": 0, "steps": 30, "model_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/90be166d-58bf-4a25-b208-5bb11f6bb60c/ltx6xa1m.jpeg", "garment_image": "https://aihublol.s3.ap-northeast-1.amazonaws.com/next-s3-uploads/31e5ae8a-0e35-4a65-acf4-04920f454552/ltx6xbqp.jpeg", "guidance_scale": 3 }, "logs": "Model parse in 0.60 seconds.\n0: 640x480 1 person, 139.8ms\nSpeed: 6.9ms preprocess, 139.8ms inference, 404.0ms postprocess per image at shape (1, 3, 640, 480)\nOpen pose in 1.06 seconds.\nInitial seed: 0\n 0%| | 0/30 [00:00<?, ?it/s]\n 3%|▎ | 1/30 [00:00<00:05, 5.57it/s]\n 7%|▋ | 2/30 [00:00<00:04, 6.23it/s]\n 10%|█ | 3/30 [00:00<00:04, 6.00it/s]\n 13%|█▎ | 4/30 [00:00<00:04, 6.31it/s]\n 17%|█▋ | 5/30 [00:00<00:03, 6.51it/s]\n 20%|██ | 6/30 [00:00<00:03, 6.64it/s]\n 23%|██▎ | 7/30 [00:01<00:03, 6.72it/s]\n 27%|██▋ | 8/30 [00:01<00:03, 6.65it/s]\n 30%|███ | 9/30 [00:01<00:03, 6.72it/s]\n 33%|███▎ | 10/30 [00:01<00:02, 6.77it/s]\n 37%|███▋ | 11/30 [00:01<00:02, 6.81it/s]\n 40%|████ | 12/30 [00:01<00:02, 6.83it/s]\n 43%|████▎ | 13/30 [00:01<00:02, 6.84it/s]\n 47%|████▋ | 14/30 [00:02<00:02, 6.84it/s]\n 50%|█████ | 15/30 [00:02<00:02, 6.65it/s]\n 53%|█████▎ | 16/30 [00:02<00:02, 6.72it/s]\n 57%|█████▋ | 17/30 [00:02<00:01, 6.77it/s]\n 60%|██████ | 18/30 [00:02<00:01, 6.81it/s]\n 63%|██████▎ | 19/30 [00:02<00:01, 6.83it/s]\n 67%|██████▋ | 20/30 [00:02<00:01, 6.85it/s]\n 70%|███████ | 21/30 [00:03<00:01, 6.86it/s]\n 73%|███████▎ | 22/30 [00:03<00:01, 6.86it/s]\n 77%|███████▋ | 23/30 [00:03<00:01, 6.85it/s]\n 80%|████████ | 24/30 [00:03<00:00, 6.86it/s]\n 83%|████████▎ | 25/30 [00:03<00:00, 6.87it/s]\n 87%|████████▋ | 26/30 [00:03<00:00, 6.88it/s]\n 90%|█████████ | 27/30 [00:04<00:00, 6.88it/s]\n 93%|█████████▎| 28/30 [00:04<00:00, 6.88it/s]\n 97%|█████████▋| 29/30 [00:04<00:00, 6.85it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.87it/s]\n100%|██████████| 30/30 [00:04<00:00, 6.74it/s]", "metrics": { "predict_time": 9.543379, "total_time": 197.037936 }, "output": "https://replicate.delivery/pbxt/eRTnzb2lKTXWDiCKsWs3kvFLfFCNxmOfHFs2E3buqUG9tEDlA/tmp6yy2c4uu.jpg", "started_at": "2024-03-18T17:02:45.210246Z", "status": "succeeded", "urls": { "get": "https://api.replicate.com/v1/predictions/6ftkw7bbu4ezeb3j3mz55omwdu", "cancel": "https://api.replicate.com/v1/predictions/6ftkw7bbu4ezeb3j3mz55omwdu/cancel" }, "version": "d6835398c2147f0a09a99a9f6cb59f7f87c9f72c67078f3858fc51e7eefde338" }
Want to make some of these yourself? Run this model.