
An img2img pipeline that generates an anime-style image of a person. It uses a Stable Diffusion 1.5 model as the base, a depth-estimation ControlNet for structural guidance, and an IP-Adapter model for face consistency.
Prediction
kitaef/mytestmodel:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191
ID: 9wxj8c2adsrgg0cf8zt9c1v99c
Status: Succeeded
Source: Web
Hardware: A40
Input
{ "input_image": "https://cdn.fishki.net/upload/post/2019/08/01/3047343/tn/d333f85d3bcfccf832e70b5e7c155077-01.jpg" }
Install Replicate’s Node.js client library:

npm install replicate
Import and set up the client:

import Replicate from "replicate";
import fs from "node:fs";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run kitaef/mytestmodel using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "kitaef/mytestmodel:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191",
  {
    input: {
      input_image: "https://cdn.fishki.net/upload/post/2019/08/01/3047343/tn/d333f85d3bcfccf832e70b5e7c155077-01.jpg"
    }
  }
);

// To access the file URL:
console.log(output.url());
//=> "http://example.com"

// To write the file to disk (the promise-based API avoids the
// mandatory callback that callback-style fs.writeFile requires):
await fs.promises.writeFile("my-image.png", output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:pip install replicate
Import the client:import replicate
Run kitaef/mytestmodel using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "kitaef/mytestmodel:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191",
    input={
        "input_image": "https://cdn.fishki.net/upload/post/2019/08/01/3047343/tn/d333f85d3bcfccf832e70b5e7c155077-01.jpg"
    }
)

# To access the file URL:
print(output.url())
#=> "http://example.com"

# To write the file to disk:
with open("my-image.png", "wb") as file:
    file.write(output.read())
To learn more, take a look at the guide on getting started with Python.
Run kitaef/mytestmodel using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "kitaef/mytestmodel:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191",
    "input": {
      "input_image": "https://cdn.fishki.net/upload/post/2019/08/01/3047343/tn/d333f85d3bcfccf832e70b5e7c155077-01.jpg"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
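The JSON returned by the predictions endpoint includes a status field and a urls.get URL. If the prediction does not finish within the Prefer: wait window, you poll that URL until the status is terminal. Here is a minimal sketch of that polling loop; the HTTP call is injected as a plain function so the loop works with any client library, and the function and parameter names are illustrative, not part of Replicate's API:

```python
import time

# Terminal prediction states per Replicate's API.
TERMINAL = {"succeeded", "failed", "canceled"}

def wait_for_prediction(fetch, poll_interval=1.0, timeout=300.0):
    """Call `fetch()` (which returns the prediction as a dict)
    until the prediction reaches a terminal state or we time out."""
    deadline = time.monotonic() + timeout
    while True:
        prediction = fetch()
        if prediction.get("status") in TERMINAL:
            return prediction
        if time.monotonic() >= deadline:
            raise TimeoutError("prediction did not reach a terminal state")
        time.sleep(poll_interval)
```

In practice, fetch would perform a GET on the prediction's urls.get endpoint with the same Authorization header used to create it.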
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/kitaef/mytestmodel@sha256:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191 \
  -i 'input_image="https://cdn.fishki.net/upload/post/2019/08/01/3047343/tn/d333f85d3bcfccf832e70b5e7c155077-01.jpg"'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/kitaef/mytestmodel@sha256:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "input_image": "https://cdn.fishki.net/upload/post/2019/08/01/3047343/tn/d333f85d3bcfccf832e70b5e7c155077-01.jpg"
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
{
  "completed_at": "2024-05-05T08:54:40.024484Z",
  "created_at": "2024-05-05T08:52:23.278000Z",
  "data_removed": false,
  "error": null,
  "id": "9wxj8c2adsrgg0cf8zt9c1v99c",
  "input": {
    "input_image": "https://cdn.fishki.net/upload/post/2019/08/01/3047343/tn/d333f85d3bcfccf832e70b5e7c155077-01.jpg"
  },
  "logs": "0%| | 0/20 [00:00<?, ?it/s]/root/.pyenv/versions/3.11.9/lib/python3.11/site-packages/torch/nn/modules/conv.py:456: UserWarning: Plan failed with a cudnnException: CUDNN_BACKEND_EXECUTION_PLAN_DESCRIPTOR: cudnnFinalize Descriptor Failed cudnn_status: CUDNN_STATUS_NOT_SUPPORTED (Triggered internally at ../aten/src/ATen/native/cudnn/Conv_v8.cpp:919.)\nreturn F.conv2d(input, weight, bias, self.stride,\n 5%|▌ | 1/20 [00:02<00:41, 2.19s/it]\n 10%|█ | 2/20 [00:02<00:20, 1.16s/it]\n 15%|█▌ | 3/20 [00:03<00:14, 1.20it/s]\n 20%|██ | 4/20 [00:03<00:10, 1.48it/s]\n 25%|██▌ | 5/20 [00:03<00:08, 1.70it/s]\n 30%|███ | 6/20 [00:04<00:07, 1.86it/s]\n 35%|███▌ | 7/20 [00:04<00:06, 1.98it/s]\n 40%|████ | 8/20 [00:05<00:05, 2.07it/s]\n 45%|████▌ | 9/20 [00:05<00:05, 2.14it/s]\n 50%|█████ | 10/20 [00:06<00:04, 2.18it/s]\n 55%|█████▌ | 11/20 [00:06<00:04, 2.21it/s]\n 60%|██████ | 12/20 [00:06<00:03, 2.24it/s]\n 65%|██████▌ | 13/20 [00:07<00:03, 2.25it/s]\n 70%|███████ | 14/20 [00:07<00:02, 2.26it/s]\n 75%|███████▌ | 15/20 [00:08<00:02, 2.27it/s]\n 80%|████████ | 16/20 [00:08<00:01, 2.28it/s]\n 85%|████████▌ | 17/20 [00:09<00:01, 2.28it/s]\n 90%|█████████ | 18/20 [00:09<00:00, 2.28it/s]\n 95%|█████████▌| 19/20 [00:10<00:00, 2.29it/s]\n100%|██████████| 20/20 [00:10<00:00, 2.29it/s]\n100%|██████████| 20/20 [00:10<00:00, 1.91it/s]",
  "metrics": {
    "predict_time": 18.179087,
    "total_time": 136.746484
  },
  "output": "https://replicate.delivery/pbxt/8oFfucYpBYQWOip5ngf6xzrY9DnC5GcIY8V1G2gMQgIPtPxSA/output.jpg",
  "started_at": "2024-05-05T08:54:21.845397Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/9wxj8c2adsrgg0cf8zt9c1v99c",
    "cancel": "https://api.replicate.com/v1/predictions/9wxj8c2adsrgg0cf8zt9c1v99c/cancel"
  },
  "version": "a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191"
}
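Note that metrics.predict_time (about 18 s) is much smaller than metrics.total_time (about 137 s); the gap is time spent queued and booting before started_at. Both figures can be recovered from the record's timestamps, as in this small sketch (timestamp values copied from the prediction record above):

```python
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    # datetime.fromisoformat() before Python 3.11 rejects a trailing "Z",
    # so normalize it to an explicit UTC offset first.
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

# Timestamps from the prediction record shown above.
record = {
    "created_at": "2024-05-05T08:52:23.278000Z",
    "started_at": "2024-05-05T08:54:21.845397Z",
    "completed_at": "2024-05-05T08:54:40.024484Z",
}

# Time spent waiting for a worker, and time actually predicting.
queue_seconds = (parse_ts(record["started_at"]) - parse_ts(record["created_at"])).total_seconds()
run_seconds = (parse_ts(record["completed_at"]) - parse_ts(record["started_at"])).total_seconds()
# run_seconds matches metrics.predict_time (18.179087 s)
```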
Prediction
kitaef/mytestmodel:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191
ID: 7pxsqpes01rgp0cf8zwr57n8tr
Status: Succeeded
Source: Web
Hardware: A40
Input
{ "input_image": "https://www.meme-arsenal.com/memes/b34efd7b163e816f620c2f8da7029d7b.jpg" }
Output
{
  "completed_at": "2024-05-05T08:59:02.463892Z",
  "created_at": "2024-05-05T08:58:27.456000Z",
  "data_removed": false,
  "error": null,
  "id": "7pxsqpes01rgp0cf8zwr57n8tr",
  "input": {
    "input_image": "https://www.meme-arsenal.com/memes/b34efd7b163e816f620c2f8da7029d7b.jpg"
  },
  "logs": "0%| | 0/20 [00:00<?, ?it/s]/root/.pyenv/versions/3.11.9/lib/python3.11/site-packages/torch/nn/modules/conv.py:456: UserWarning: Plan failed with a cudnnException: CUDNN_BACKEND_EXECUTION_PLAN_DESCRIPTOR: cudnnFinalize Descriptor Failed cudnn_status: CUDNN_STATUS_NOT_SUPPORTED (Triggered internally at ../aten/src/ATen/native/cudnn/Conv_v8.cpp:919.)\nreturn F.conv2d(input, weight, bias, self.stride,\n 5%|▌ | 1/20 [00:02<00:45, 2.41s/it]\n 10%|█ | 2/20 [00:02<00:21, 1.17s/it]\n 15%|█▌ | 3/20 [00:03<00:13, 1.28it/s]\n 20%|██ | 4/20 [00:03<00:09, 1.69it/s]\n 25%|██▌ | 5/20 [00:03<00:07, 2.05it/s]\n 30%|███ | 6/20 [00:03<00:05, 2.35it/s]\n 35%|███▌ | 7/20 [00:04<00:05, 2.60it/s]\n 40%|████ | 8/20 [00:04<00:04, 2.79it/s]\n 45%|████▌ | 9/20 [00:04<00:03, 2.93it/s]\n 50%|█████ | 10/20 [00:05<00:03, 3.04it/s]\n 55%|█████▌ | 11/20 [00:05<00:02, 3.11it/s]\n 60%|██████ | 12/20 [00:05<00:02, 3.16it/s]\n 65%|██████▌ | 13/20 [00:06<00:02, 3.20it/s]\n 70%|███████ | 14/20 [00:06<00:01, 3.23it/s]\n 75%|███████▌ | 15/20 [00:06<00:01, 3.25it/s]\n 80%|████████ | 16/20 [00:06<00:01, 3.26it/s]\n 85%|████████▌ | 17/20 [00:07<00:00, 3.27it/s]\n 90%|█████████ | 18/20 [00:07<00:00, 3.28it/s]\n 95%|█████████▌| 19/20 [00:07<00:00, 3.28it/s]\n100%|██████████| 20/20 [00:08<00:00, 3.28it/s]\n100%|██████████| 20/20 [00:08<00:00, 2.44it/s]",
  "metrics": {
    "predict_time": 14.741133,
    "total_time": 35.007892
  },
  "output": "https://replicate.delivery/pbxt/DdyDdWFmlEafJSlP6nx0TCYaghXqwjYPXytqGw3NzfqVxPxSA/output.jpg",
  "started_at": "2024-05-05T08:58:47.722759Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/7pxsqpes01rgp0cf8zwr57n8tr",
    "cancel": "https://api.replicate.com/v1/predictions/7pxsqpes01rgp0cf8zwr57n8tr/cancel"
  },
  "version": "a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191"
}
Prediction
kitaef/mytestmodel:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191
ID: sx2xda9qwxrgm0cf8zyt44zwc0
Status: Succeeded
Source: Web
Hardware: A40
Input
{ "input_image": "https://replicate.delivery/pbxt/KrhNPcgSMXCvtGcTt5YuLXvgD5IeRPRFTIJ86f2VdZp8CMMv/image.png" }
Output
{
  "completed_at": "2024-05-05T09:02:16.802975Z",
  "created_at": "2024-05-05T09:02:08.359000Z",
  "data_removed": false,
  "error": null,
  "id": "sx2xda9qwxrgm0cf8zyt44zwc0",
  "input": {
    "input_image": "https://replicate.delivery/pbxt/KrhNPcgSMXCvtGcTt5YuLXvgD5IeRPRFTIJ86f2VdZp8CMMv/image.png"
  },
  "logs": "0%| | 0/20 [00:00<?, ?it/s]\n 5%|▌ | 1/20 [00:01<00:26, 1.40s/it]\n 15%|█▌ | 3/20 [00:01<00:07, 2.39it/s]\n 25%|██▌ | 5/20 [00:01<00:03, 4.14it/s]\n 35%|███▌ | 7/20 [00:01<00:02, 5.87it/s]\n 45%|████▌ | 9/20 [00:01<00:01, 7.45it/s]\n 55%|█████▌ | 11/20 [00:02<00:01, 8.90it/s]\n 65%|██████▌ | 13/20 [00:02<00:00, 10.09it/s]\n 75%|███████▌ | 15/20 [00:02<00:00, 11.05it/s]\n 85%|████████▌ | 17/20 [00:02<00:00, 11.82it/s]\n 95%|█████████▌| 19/20 [00:02<00:00, 12.39it/s]\n100%|██████████| 20/20 [00:02<00:00, 7.17it/s]",
  "metrics": {
    "predict_time": 8.406478,
    "total_time": 8.443975
  },
  "output": "https://replicate.delivery/pbxt/jeNh3e2pQPhYtkCJeSUFGSoCHnUzdSxguTYRmrkEE8TwofELB/output.jpg",
  "started_at": "2024-05-05T09:02:08.396497Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/sx2xda9qwxrgm0cf8zyt44zwc0",
    "cancel": "https://api.replicate.com/v1/predictions/sx2xda9qwxrgm0cf8zyt44zwc0/cancel"
  },
  "version": "a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191"
}
Prediction
kitaef/mytestmodel:a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191
ID: q81gs02n21rgj0cf905trq4hw8
Status: Succeeded
Source: Web
Hardware: A40
Input
{ "input_image": "https://glavcom.ua/img/article/5431/70_main.jpg" }
Output
{
  "completed_at": "2024-05-05T09:18:06.613822Z",
  "created_at": "2024-05-05T09:17:33.328000Z",
  "data_removed": false,
  "error": null,
  "id": "q81gs02n21rgj0cf905trq4hw8",
  "input": {
    "input_image": "https://glavcom.ua/img/article/5431/70_main.jpg"
  },
  "logs": "0%| | 0/20 [00:00<?, ?it/s]/root/.pyenv/versions/3.11.9/lib/python3.11/site-packages/torch/nn/modules/conv.py:456: UserWarning: Plan failed with a cudnnException: CUDNN_BACKEND_EXECUTION_PLAN_DESCRIPTOR: cudnnFinalize Descriptor Failed cudnn_status: CUDNN_STATUS_NOT_SUPPORTED (Triggered internally at ../aten/src/ATen/native/cudnn/Conv_v8.cpp:919.)\nreturn F.conv2d(input, weight, bias, self.stride,\n 5%|▌ | 1/20 [00:02<00:42, 2.26s/it]\n 10%|█ | 2/20 [00:02<00:19, 1.08s/it]\n 15%|█▌ | 3/20 [00:02<00:12, 1.40it/s]\n 20%|██ | 4/20 [00:03<00:08, 1.88it/s]\n 25%|██▌ | 5/20 [00:03<00:06, 2.30it/s]\n 30%|███ | 6/20 [00:03<00:05, 2.67it/s]\n 35%|███▌ | 7/20 [00:03<00:04, 2.97it/s]\n 40%|████ | 8/20 [00:04<00:03, 3.22it/s]\n 45%|████▌ | 9/20 [00:04<00:03, 3.40it/s]\n 50%|█████ | 10/20 [00:04<00:02, 3.54it/s]\n 55%|█████▌ | 11/20 [00:04<00:02, 3.64it/s]\n 60%|██████ | 12/20 [00:05<00:02, 3.71it/s]\n 65%|██████▌ | 13/20 [00:05<00:01, 3.75it/s]\n 70%|███████ | 14/20 [00:05<00:01, 3.79it/s]\n 75%|███████▌ | 15/20 [00:05<00:01, 3.82it/s]\n 80%|████████ | 16/20 [00:06<00:01, 3.84it/s]\n 85%|████████▌ | 17/20 [00:06<00:00, 3.86it/s]\n 90%|█████████ | 18/20 [00:06<00:00, 3.87it/s]\n 95%|█████████▌| 19/20 [00:06<00:00, 3.87it/s]\n100%|██████████| 20/20 [00:07<00:00, 3.87it/s]\n100%|██████████| 20/20 [00:07<00:00, 2.79it/s]",
  "metrics": {
    "predict_time": 14.339265,
    "total_time": 33.285822
  },
  "output": "https://replicate.delivery/pbxt/2uYjnUBfzvTIZq622fJkjQembhq6931fVrykda7xpN63MAFLB/output.jpg",
  "started_at": "2024-05-05T09:17:52.274557Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/q81gs02n21rgj0cf905trq4hw8",
    "cancel": "https://api.replicate.com/v1/predictions/q81gs02n21rgj0cf905trq4hw8/cancel"
  },
  "version": "a0ebe82aad0744fbc8f6964143760ed306af3864daa4b62a776b656636c1f191"
}