lucataco / qwen-vl-chat
A multimodal LLM-based AI assistant trained with alignment techniques. Qwen-VL-Chat supports more flexible interaction, such as multi-round question answering, and has creative capabilities.
Prediction
lucataco/qwen-vl-chat:50881b15
ID: wjidtf3bmjkk2ont6bbr2jtlr4 · Status: Succeeded · Source: Web · Hardware: A40 (Large)
Input
{
  "image": "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
  "prompt": "What is the name of the movie in the poster?"
}
Install Replicate’s Node.js client library:

npm install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
  {
    input: {
      image: "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
      prompt: "What is the name of the movie in the poster?"
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import the client:

import replicate
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    input={
        "image": "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
        "prompt": "What is the name of the movie in the poster?"
    }
)
print(output)
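If you'd rather not block in replicate.run while the model works, the Python client also has a lower-level predictions interface you can poll yourself. A minimal sketch of that pattern, assuming the client's predictions.create/reload methods and the same version hash as above:

import time

import replicate

# Create the prediction without waiting for it to finish.
prediction = replicate.predictions.create(
    version="50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    input={
        "image": "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
        "prompt": "What is the name of the movie in the poster?",
    },
)

# Poll until the prediction reaches a terminal state.
while prediction.status not in ("succeeded", "failed", "canceled"):
    time.sleep(1)
    prediction.reload()

if prediction.status == "succeeded":
    print(prediction.output)  # the answer as a plain string
else:
    print("Prediction did not succeed:", prediction.error)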
To learn more, take a look at the guide on getting started with Python.
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    "input": {
      "image": "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
      "prompt": "What is the name of the movie in the poster?"
    }
  }' \
  https://api.replicate.com/v1/predictions
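If the prediction doesn't finish within the "Prefer: wait" window, the response comes back while the prediction is still processing, and you can poll its get URL until it reaches a terminal state. A minimal sketch in Python with requests; the prediction URL here is the one from the "urls.get" field of the example record below:

import os
import time

import requests

headers = {"Authorization": f"Bearer {os.environ['REPLICATE_API_TOKEN']}"}
get_url = "https://api.replicate.com/v1/predictions/wjidtf3bmjkk2ont6bbr2jtlr4"  # from the record's "urls.get" field

# Poll the prediction until it succeeds, fails, or is canceled.
while True:
    prediction = requests.get(get_url, headers=headers).json()
    if prediction["status"] in ("succeeded", "failed", "canceled"):
        break
    time.sleep(1)

print(prediction.get("output"))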
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9 \
  -i 'image="https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg"' \
  -i 'prompt="What is the name of the movie in the poster?"'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "image": "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
      "prompt": "What is the name of the movie in the poster?"
    }
  }' \
  http://localhost:5000/predictions
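If you'd rather call the local Cog server from Python than from curl, the same request can be made with the requests library. A minimal sketch, assuming the container above is already running on localhost:5000 and that the response JSON carries the prediction's output field, as in the record shown below:

import requests

# Same payload as the curl example above, sent to the local Cog server.
resp = requests.post(
    "http://localhost:5000/predictions",
    json={
        "input": {
            "image": "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
            "prompt": "What is the name of the movie in the poster?",
        }
    },
)
resp.raise_for_status()

print(resp.json().get("output"))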
To learn more, take a look at the Cog documentation.
Output
The name of the movie in the poster is "Rebecca."

{
  "completed_at": "2023-09-03T23:31:35.912904Z",
  "created_at": "2023-09-03T23:27:25.324801Z",
  "data_removed": false,
  "error": null,
  "id": "wjidtf3bmjkk2ont6bbr2jtlr4",
  "input": {
    "image": "https://replicate.delivery/pbxt/JSwoXkHNfqkjHpPylaKkIK444Tln22PDu0hoywWewUUVqrGg/Rebecca_%281939_poster%29_Small.jpeg",
    "prompt": "What is the name of the movie in the poster?"
  },
  "logs": null,
  "metrics": {
    "predict_time": 2.744332,
    "total_time": 250.588103
  },
  "output": "The name of the movie in the poster is \"Rebecca.\"",
  "started_at": "2023-09-03T23:31:33.168572Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/wjidtf3bmjkk2ont6bbr2jtlr4",
    "cancel": "https://api.replicate.com/v1/predictions/wjidtf3bmjkk2ont6bbr2jtlr4/cancel"
  },
  "version": "89b7306278504236eb79995d38bbe342e09f2a405317a4c10f06095bc4faf2ec"
}
Generated in 2.7 seconds

Prediction
lucataco/qwen-vl-chat:50881b15
ID: qonytclbgqdaaymryjzbi4usra · Status: Succeeded · Source: Web · Hardware: A40 (Large)
Input
{
  "image": "https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png",
  "prompt": "What is in the picture?"
}
Install Replicate’s Node.js client library:

npm install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
  {
    input: {
      image: "https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png",
      prompt: "What is in the picture?"
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import the client:

import replicate
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    input={
        "image": "https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png",
        "prompt": "What is in the picture?"
    }
)
print(output)
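The image input doesn't have to be a hosted URL; the Python client can also upload a local file. A minimal sketch, assuming a local screenshot saved as capitol.png (a hypothetical filename) and that the client accepts an open file handle for file inputs:

import replicate

# Passing an open file handle lets the client upload the image for you.
with open("capitol.png", "rb") as image_file:  # hypothetical local file
    output = replicate.run(
        "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
        input={
            "image": image_file,
            "prompt": "What is in the picture?",
        },
    )

print(output)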
To learn more, take a look at the guide on getting started with Python.
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    "input": {
      "image": "https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png",
      "prompt": "What is in the picture?"
    }
  }' \
  https://api.replicate.com/v1/predictions
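The same HTTP request can be issued from Python without the client library, using requests. A minimal sketch, assuming REPLICATE_API_TOKEN is set in the environment as above:

import os

import requests

resp = requests.post(
    "https://api.replicate.com/v1/predictions",
    headers={
        "Authorization": f"Bearer {os.environ['REPLICATE_API_TOKEN']}",
        "Content-Type": "application/json",
        # Ask the API to hold the connection open until the prediction finishes (or times out).
        "Prefer": "wait",
    },
    json={
        "version": "50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
        "input": {
            "image": "https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png",
            "prompt": "What is in the picture?",
        },
    },
)
resp.raise_for_status()

prediction = resp.json()
print(prediction["status"], prediction.get("output"))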
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9 \
  -i 'image="https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png"' \
  -i 'prompt="What is in the picture?"'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "image": "https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png",
      "prompt": "What is in the picture?"
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
The picture features the United States Capitol building, which is a well-known landmark in Washington, D.C. The Capitol is a large, white, neoclassical dome with a green lawn in front of it, creating a picturesque scene.

{
  "completed_at": "2023-09-04T00:15:14.534705Z",
  "created_at": "2023-09-04T00:15:11.616074Z",
  "data_removed": false,
  "error": null,
  "id": "qonytclbgqdaaymryjzbi4usra",
  "input": {
    "image": "https://replicate.delivery/pbxt/JSxXndVb7NFIhfhwQvzzBlWAqDsorRKW755I50IYfbuOpZB4/Screenshot%202023-09-03%20at%208.09.47%20PM.png",
    "prompt": "What is in the picture?"
  },
  "logs": null,
  "metrics": {
    "predict_time": 2.89977,
    "total_time": 2.918631
  },
  "output": "The picture features the United States Capitol building, which is a well-known landmark in Washington, D.C. The Capitol is a large, white, neoclassical dome with a green lawn in front of it, creating a picturesque scene.",
  "started_at": "2023-09-04T00:15:11.634935Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/qonytclbgqdaaymryjzbi4usra",
    "cancel": "https://api.replicate.com/v1/predictions/qonytclbgqdaaymryjzbi4usra/cancel"
  },
  "version": "89b7306278504236eb79995d38bbe342e09f2a405317a4c10f06095bc4faf2ec"
}
Generated in 2.9 seconds

Prediction
lucataco/qwen-vl-chat:50881b15
ID: lhz5de3bpvpkfzdeqhtfgtlrza · Status: Succeeded · Source: Web · Hardware: A40 (Large)
Input
{
  "image": "https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg",
  "prompt": "图里有啥"
}
Install Replicate’s Node.js client library:

npm install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
  {
    input: {
      image: "https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg",
      prompt: "图里有啥"
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import the client:

import replicate
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    input={
        "image": "https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg",
        "prompt": "图里有啥"
    }
)
print(output)
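Since each call takes a single image and prompt, a simple way to ask several questions about the same photo is to loop over prompts. A minimal sketch; the follow-up questions are hypothetical, and each prediction is independent, so the model does not see earlier questions or answers:

import replicate

image_url = "https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg"

prompts = [
    "图里有啥",  # "What is in the picture?"
    "What city is this?",  # hypothetical follow-up question
    "Describe the weather in the photo.",  # hypothetical follow-up question
]

for prompt in prompts:
    output = replicate.run(
        "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
        input={"image": image_url, "prompt": prompt},
    )
    print(f"Q: {prompt}\nA: {output}\n")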
To learn more, take a look at the guide on getting started with Python.
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    "input": {
      "image": "https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg",
      "prompt": "图里有啥"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9 \
  -i 'image="https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg"' \
  -i 'prompt="图里有啥"'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "image": "https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg",
      "prompt": "图里有啥"
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
图中鸟儿展翅飞翔,背景是上海的高楼大厦,展现着这个国际大都市的繁华与美丽。 (A bird soars with outstretched wings against a backdrop of Shanghai's high-rises, showing the bustle and beauty of this international metropolis.)

{
  "completed_at": "2023-09-04T00:17:42.198478Z",
  "created_at": "2023-09-04T00:17:40.572681Z",
  "data_removed": false,
  "error": null,
  "id": "lhz5de3bpvpkfzdeqhtfgtlrza",
  "input": {
    "image": "https://replicate.delivery/pbxt/JSxaBIYFb6n4re7KYwX8tWoL4MRazHnbS5fYSrnQUKj9IeNV/Shanghai_Small.jpeg",
    "prompt": "图里有啥"
  },
  "logs": null,
  "metrics": {
    "predict_time": 1.62323,
    "total_time": 1.625797
  },
  "output": "图中鸟儿展翅飞翔,背景是上海的高楼大厦,展现着这个国际大都市的繁华与美丽。",
  "started_at": "2023-09-04T00:17:40.575248Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/lhz5de3bpvpkfzdeqhtfgtlrza",
    "cancel": "https://api.replicate.com/v1/predictions/lhz5de3bpvpkfzdeqhtfgtlrza/cancel"
  },
  "version": "89b7306278504236eb79995d38bbe342e09f2a405317a4c10f06095bc4faf2ec"
}
Generated in 1.6 seconds

Prediction
lucataco/qwen-vl-chat:50881b15
ID: 8g8a2cmkpdrm80ckkq8vxq231w · Status: Succeeded · Source: Web · Hardware: L40S
Input
{ "image": "https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg", "prompt": "How much would I pay if I want to order two Salmon Burger and three Meat Lover\\'s Pizza? Think carefully step by step." }
Install Replicate’s Node.js client library:

npm install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
  {
    input: {
      image: "https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg",
      prompt: "How much would I pay if I want to order two Salmon Burger and three Meat Lover\\'s Pizza? Think carefully step by step."
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate’s Python client library:

pip install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import the client:

import replicate
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "lucataco/qwen-vl-chat:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    input={
        "image": "https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg",
        "prompt": "How much would I pay if I want to order two Salmon Burger and three Meat Lover\\'s Pizza? Think carefully step by step."
    }
)
print(output)
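If you want the status and timing details shown in the prediction record below rather than just the answer string, the lower-level predictions interface returns a prediction object you can inspect. A minimal sketch, assuming the client's predictions.create and wait methods and a metrics field like the one in the JSON record below:

import replicate

prediction = replicate.predictions.create(
    version="50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    input={
        "image": "https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg",
        "prompt": "How much would I pay if I want to order two Salmon Burger and three Meat Lover\\'s Pizza? Think carefully step by step.",
    },
)

prediction.wait()  # block until the prediction reaches a terminal state

print(prediction.status)   # e.g. "succeeded"
print(prediction.output)   # the model's step-by-step answer
print(prediction.metrics)  # e.g. {"predict_time": ..., "total_time": ...}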
To learn more, take a look at the guide on getting started with Python.
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run lucataco/qwen-vl-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9",
    "input": {
      "image": "https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg",
      "prompt": "How much would I pay if I want to order two Salmon Burger and three Meat Lover\\\\\'s Pizza? Think carefully step by step."
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
You can run this model locally using Cog. First, install Cog:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9 \
  -i 'image="https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg"' \
  -i $'prompt="How much would I pay if I want to order two Salmon Burger and three Meat Lover\\\\\'s Pizza? Think carefully step by step."'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/lucataco/qwen-vl-chat@sha256:50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d $'{
    "input": {
      "image": "https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg",
      "prompt": "How much would I pay if I want to order two Salmon Burger and three Meat Lover\\\\\'s Pizza? Think carefully step by step."
    }
  }' \
  http://localhost:5000/predictions
To learn more, take a look at the Cog documentation.
Output
If you want to order two Salmon Burgers and three Meat Lover's Pizzas, the total cost would depend on the price of each item on the menu.

Let's assume that the price of a Salmon Burger is $10 and the price of a Meat Lover's Pizza is $12. In this case, the total cost for two Salmon Burgers would be $20 and the total cost for three Meat Lover's Pizzas would be $36.

So, the total cost for two Salmon Burgers and three Meat Lover's Pizzas would be $56.

{
  "completed_at": "2024-12-06T21:13:01.333534Z",
  "created_at": "2024-12-06T21:12:58.035000Z",
  "data_removed": false,
  "error": null,
  "id": "8g8a2cmkpdrm80ckkq8vxq231w",
  "input": {
    "image": "https://replicate.delivery/pbxt/JSwt0WCMKtolbjYYo6WYIE01Iemz3etQD6ugKxxeiVVlMgjF/Menu.jpeg",
    "prompt": "How much would I pay if I want to order two Salmon Burger and three Meat Lover\\'s Pizza? Think carefully step by step."
  },
  "logs": null,
  "metrics": {
    "predict_time": 3.290710716,
    "total_time": 3.298534
  },
  "output": "If you want to order two Salmon Burgers and three Meat Lover's Pizzas, the total cost would depend on the price of each item on the menu. \n\nLet's assume that the price of a Salmon Burger is $10 and the price of a Meat Lover's Pizza is $12. In this case, the total cost for two Salmon Burgers would be $20 and the total cost for three Meat Lover's Pizzas would be $36.\n\nSo, the total cost for two Salmon Burgers and three Meat Lover's Pizzas would be $56.",
  "started_at": "2024-12-06T21:12:58.042823Z",
  "status": "succeeded",
  "urls": {
    "get": "https://api.replicate.com/v1/predictions/8g8a2cmkpdrm80ckkq8vxq231w",
    "cancel": "https://api.replicate.com/v1/predictions/8g8a2cmkpdrm80ckkq8vxq231w/cancel"
  },
  "version": "50881b153b4d5f72b3db697e2bbad23bb1277ab741c5b52d80cd6ee17ea660e9"
}
Generated in 3.3 seconds
Want to make some of these yourself?
Run this model