Prediction
replicate/flan-t5-xl:1457f256622cd45415aa70c02105a917b39dc96e58601d7c2df5a30d2e3092e7

Input
{
"top_p": "0.95",
"prompt": "Here are some phrases and sentiments. \\nI love this spaghetti: Positive \\nThe air outside is great: Positive \\nYou hated the cat in the hat: Negative \\nHe is in a foul mood: Negative \\nThe dinner was horrible: ",
"max_length": 50,
"temperature": "0.7",
"repetition_penalty": 1
}
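The prompt above is a few-shot classification prompt: a short instruction, four labeled examples, and a final unlabeled phrase the model is asked to complete, with the entries separated by literal "\n" markers. As a rough illustration only (the helper below is hypothetical, not part of the Replicate API), such a prompt could be assembled in Python like this:

# Illustrative sketch of assembling the few-shot prompt shown above.
# The examples and the literal "\n" separators mirror the example input;
# build_prompt is a hypothetical helper, not a Replicate function.
examples = [
    ("I love this spaghetti", "Positive"),
    ("The air outside is great", "Positive"),
    ("You hated the cat in the hat", "Negative"),
    ("He is in a foul mood", "Negative"),
]

def build_prompt(examples, query):
    parts = ["Here are some phrases and sentiments. "]
    parts += [f"{phrase}: {sentiment} " for phrase, sentiment in examples]
    parts.append(f"{query}: ")
    # The example input uses a literal backslash-n marker between entries
    # rather than a real newline, so join with "\\n" here.
    return "\\n".join(parts)

prompt = build_prompt(examples, "The dinner was horrible")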
Install Replicate's Node.js client library:

npm install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import and set up the client:
import Replicate from "replicate";
const replicate = new Replicate({
auth: process.env.REPLICATE_API_TOKEN,
});
Run replicate/flan-t5-xl using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
"replicate/flan-t5-xl:1457f256622cd45415aa70c02105a917b39dc96e58601d7c2df5a30d2e3092e7",
{
input: {
top_p: "0.95",
prompt: "Here are some phrases and sentiments. \\nI love this spaghetti: Positive \\nThe air outside is great: Positive \\nYou hated the cat in the hat: Negative \\nHe is in a foul mood: Negative \\nThe dinner was horrible: ",
max_length: 50,
temperature: "0.7",
repetition_penalty: 1
}
}
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
Install Replicate's Python client library:

pip install replicate

Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.

Import the client:
import replicate
Run replicate/flan-t5-xl using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
"replicate/flan-t5-xl:1457f256622cd45415aa70c02105a917b39dc96e58601d7c2df5a30d2e3092e7",
input={
"top_p": "0.95",
"prompt": "Here are some phrases and sentiments. \\nI love this spaghetti: Positive \\nThe air outside is great: Positive \\nYou hated the cat in the hat: Negative \\nHe is in a foul mood: Negative \\nThe dinner was horrible: ",
"max_length": 50,
"temperature": "0.7",
"repetition_penalty": 1
}
)
print(output)
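The output is returned as a list of strings (the Output section below shows ["Negative"]), so you will usually want to join it into a single string before using it, for example:

# output is a list of strings; join it to get the generated text.
text = "".join(output)
print(text.strip())  # e.g. "Negative"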
To learn more, take a look at the guide on getting started with Python.
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=<paste-your-token-here>

Find your API token in your account settings.
Run replicate/flan-t5-xl using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
-H "Authorization: Bearer $REPLICATE_API_TOKEN" \
-H "Content-Type: application/json" \
-H "Prefer: wait" \
-d $'{
"version": "1457f256622cd45415aa70c02105a917b39dc96e58601d7c2df5a30d2e3092e7",
"input": {
"top_p": "0.95",
"prompt": "Here are some phrases and sentiments. \\\\nI love this spaghetti: Positive \\\\nThe air outside is great: Positive \\\\nYou hated the cat in the hat: Negative \\\\nHe is in a foul mood: Negative \\\\nThe dinner was horrible: ",
"max_length": 50,
"temperature": "0.7",
"repetition_penalty": 1
}
}' \
https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
Install Cog with Homebrew:

brew install cog
If you don’t have Homebrew, there are other installation options available.
Run this to download the model and run it in your local environment:
cog predict r8.im/replicate/flan-t5-xl@sha256:1457f256622cd45415aa70c02105a917b39dc96e58601d7c2df5a30d2e3092e7 \
-i 'top_p="0.95"' \
-i $'prompt="Here are some phrases and sentiments. \\\\nI love this spaghetti: Positive \\\\nThe air outside is great: Positive \\\\nYou hated the cat in the hat: Negative \\\\nHe is in a foul mood: Negative \\\\nThe dinner was horrible: "' \
-i 'max_length=50' \
-i 'temperature="0.7"' \
-i 'repetition_penalty=1'
To learn more, take a look at the Cog documentation.
Run this to download the model and run it in your local environment:
docker run -d -p 5000:5000 --gpus=all r8.im/replicate/flan-t5-xl@sha256:1457f256622cd45415aa70c02105a917b39dc96e58601d7c2df5a30d2e3092e7
curl -s -X POST \
-H "Content-Type: application/json" \
-d $'{
"input": {
"top_p": "0.95",
"prompt": "Here are some phrases and sentiments. \\\\nI love this spaghetti: Positive \\\\nThe air outside is great: Positive \\\\nYou hated the cat in the hat: Negative \\\\nHe is in a foul mood: Negative \\\\nThe dinner was horrible: ",
"max_length": 50,
"temperature": "0.7",
"repetition_penalty": 1
}
}' \
http://localhost:5000/predictions
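If you would rather call that local endpoint from Python, here is a minimal sketch using the requests library, assuming the container started by the docker run command above is listening on port 5000. It mirrors the curl request, and the "output" field it reads is the one shown in the prediction object below:

import requests

# Send the same request as the curl command above to the local Cog server.
# The prompt uses literal "\n" markers, matching the example input.
prompt = (
    "Here are some phrases and sentiments. "
    "\\nI love this spaghetti: Positive "
    "\\nThe air outside is great: Positive "
    "\\nYou hated the cat in the hat: Negative "
    "\\nHe is in a foul mood: Negative "
    "\\nThe dinner was horrible: "
)

resp = requests.post(
    "http://localhost:5000/predictions",
    json={
        "input": {
            "top_p": "0.95",
            "prompt": prompt,
            "max_length": 50,
            "temperature": "0.7",
            "repetition_penalty": 1,
        }
    },
)
resp.raise_for_status()
print(resp.json()["output"])  # expected to contain the model's completion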
To learn more, take a look at the Cog documentation.
Output
{
"completed_at": "2023-03-03T06:31:14.781367Z",
"created_at": "2023-03-03T06:31:14.465594Z",
"data_removed": false,
"error": null,
"id": "oi3gyjmgzbd37le4d3otrjdjyq",
"input": {
"top_p": "0.95",
"prompt": "Here are some phrases and sentiments. \\nI love this spaghetti: Positive \\nThe air outside is great: Positive \\nYou hated the cat in the hat: Negative \\nHe is in a foul mood: Negative \\nThe dinner was horrible: ",
"max_length": 50,
"temperature": "0.7",
"repetition_penalty": 1
},
"logs": null,
"metrics": {
"predict_time": 0.219377,
"total_time": 0.315773
},
"output": [
"Negative"
],
"started_at": "2023-03-03T06:31:14.561990Z",
"status": "succeeded",
"urls": {
"get": "https://api.replicate.com/v1/predictions/oi3gyjmgzbd37le4d3otrjdjyq",
"cancel": "https://api.replicate.com/v1/predictions/oi3gyjmgzbd37le4d3otrjdjyq/cancel"
},
"version": "1457f256622cd45415aa70c02105a917b39dc96e58601d7c2df5a30d2e3092e7"
}
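A prediction can also be fetched again later from its "get" URL. Here is a minimal sketch of reading the fields shown above with the requests library, using the id from this example and the same Bearer token as in the HTTP example; the field names are taken directly from the JSON above:

import os
import requests

# Fetch the example prediction by its "get" URL and read its fields.
url = "https://api.replicate.com/v1/predictions/oi3gyjmgzbd37le4d3otrjdjyq"
headers = {"Authorization": f"Bearer {os.environ['REPLICATE_API_TOKEN']}"}

prediction = requests.get(url, headers=headers).json()

if prediction["status"] == "succeeded":
    print("".join(prediction["output"]))                  # -> "Negative"
    print("predict time:", prediction["metrics"]["predict_time"], "s")
elif prediction["error"]:
    print("prediction failed:", prediction["error"])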