{
"debug": false,
"max_new_tokens": 500,
"min_new_tokens": -1,
"prompt": "Given that \nIllya may get radbase when Noora gets radbase.\nIllya and Noora gets radbase when Marsons causes radbase.\nAnd that \nIllya got radbase;\nThink logically if we can deduce for certain if Noora will get radbase",
"seed": 333,
"system_prompt": "You are a causal reasoning agent",
"temperature": 0.75,
"top_k": 50,
"top_p": 1
}

npm install replicate
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=r8_3EP**********************************
This is your API token. Keep it to yourself.
import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run meta/llama-2-13b-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
  "meta/llama-2-13b-chat:de18b8b68ef78f4f52c87eb7e3a0244d18b45b3c67affef2d5055ddc9c2fb678",
  {
    input: {
      debug: false,
      max_new_tokens: 500,
      min_new_tokens: -1,
      prompt: "Given that \nIllya may get radbase when Noora gets radbase.\nIllya and Noora gets radbase when Marsons causes radbase.\nAnd that \nIllya got radbase;\nThink logically if we can deduce for certain if Noora will get radbase",
      seed: 333,
      system_prompt: "You are a causal reasoning agent",
      temperature: 0.75,
      top_k: 50,
      top_p: 1
    }
  }
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
pip install replicate
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=r8_3EP**********************************
This is your API token. Keep it to yourself.
import replicate
Run meta/llama-2-13b-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
    "meta/llama-2-13b-chat:de18b8b68ef78f4f52c87eb7e3a0244d18b45b3c67affef2d5055ddc9c2fb678",
    input={
        "debug": False,
        "max_new_tokens": 500,
        "min_new_tokens": -1,
        "prompt": "Given that \nIllya may get radbase when Noora gets radbase.\nIllya and Noora gets radbase when Marsons causes radbase.\nAnd that \nIllya got radbase;\nThink logically if we can deduce for certain if Noora will get radbase",
        "seed": 333,
        "system_prompt": "You are a causal reasoning agent",
        "temperature": 0.75,
        "top_k": 50,
        "top_p": 1
    }
)

# The meta/llama-2-13b-chat model can stream output as it's running.
# The predict method returns an iterator, and you can iterate over that output.
for item in output:
    # https://replicate.com/meta/llama-2-13b-chat/api#output-schema
    print(item, end="")
To learn more, take a look at the guide on getting started with Python.
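Because the output arrives as a sequence of string chunks, you can also collect the whole response into a single string once the iteration finishes. A minimal sketch (the chunks below are a hypothetical stand-in for the iterator returned by replicate.run above):

```python
# Hypothetical chunks standing in for the streamed output of replicate.run.
chunks = ["", " As", " a", " caus", "al", " reasoning", " agent"]

# Concatenate the streamed chunks into the full response text.
full_text = "".join(chunks).strip()
print(full_text)  # As a causal reasoning agent
```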
Set the REPLICATE_API_TOKEN environment variable:

export REPLICATE_API_TOKEN=r8_3EP**********************************
This is your API token. Keep it to yourself.
Run meta/llama-2-13b-chat using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "meta/llama-2-13b-chat:de18b8b68ef78f4f52c87eb7e3a0244d18b45b3c67affef2d5055ddc9c2fb678",
    "input": {
      "debug": false,
      "max_new_tokens": 500,
      "min_new_tokens": -1,
      "prompt": "Given that \\nIllya may get radbase when Noora gets radbase.\\nIllya and Noora gets radbase when Marsons causes radbase.\\nAnd that \\nIllya got radbase;\\nThink logically if we can deduce for certain if Noora will get radbase",
      "seed": 333,
      "system_prompt": "You are a causal reasoning agent",
      "temperature": 0.75,
      "top_k": 50,
      "top_p": 1
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
As a causal reasoning agent, I can deduce with certainty that Noora will get radbase based on the information provided:

1. Illya may get radbase when Noora gets radbase. (This implies that there is a causal relationship between Illya and Noora's radbase status.)
2. Illya and Noora get radbase when Marsons causes radbase. (This indicates that Marsons is the cause of their radbase status.)
3. Illya got radbase. (This is a known fact.)

Using logical inference, we can conclude that since Illya got radbase and Marsons caused it, then Noora will also get radbase because of the causal relationship established in point 1. Therefore, it is certain that Noora will get radbase.
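Whether that conclusion holds depends on how rule 1's "may" is read and on a closed-world assumption. A brute-force enumeration of all worlds — assuming, hypothetically, that Marsons and Noora's radbase are Illya's only possible causes — sketches the check:

```python
from itertools import product

# Enumerate every world over three booleans:
#   marsons - Marsons causes radbase
#   noora   - Noora gets radbase
#   link    - rule 1's "may" link actually fires in this world
consistent_nooras = set()
for marsons, noora, link in product([False, True], repeat=3):
    # Rule 2: if Marsons causes radbase, both Illya and Noora get it,
    # so a world with marsons=True and noora=False is inconsistent.
    if marsons and not noora:
        continue
    # Closed-world assumption: Illya gets radbase only via Marsons
    # or via the (possibly firing) link from Noora.
    illya = marsons or (noora and link)
    if illya:  # observation: Illya got radbase
        consistent_nooras.add(noora)

print(consistent_nooras)  # {True}: Noora has radbase in every consistent world
```

Under that closed-world reading Noora does get radbase in every world consistent with the observation; without it (if Illya could have other causes), the certainty claimed above would not follow.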
{
"id": "7hfon4dbjldslyor7p6ark6sae",
"model": "meta/llama-2-13b-chat",
"version": "de18b8b68ef78f4f52c87eb7e3a0244d18b45b3c67affef2d5055ddc9c2fb678",
"input": {
"debug": false,
"max_new_tokens": 500,
"min_new_tokens": -1,
"prompt": "Given that \nIllya may get radbase when Noora gets radbase.\nIllya and Noora gets radbase when Marsons causes radbase.\nAnd that \nIllya got radbase;\nThink logically if we can deduce for certain if Noora will get radbase",
"seed": 333,
"system_prompt": "You are a causal reasoning agent",
"temperature": 0.75,
"top_k": 50,
"top_p": 1
},
"logs": "Your formatted prompt is:\n[INST] <<SYS>>\nYou are a causal reasoning agent\n<</SYS>>\nGiven that\nIllya may get radbase when Noora gets radbase.\nIllya and Noora gets radbase when Marsons causes radbase.\nAnd that\nIllya got radbase;\nThink logically if we can deduce for certain if Noora will get radbase [/INST]\nNot using LoRA\nSetting seed to 333\nhostname: model-de18b8b6-1fe8e5a647c745e6-gpu-a40-5b975b6f9d-g64rb",
"output": [
"",
" As",
" a",
" caus",
"al",
" reasoning",
" agent",
",",
" I",
" can",
" dedu",
"ce",
" with",
" certain",
"ty",
" that",
" No",
"ora",
" will",
" get",
" rad",
"base",
" based",
" on",
" the",
" information",
" provided",
":",
"\n",
"\n",
"1",
".",
" Il",
"ly",
"a",
" may",
" get",
" rad",
"base",
" when",
" No",
"ora",
" gets",
" rad",
"base",
".",
" (",
"This",
" implies",
" that",
" there",
" is",
" a",
" caus",
"al",
" relationship",
" between",
" Il",
"ly",
"a",
" and",
" No",
"ora",
"'",
"s",
" rad",
"base",
" status",
".)",
"\n",
"2",
".",
" Il",
"ly",
"a",
" and",
" No",
"ora",
" get",
" rad",
"base",
" when",
" Mars",
"ons",
" causes",
" rad",
"base",
".",
" (",
"This",
" indicates",
" that",
" Mars",
"ons",
" is",
" the",
" cause",
" of",
" their",
" rad",
"base",
" status",
".)",
"\n",
"3",
".",
" Il",
"ly",
"a",
" got",
" rad",
"base",
".",
" (",
"This",
" is",
" a",
" known",
" fact",
".)",
"\n",
"\n",
"Using",
" logical",
" inference",
",",
" we",
" can",
" conclude",
" that",
" since",
" Il",
"ly",
"a",
" got",
" rad",
"base",
" and",
" Mars",
"ons",
" caused",
" it",
",",
" then",
" No",
"ora",
" will",
" also",
" get",
" rad",
"base",
" because",
" of",
" the",
" caus",
"al",
" relationship",
" established",
" in",
" point",
" ",
"1",
".",
" Therefore",
",",
" it",
" is",
" certain",
" that",
" No",
"ora",
" will",
" get",
" rad",
"base",
"."
],
"data_removed": false,
"error": null,
"source": "web",
"status": "succeeded",
"created_at": "2023-09-13T11:09:10.340502Z",
"started_at": "2023-09-13T11:09:10.333165Z",
"completed_at": "2023-09-13T11:09:17.993137Z",
"urls": {
"cancel": "https://api.replicate.com/v1/predictions/7hfon4dbjldslyor7p6ark6sae/cancel",
"get": "https://api.replicate.com/v1/predictions/7hfon4dbjldslyor7p6ark6sae"
},
"metrics": {
"predict_time": 7.659972,
"total_time": 7.652635
}
}
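The metrics in the prediction object can be cross-checked against its timestamps: total_time is completed_at minus created_at. A minimal sketch (the "Z" suffix is rewritten as an explicit UTC offset so datetime.fromisoformat accepts it on older Python versions too):

```python
from datetime import datetime

def parse_ts(ts: str) -> datetime:
    # fromisoformat on Python < 3.11 rejects a trailing "Z",
    # so rewrite it as an explicit UTC offset first.
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

# Timestamps taken from the prediction object above.
created = parse_ts("2023-09-13T11:09:10.340502Z")
completed = parse_ts("2023-09-13T11:09:17.993137Z")

total_time = (completed - created).total_seconds()
print(total_time)  # 7.652635 — matches metrics.total_time
```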