You're looking at a specific version of this model.
tomasmcm/obsidian-3b-v0.5:4dd167e2
Input
Run this model in Node.js with one line of code:
npm install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
import Replicate from "replicate";
const replicate = new Replicate({
auth: process.env.REPLICATE_API_TOKEN,
});
Run tomasmcm/obsidian-3b-v0.5 using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
const output = await replicate.run(
"tomasmcm/obsidian-3b-v0.5:4dd167e279923d3df56b9986b0adf8658f868787c64e4222dd4752865a51207e",
{
input: {
debug: false,
temperature: 0.2,
max_new_tokens: 512
}
}
);
console.log(output);
To learn more, take a look at the guide on getting started with Node.js.
pip install replicate
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
import replicate
Run tomasmcm/obsidian-3b-v0.5 using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
output = replicate.run(
"tomasmcm/obsidian-3b-v0.5:4dd167e279923d3df56b9986b0adf8658f868787c64e4222dd4752865a51207e",
input={
"debug": False,
"temperature": 0.2,
"max_new_tokens": 512
}
)
print(output)
To learn more, take a look at the guide on getting started with Python.
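Depending on the model, `replicate.run` may return the generated text as a list of string chunks rather than a single string. If it does, you can join the chunks yourself. A minimal sketch, assuming a list-of-strings output shape (the chunk values below are illustrative, not real model output):

```python
# Hypothetical output: some Replicate language models return the
# generated text as a list of string chunks.
output = ["Hello ", "from ", "the ", "model."]

# Concatenate the chunks into the full generated text.
text = "".join(output)
print(text)
```

If the model instead returns a single string, the join step is unnecessary; check the model's schema to confirm the output type.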
Set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Run tomasmcm/obsidian-3b-v0.5 using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
curl -s -X POST \
-H "Authorization: Bearer $REPLICATE_API_TOKEN" \
-H "Content-Type: application/json" \
-H "Prefer: wait" \
-d $'{
"version": "4dd167e279923d3df56b9986b0adf8658f868787c64e4222dd4752865a51207e",
"input": {
"debug": false,
"temperature": 0.2,
"max_new_tokens": 512
}
}' \
https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
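The `Prefer: wait` header asks the API to hold the connection open until the prediction finishes. Without it, the POST returns immediately and you poll the prediction's URL (a GET to `https://api.replicate.com/v1/predictions/{id}`) until the status is terminal. A sketch of that polling loop, where `get_prediction` is a hypothetical callable standing in for the HTTP GET (injected so the loop itself is self-contained):

```python
import time

def wait_for_prediction(get_prediction, poll_interval=1.0, max_polls=60):
    """Poll a prediction until it reaches a terminal state.

    `get_prediction` is a callable returning the prediction as a dict,
    e.g. the parsed JSON of a GET to the prediction's URL. It is passed
    in here so the loop needs no network access.
    """
    for _ in range(max_polls):
        prediction = get_prediction()
        # Replicate predictions end in one of these statuses.
        if prediction["status"] in ("succeeded", "failed", "canceled"):
            return prediction
        time.sleep(poll_interval)
    raise TimeoutError("prediction did not finish in time")
```

In practice you would implement `get_prediction` with your HTTP client of choice, sending the same `Authorization: Bearer $REPLICATE_API_TOKEN` header as above.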
brew install cog
If you don’t have Homebrew, there are other installation options available.
Pull and run tomasmcm/obsidian-3b-v0.5 using Cog (this will download the full model and run it in your local environment):
cog predict r8.im/tomasmcm/obsidian-3b-v0.5@sha256:4dd167e279923d3df56b9986b0adf8658f868787c64e4222dd4752865a51207e \
-i 'debug=false' \
-i 'temperature=0.2' \
-i 'max_new_tokens=512'
To learn more, take a look at the Cog documentation.
Pull and run tomasmcm/obsidian-3b-v0.5 using Docker (this will download the full model and run it in your local environment):
docker run -d -p 5000:5000 --gpus=all r8.im/tomasmcm/obsidian-3b-v0.5@sha256:4dd167e279923d3df56b9986b0adf8658f868787c64e4222dd4752865a51207e
curl -s -X POST \
-H "Content-Type: application/json" \
-d $'{
"input": {
"debug": false,
"temperature": 0.2,
"max_new_tokens": 512
}
}' \
http://localhost:5000/predictions
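The local server responds with a JSON prediction object. A minimal sketch of reading it in Python; the response body below is illustrative, since the real `output` shape depends on the model:

```python
import json

# Illustrative response body from the local /predictions endpoint;
# the actual "output" value depends on the model.
body = '{"status": "succeeded", "output": ["Hello ", "from ", "Obsidian."]}'

prediction = json.loads(body)
if prediction["status"] == "succeeded":
    # Join list-of-string outputs into the full generated text.
    text = "".join(prediction["output"])
    print(text)
```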
Each run costs approximately $0.011.