tomasmcm/gorilla-openfunctions-v1
Source: gorilla-llm/gorilla-openfunctions-v1 ✦ Quant: TheBloke/gorilla-openfunctions-v1-AWQ ✦ Extends the Large Language Model (LLM) chat completion feature to formulate executable API calls from natural language instructions and API context
Run tomasmcm/gorilla-openfunctions-v1 with an API
Use one of our client libraries to get started quickly.
Set the REPLICATE_API_TOKEN environment variable
export REPLICATE_API_TOKEN=<paste-your-token-here>
Learn more about authentication
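If you prefer not to rely on the environment variable, the token can also be passed directly when constructing the client. A minimal sketch, assuming the token is available on process.env:

import Replicate from "replicate";

// Pass the API token explicitly instead of relying on the default env-var lookup
const replicate = new Replicate({ auth: process.env.REPLICATE_API_TOKEN });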
Install Replicate’s Node.js client library
npm install replicate
Run tomasmcm/gorilla-openfunctions-v1 using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
import Replicate from "replicate";

// The client picks up REPLICATE_API_TOKEN from the environment by default.
const replicate = new Replicate();

// Gorilla OpenFunctions prompt format: the user question is wrapped in <<question>>
// and the available function definitions follow <<function>> as a JSON array.
const input = {
prompt: "USER: <<question>> Call me an Uber ride type \"Plus\" in Berkeley at zipcode 94704 in 10 minutes <<function>> [{\"name\": \"Uber Carpool\", \"api_name\": \"uber.ride\", \"description\": \"Find suitable ride for customers given the location, type of ride, and the amount of time the customer is willing to wait as parameters\", \"parameters\": [{\"name\": \"loc\", \"description\": \"Location of the starting place of the Uber ride\"}, {\"name\": \"type\", \"enum\": [\"plus\", \"comfort\", \"black\"], \"description\": \"Types of Uber ride user is ordering\"}, {\"name\": \"time\", \"description\": \"The amount of time in minutes the customer is willing to wait\"}]}]\nASSISTANT: "
};
const output = await replicate.run("tomasmcm/gorilla-openfunctions-v1:574ca2dfccd6ea5006ae008b773dd9d5a1da77f978fe183f4f6452bbc9f62aba", { input });
console.log(output);
//=> "uber.ride(loc=\"94704\", type=\"plus\", time=10)"
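Rather than hard-coding the prompt string, you may want to assemble it from a question and a list of function definitions. The helper below is a hypothetical sketch that reproduces the prompt format used in the example above; buildPrompt is not part of the model's or the client's API.

// Hypothetical helper: wraps the question in <<question>> and appends the
// function definitions as JSON after <<function>>, matching the example prompt.
const buildPrompt = (question, functions) =>
  `USER: <<question>> ${question} <<function>> ${JSON.stringify(functions)}\nASSISTANT: `;

const functions = [{
  name: "Uber Carpool",
  api_name: "uber.ride",
  description: "Find suitable ride for customers given the location, type of ride, and the amount of time the customer is willing to wait as parameters",
  parameters: [
    { name: "loc", description: "Location of the starting place of the Uber ride" },
    { name: "type", enum: ["plus", "comfort", "black"], description: "Types of Uber ride user is ordering" },
    { name: "time", description: "The amount of time in minutes the customer is willing to wait" }
  ]
}];

// Produces a prompt equivalent to the hard-coded example above
const prompt = buildPrompt('Call me an Uber ride type "Plus" in Berkeley at zipcode 94704 in 10 minutes', functions);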