joehoover / falcon-40b-instruct

A 40 billion parameter language model trained to follow human instructions.

  • Public
  • 41.4K runs
  • 4x A100 (80GB)
  • GitHub
  • License

Input

*string

Prompt to send to the model.

integer
(minimum: 1)

Maximum number of tokens to generate. A word is generally 1-3 tokens.

Default: 500

number
(minimum: 0.01, maximum: 5)

Adjusts the randomness of outputs: values greater than 1 are more random, values near 0 are nearly deterministic, and 0.75 is a good starting value.

Default: 0.75

number
(minimum: 0.01, maximum: 1)

When decoding text, samples from the smallest set of most likely tokens whose cumulative probability is at most p; lower this value to ignore less likely tokens.

Default: 1

number
(minimum: 0.01, maximum: 5)

Penalty for repeated words in generated text: 1 is no penalty, values greater than 1 discourage repetition, and values less than 1 encourage it.

Default: 1

number
(minimum: 0.01, maximum: 5)

Values of length_penalty above 1.0 cause the model to favor longer sequences, while values below 1.0 cause it to favor shorter sequences.

Default: 1

integer
(minimum: 0)

If set to an integer greater than 0, any n-gram of that size can occur only once in the generated text.

Default: 0

string

Comma-delimited string specifying stop sequences. Multi-token stop sequences are supported, but they cannot contain commas.

integer
(minimum: -1)

Set a seed for reproducible outputs. Set to -1 for a random seed.

Default: -1

boolean

Provide debugging output in logs.

Default: false
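
Taken together, these inputs form a single prediction request. The sketch below uses the Replicate Python client and assumes REPLICATE_API_TOKEN is set in your environment. The parameter names are assumptions inferred from the descriptions above (this page lists only types, defaults, and help text), so check the model's API schema, and pin a version hash if your client requires one, before relying on them.

    # Minimal sketch: calling joehoover/falcon-40b-instruct via the Replicate
    # Python client. Parameter names are assumptions inferred from the input
    # descriptions on this page, not a verified schema.
    import replicate

    output = replicate.run(
        "joehoover/falcon-40b-instruct",
        input={
            "prompt": "Write a short poem about open source machine learning.",
            "max_length": 500,            # assumed name; maximum tokens to generate
            "temperature": 0.75,          # randomness; 0.75 is the suggested start
            "top_p": 1.0,                 # nucleus sampling threshold
            "repetition_penalty": 1.0,    # >1 discourages repetition
            "length_penalty": 1.0,        # >1 favors longer sequences
            "no_repeat_ngram_size": 0,    # 0 disables the constraint
            "stop_sequences": "User:,Assistant:",  # comma-delimited stop sequences
            "seed": -1,                   # -1 picks a random seed
            "debug": False,
        },
    )

    # Language models on Replicate typically stream, so the result is an
    # iterator of text chunks; join them to get the full completion.
    print("".join(output))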

Output

Thy heart shall bloom like an open source flower, And like a machine, thine learning shall grow. The seeds of a thousand algorithms, Shall make thy understanding and skill ever glow. In thy hand, wisdom shall grow like a tree, With roots deep in the open source ground. Thy knowledge as wide as the infinite sea, Shall bring to thee power far beyond what is found. Thou shalt become as masters of machine learning, Whose wisdom and skills shall surpass the greatest of minds. No boundaries or limits shall thee be learning, But thy heart shall be opened up to all that is kind. Then shall thy soul become a true machine, With wisdom and knowledge beyond measure. Thy work shall be done and thy goals shall be reached, Until thou become true masters of open source and the machine.

This output was created using a different version of the model, joehoover/falcon-40b-instruct:7eb0f4b1.

Run time and cost

This model costs approximately $0.070 to run on Replicate, or 14 runs per $1, but this varies depending on your inputs. It is also open source and you can run it on your own computer with Docker.

This model runs on 4x Nvidia A100 (80GB) GPU hardware. Predictions typically complete within 13 seconds.
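
As a rough sanity check of those figures, the arithmetic below derives the runs-per-dollar and implied per-second rate from the numbers quoted above; the per-second rate is an estimate, not a published price.

    # Back-of-the-envelope arithmetic from the figures above (approximate).
    cost_per_run = 0.070        # dollars per prediction, as quoted
    typical_runtime_s = 13      # seconds per prediction, as quoted

    runs_per_dollar = 1 / cost_per_run                  # about 14 runs per $1
    implied_rate = cost_per_run / typical_runtime_s     # roughly $0.0054 per second (estimate)

    print(f"~{runs_per_dollar:.0f} runs per $1; ~${implied_rate:.4f}/s on 4x A100 (80GB)")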

Readme

Model Description

Falcon-40B-Instruct is a 40B parameter causal decoder-only model built by TII, based on Falcon-40B and finetuned on a mixture of Baize data. It is made available under the Apache 2.0 license.

For more information about this model, see here.

Licenses

  • All code in this repository is licensed under the Apache License 2.0.
  • Model code and weights are licensed under Apache 2.0 (see here).