
lucataco/ollama-qwq:4bfc56cd

Input schema

The fields you can use to run this model with an API. If you don’t give a value for a field, its default value will be used. A minimal example call using these fields follows the field list.

prompt (string)
Input text for the model.

temperature (number, default: 0.7, max: 1)
Controls randomness. Lower values make the model more deterministic; higher values make it more random.

top_p (number, default: 0.95, max: 1)
Controls diversity of the output. Lower values make the output more focused; higher values make it more diverse.

max_tokens (integer, default: 512, min: 1)
Maximum number of tokens to generate.
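
A minimal sketch of calling this version with the Replicate Python client, assuming the replicate package is installed and REPLICATE_API_TOKEN is set in the environment. The "4bfc56cd" hash shown on this page is truncated, so substitute the full version ID when running.

import replicate

# Sketch only: the version hash below is the truncated one shown above;
# replace it with the full version ID before running.
output = replicate.run(
    "lucataco/ollama-qwq:4bfc56cd",
    input={
        "prompt": "Why is the sky blue?",  # input text for the model
        "temperature": 0.7,                # lower = more deterministic (max: 1)
        "top_p": 0.95,                     # lower = more focused output (max: 1)
        "max_tokens": 512,                 # must be at least 1
    },
)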

Output schema

The shape of the response you’ll get when you run this model with an API; a short example of consuming it follows the schema.

Schema
{
  "type": "array",
  "items": {
    "type": "string"
  },
  "title": "Output",
  "x-cog-array-display": "concatenate",
  "x-cog-array-type": "iterator"
}
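
Because the schema marks the output as an iterator of string chunks to be concatenated, the pieces can simply be joined on the client. A sketch continuing the Python example above; the streaming variant assumes a recent version of the replicate package.

# The output is an iterator of string chunks; joining them yields the full
# generated text.
text = "".join(output)
print(text)

# To receive chunks as they are produced instead, the client also exposes
# replicate.stream:
for event in replicate.stream(
    "lucataco/ollama-qwq:4bfc56cd",  # truncated version hash, as above
    input={"prompt": "Why is the sky blue?"},
):
    print(str(event), end="")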