lucataco /emu3.5-image:f142f38b
Input schema
The fields you can use to run this model with an API. If you don't give a value for a field, its default value is used.
| Field | Type | Default | Constraints | Description |
|---|---|---|---|---|
| `task_type` | None | `t2i` | | Task template to apply for generation. |
| `output_format` | None | `png` | | Preferred output image format. |
| `seed` | integer | `42` | | Random seed for reproducibility. |
| `top_k` | integer | `131072` | | Top-k filter for generic tokens. |
| `top_p` | number | `1` | Max: 1 | Nucleus sampling top-p for generic tokens. |
| `prompt` | string | | | User prompt to condition the model. |
| `text_top_k` | integer | `1024` | | Top-k sampling applied to text tokens. |
| `text_top_p` | number | `0.9` | Max: 1 | Top-p sampling applied to text tokens. |
| `image_top_k` | integer | `10240` | | Top-k sampling applied to image tokens. |
| `image_top_p` | number | `1` | Max: 1 | Top-p sampling applied to image tokens. |
| `temperature` | number | `1` | Max: 2 | Sampling temperature applied to all tokens. |
| `guidance_scale` | number | `5` | Max: 10 | Classifier-free guidance scale. |
| `max_new_tokens` | integer | `4096` | Min: 512, Max: 32768 | Maximum number of tokens to autoregressively generate. |
| `reference_image` | string | | | Optional reference image used for image-conditioned tasks (required for `x2i`). |
| `text_temperature` | number | `1` | Max: 2 | Temperature for text token sampling. |
| `image_temperature` | number | `1` | Max: 2 | Temperature for image token sampling. |
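As a sketch of how these inputs fit together, here is a minimal Python helper that assembles a request payload from the defaults above and checks the stated constraints before sending it. The helper itself (`DEFAULTS`, `build_input`) is illustrative, not part of any client library; the schema only states maxima for the sampling parameters, so the lower bound of 0 used below is an assumption. The final `replicate.run(...)` call is left as a comment because it requires an API token.

```python
# Sketch: build and sanity-check an input payload for this model.
# The ranges mirror the input schema table; lower bounds of 0 are an
# assumption, since the schema only lists maxima.

DEFAULTS = {
    "task_type": "t2i",
    "output_format": "png",
    "seed": 42,
    "top_k": 131072,
    "top_p": 1,
    "text_top_k": 1024,
    "text_top_p": 0.9,
    "image_top_k": 10240,
    "image_top_p": 1,
    "temperature": 1,
    "guidance_scale": 5,
    "max_new_tokens": 4096,
    "text_temperature": 1,
    "image_temperature": 1,
}

def build_input(prompt: str, **overrides) -> dict:
    """Merge user overrides onto the schema defaults and check constraints."""
    payload = {**DEFAULTS, "prompt": prompt, **overrides}
    # x2i is image-conditioned, so a reference image is mandatory.
    if payload["task_type"] == "x2i" and "reference_image" not in payload:
        raise ValueError("x2i tasks require a reference_image")
    if not 512 <= payload["max_new_tokens"] <= 32768:
        raise ValueError("max_new_tokens must be in [512, 32768]")
    for key, hi in [("top_p", 1), ("text_top_p", 1), ("image_top_p", 1),
                    ("temperature", 2), ("guidance_scale", 10),
                    ("text_temperature", 2), ("image_temperature", 2)]:
        if not 0 <= payload[key] <= hi:
            raise ValueError(f"{key} must be in [0, {hi}]")
    return payload

# With the payload in hand, the call would look like this
# (requires the replicate package and a REPLICATE_API_TOKEN):
# import replicate
# output = replicate.run("lucataco/emu3.5-image:f142f38b",
#                        input=build_input("a red fox in snow"))
```

Keeping validation client-side like this surfaces out-of-range values before a request is billed, rather than after the server rejects it.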
Output schema
The shape of the response you’ll get when you run this model with an API.
Schema
{
  "items": {
    "format": "uri",
    "type": "string"
  },
  "title": "Output",
  "type": "array"
}
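Since the output is an array of URI strings, a common follow-up step is turning each URI into a local filename before downloading. The sketch below does only the name-mapping part; the example URL shape and the `output_filenames` helper are illustrative, and the actual download (e.g. via `urllib.request.urlretrieve`) is omitted because it needs network access.

```python
from urllib.parse import urlparse
from pathlib import PurePosixPath

def output_filenames(output: list[str]) -> list[str]:
    """Map the model's output (a list of URI strings) to bare filenames.

    Falls back to a numbered name when a URI has no path component.
    """
    names = []
    for i, uri in enumerate(output):
        name = PurePosixPath(urlparse(uri).path).name
        names.append(name or f"output_{i}.png")
    return names

# Example (hypothetical URI shape):
# output_filenames(["https://example.com/files/out-0.png"])
# could then be paired with urllib.request.urlretrieve to save each image.
```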