
fofr/sandbox:f6800331

Input schema

The fields you can use to run this model with an API. If you don’t give a value for a field, its default value will be used.

Field Type Default value Description
prompt
string
Prompt for the generated image. If you include the `trigger_word` used in the training process, you are more likely to activate the trained object, style, or concept in the resulting image.
flux_layers_to_patch
string
double_blocks.0.img_mod.lin.weight=1.0 double_blocks.0.img_mod.lin.bias=1.0 double_blocks.0.img_attn.qkv.weight=1.0 double_blocks.0.img_attn.qkv.bias=1.0 double_blocks.0.img_attn.proj.weight=1.0 double_blocks.0.img_attn.proj.bias=1.0 double_blocks.0.img_mlp.0.weight=1.0 double_blocks.0.img_mlp.0.bias=1.0 double_blocks.0.img_mlp.2.weight=1.0 double_blocks.0.img_mlp.2.bias=1.0 double_blocks.0.txt_mod.lin.weight=1.0 double_blocks.0.txt_mod.lin.bias=1.0 double_blocks.0.txt_attn.qkv.weight=1.0 double_blocks.0.txt_attn.qkv.bias=1.0 double_blocks.0.txt_attn.proj.weight=1.0 double_blocks.0.txt_attn.proj.bias=1.0 double_blocks.0.txt_mlp.0.weight=1.0 double_blocks.0.txt_mlp.0.bias=1.0 double_blocks.0.txt_mlp.2.weight=1.0 double_blocks.0.txt_mlp.2.bias=1.0 double_blocks.1.img_mod.lin.weight=1.0 double_blocks.1.img_mod.lin.bias=1.0 double_blocks.1.img_attn.qkv.weight=1.0 double_blocks.1.img_attn.qkv.bias=1.0 double_blocks.1.img_attn.proj.weight=1.0 double_blocks.1.img_attn.proj.bias=1.0 double_blocks.1.img_mlp.0.weight=1.0 double_blocks.1.img_mlp.0.bias=1.0 double_blocks.1.img_mlp.2.weight=1.0 double_blocks.1.img_mlp.2.bias=1.0 double_blocks.1.txt_mod.lin.weight=1.0 double_blocks.1.txt_mod.lin.bias=1.0 double_blocks.1.txt_attn.qkv.weight=1.0 double_blocks.1.txt_attn.qkv.bias=1.0 double_blocks.1.txt_attn.proj.weight=1.0 double_blocks.1.txt_attn.proj.bias=1.0 double_blocks.1.txt_mlp.0.weight=1.0 double_blocks.1.txt_mlp.0.bias=1.0 double_blocks.1.txt_mlp.2.weight=1.0 double_blocks.1.txt_mlp.2.bias=1.0 double_blocks.2.img_mod.lin.weight=1.0 double_blocks.2.img_mod.lin.bias=1.0 double_blocks.2.img_attn.qkv.weight=1.0 double_blocks.2.img_attn.qkv.bias=1.0 double_blocks.2.img_attn.proj.weight=1.0 double_blocks.2.img_attn.proj.bias=1.0 double_blocks.2.img_mlp.0.weight=1.0 double_blocks.2.img_mlp.0.bias=1.0 double_blocks.2.img_mlp.2.weight=1.0 double_blocks.2.img_mlp.2.bias=1.0 double_blocks.2.txt_mod.lin.weight=1.0 double_blocks.2.txt_mod.lin.bias=1.0 double_blocks.2.txt_attn.qkv.weight=1.0 double_blocks.2.txt_attn.qkv.bias=1.0 double_blocks.2.txt_attn.proj.weight=1.0 double_blocks.2.txt_attn.proj.bias=1.0 double_blocks.2.txt_mlp.0.weight=1.0 double_blocks.2.txt_mlp.0.bias=1.0 double_blocks.2.txt_mlp.2.weight=1.0 double_blocks.2.txt_mlp.2.bias=1.0 double_blocks.3.img_mod.lin.weight=1.0 double_blocks.3.img_mod.lin.bias=1.0 double_blocks.3.img_attn.qkv.weight=1.0 double_blocks.3.img_attn.qkv.bias=1.0 double_blocks.3.img_attn.proj.weight=1.0 double_blocks.3.img_attn.proj.bias=1.0 double_blocks.3.img_mlp.0.weight=1.0 double_blocks.3.img_mlp.0.bias=1.0 double_blocks.3.img_mlp.2.weight=1.0 double_blocks.3.img_mlp.2.bias=1.0 double_blocks.3.txt_mod.lin.weight=1.0 double_blocks.3.txt_mod.lin.bias=1.0 double_blocks.3.txt_attn.qkv.weight=1.0 double_blocks.3.txt_attn.qkv.bias=1.0 double_blocks.3.txt_attn.proj.weight=1.0 double_blocks.3.txt_attn.proj.bias=1.0 double_blocks.3.txt_mlp.0.weight=1.0 double_blocks.3.txt_mlp.0.bias=1.0 double_blocks.3.txt_mlp.2.weight=1.0 double_blocks.3.txt_mlp.2.bias=1.0 double_blocks.4.img_mod.lin.weight=1.0 double_blocks.4.img_mod.lin.bias=1.0 double_blocks.4.img_attn.qkv.weight=1.0 double_blocks.4.img_attn.qkv.bias=1.0 double_blocks.4.img_attn.proj.weight=1.0 double_blocks.4.img_attn.proj.bias=1.0 double_blocks.4.img_mlp.0.weight=1.0 double_blocks.4.img_mlp.0.bias=1.0 double_blocks.4.img_mlp.2.weight=1.0 double_blocks.4.img_mlp.2.bias=1.0 double_blocks.4.txt_mod.lin.weight=1.0 double_blocks.4.txt_mod.lin.bias=1.0 double_blocks.4.txt_attn.qkv.weight=1.0 
double_blocks.4.txt_attn.qkv.bias=1.0 double_blocks.4.txt_attn.proj.weight=1.0 double_blocks.4.txt_attn.proj.bias=1.0 double_blocks.4.txt_mlp.0.weight=1.0 double_blocks.4.txt_mlp.0.bias=1.0 double_blocks.4.txt_mlp.2.weight=1.0 double_blocks.4.txt_mlp.2.bias=1.0 double_blocks.5.img_mod.lin.weight=1.0 double_blocks.5.img_mod.lin.bias=1.0 double_blocks.5.img_attn.qkv.weight=1.0 double_blocks.5.img_attn.qkv.bias=1.0 double_blocks.5.img_attn.proj.weight=1.0 double_blocks.5.img_attn.proj.bias=1.0 double_blocks.5.img_mlp.0.weight=1.0 double_blocks.5.img_mlp.0.bias=1.0 double_blocks.5.img_mlp.2.weight=1.0 double_blocks.5.img_mlp.2.bias=1.0 double_blocks.5.txt_mod.lin.weight=1.0 double_blocks.5.txt_mod.lin.bias=1.0 double_blocks.5.txt_attn.qkv.weight=1.0 double_blocks.5.txt_attn.qkv.bias=1.0 double_blocks.5.txt_attn.proj.weight=1.0 double_blocks.5.txt_attn.proj.bias=1.0 double_blocks.5.txt_mlp.0.weight=1.0 double_blocks.5.txt_mlp.0.bias=1.0 double_blocks.5.txt_mlp.2.weight=1.0 double_blocks.5.txt_mlp.2.bias=1.0 double_blocks.6.img_mod.lin.weight=1.0 double_blocks.6.img_mod.lin.bias=1.0 double_blocks.6.img_attn.qkv.weight=1.0 double_blocks.6.img_attn.qkv.bias=1.0 double_blocks.6.img_attn.proj.weight=1.0 double_blocks.6.img_attn.proj.bias=1.0 double_blocks.6.img_mlp.0.weight=1.0 double_blocks.6.img_mlp.0.bias=1.0 double_blocks.6.img_mlp.2.weight=1.0 double_blocks.6.img_mlp.2.bias=1.0 double_blocks.6.txt_mod.lin.weight=1.0 double_blocks.6.txt_mod.lin.bias=1.0 double_blocks.6.txt_attn.qkv.weight=1.0 double_blocks.6.txt_attn.qkv.bias=1.0 double_blocks.6.txt_attn.proj.weight=1.0 double_blocks.6.txt_attn.proj.bias=1.0 double_blocks.6.txt_mlp.0.weight=1.0 double_blocks.6.txt_mlp.0.bias=1.0 double_blocks.6.txt_mlp.2.weight=1.0 double_blocks.6.txt_mlp.2.bias=1.0 double_blocks.7.img_mod.lin.weight=1.0 double_blocks.7.img_mod.lin.bias=1.0 double_blocks.7.img_attn.qkv.weight=1.0 double_blocks.7.img_attn.qkv.bias=1.0 double_blocks.7.img_attn.proj.weight=1.0 double_blocks.7.img_attn.proj.bias=1.0 double_blocks.7.img_mlp.0.weight=1.0 double_blocks.7.img_mlp.0.bias=1.0 double_blocks.7.img_mlp.2.weight=1.0 double_blocks.7.img_mlp.2.bias=1.0 double_blocks.7.txt_mod.lin.weight=1.0 double_blocks.7.txt_mod.lin.bias=1.0 double_blocks.7.txt_attn.qkv.weight=1.0 double_blocks.7.txt_attn.qkv.bias=1.0 double_blocks.7.txt_attn.proj.weight=1.0 double_blocks.7.txt_attn.proj.bias=1.0 double_blocks.7.txt_mlp.0.weight=1.0 double_blocks.7.txt_mlp.0.bias=1.0 double_blocks.7.txt_mlp.2.weight=1.0 double_blocks.7.txt_mlp.2.bias=1.0 double_blocks.8.img_mod.lin.weight=1.0 double_blocks.8.img_mod.lin.bias=1.0 double_blocks.8.img_attn.qkv.weight=1.0 double_blocks.8.img_attn.qkv.bias=1.0 double_blocks.8.img_attn.proj.weight=1.0 double_blocks.8.img_attn.proj.bias=1.0 double_blocks.8.img_mlp.0.weight=1.0 double_blocks.8.img_mlp.0.bias=1.0 double_blocks.8.img_mlp.2.weight=1.0 double_blocks.8.img_mlp.2.bias=1.0 double_blocks.8.txt_mod.lin.weight=1.0 double_blocks.8.txt_mod.lin.bias=1.0 double_blocks.8.txt_attn.qkv.weight=1.0 double_blocks.8.txt_attn.qkv.bias=1.0 double_blocks.8.txt_attn.proj.weight=1.0 double_blocks.8.txt_attn.proj.bias=1.0 double_blocks.8.txt_mlp.0.weight=1.0 double_blocks.8.txt_mlp.0.bias=1.0 double_blocks.8.txt_mlp.2.weight=1.0 double_blocks.8.txt_mlp.2.bias=1.0 double_blocks.9.img_mod.lin.weight=1.0 double_blocks.9.img_mod.lin.bias=1.0 double_blocks.9.img_attn.qkv.weight=1.0 double_blocks.9.img_attn.qkv.bias=1.0 double_blocks.9.img_attn.proj.weight=1.0 double_blocks.9.img_attn.proj.bias=1.0 
double_blocks.9.img_mlp.0.weight=1.0 double_blocks.9.img_mlp.0.bias=1.0 double_blocks.9.img_mlp.2.weight=1.0 double_blocks.9.img_mlp.2.bias=1.0 double_blocks.9.txt_mod.lin.weight=1.0 double_blocks.9.txt_mod.lin.bias=1.0 double_blocks.9.txt_attn.qkv.weight=1.0 double_blocks.9.txt_attn.qkv.bias=1.0 double_blocks.9.txt_attn.proj.weight=1.0 double_blocks.9.txt_attn.proj.bias=1.0 double_blocks.9.txt_mlp.0.weight=1.0 double_blocks.9.txt_mlp.0.bias=1.0 double_blocks.9.txt_mlp.2.weight=1.0 double_blocks.9.txt_mlp.2.bias=1.0 double_blocks.10.img_mod.lin.weight=1.0 double_blocks.10.img_mod.lin.bias=1.0 double_blocks.10.img_attn.qkv.weight=1.0 double_blocks.10.img_attn.qkv.bias=1.0 double_blocks.10.img_attn.proj.weight=1.0 double_blocks.10.img_attn.proj.bias=1.0 double_blocks.10.img_mlp.0.weight=1.0 double_blocks.10.img_mlp.0.bias=1.0 double_blocks.10.img_mlp.2.weight=1.0 double_blocks.10.img_mlp.2.bias=1.0 double_blocks.10.txt_mod.lin.weight=1.0 double_blocks.10.txt_mod.lin.bias=1.0 double_blocks.10.txt_attn.qkv.weight=1.0 double_blocks.10.txt_attn.qkv.bias=1.0 double_blocks.10.txt_attn.proj.weight=1.0 double_blocks.10.txt_attn.proj.bias=1.0 double_blocks.10.txt_mlp.0.weight=1.0 double_blocks.10.txt_mlp.0.bias=1.0 double_blocks.10.txt_mlp.2.weight=1.0 double_blocks.10.txt_mlp.2.bias=1.0 double_blocks.11.img_mod.lin.weight=1.0 double_blocks.11.img_mod.lin.bias=1.0 double_blocks.11.img_attn.qkv.weight=1.0 double_blocks.11.img_attn.qkv.bias=1.0 double_blocks.11.img_attn.proj.weight=1.0 double_blocks.11.img_attn.proj.bias=1.0 double_blocks.11.img_mlp.0.weight=1.0 double_blocks.11.img_mlp.0.bias=1.0 double_blocks.11.img_mlp.2.weight=1.0 double_blocks.11.img_mlp.2.bias=1.0 double_blocks.11.txt_mod.lin.weight=1.0 double_blocks.11.txt_mod.lin.bias=1.0 double_blocks.11.txt_attn.qkv.weight=1.0 double_blocks.11.txt_attn.qkv.bias=1.0 double_blocks.11.txt_attn.proj.weight=1.0 double_blocks.11.txt_attn.proj.bias=1.0 double_blocks.11.txt_mlp.0.weight=1.0 double_blocks.11.txt_mlp.0.bias=1.0 double_blocks.11.txt_mlp.2.weight=1.0 double_blocks.11.txt_mlp.2.bias=1.0 double_blocks.12.img_mod.lin.weight=1.0 double_blocks.12.img_mod.lin.bias=1.0 double_blocks.12.img_attn.qkv.weight=1.0 double_blocks.12.img_attn.qkv.bias=1.0 double_blocks.12.img_attn.proj.weight=1.0 double_blocks.12.img_attn.proj.bias=1.0 double_blocks.12.img_mlp.0.weight=1.0 double_blocks.12.img_mlp.0.bias=1.0 double_blocks.12.img_mlp.2.weight=1.0 double_blocks.12.img_mlp.2.bias=1.0 double_blocks.12.txt_mod.lin.weight=1.0 double_blocks.12.txt_mod.lin.bias=1.0 double_blocks.12.txt_attn.qkv.weight=1.0 double_blocks.12.txt_attn.qkv.bias=1.0 double_blocks.12.txt_attn.proj.weight=1.0 double_blocks.12.txt_attn.proj.bias=1.0 double_blocks.12.txt_mlp.0.weight=1.0 double_blocks.12.txt_mlp.0.bias=1.0 double_blocks.12.txt_mlp.2.weight=1.0 double_blocks.12.txt_mlp.2.bias=1.0 double_blocks.13.img_mod.lin.weight=1.0 double_blocks.13.img_mod.lin.bias=1.0 double_blocks.13.img_attn.qkv.weight=1.0 double_blocks.13.img_attn.qkv.bias=1.0 double_blocks.13.img_attn.proj.weight=1.0 double_blocks.13.img_attn.proj.bias=1.0 double_blocks.13.img_mlp.0.weight=1.0 double_blocks.13.img_mlp.0.bias=1.0 double_blocks.13.img_mlp.2.weight=1.0 double_blocks.13.img_mlp.2.bias=1.0 double_blocks.13.txt_mod.lin.weight=1.0 double_blocks.13.txt_mod.lin.bias=1.0 double_blocks.13.txt_attn.qkv.weight=1.0 double_blocks.13.txt_attn.qkv.bias=1.0 double_blocks.13.txt_attn.proj.weight=1.0 double_blocks.13.txt_attn.proj.bias=1.0 double_blocks.13.txt_mlp.0.weight=1.0 double_blocks.13.txt_mlp.0.bias=1.0 
double_blocks.13.txt_mlp.2.weight=1.0 double_blocks.13.txt_mlp.2.bias=1.0 double_blocks.14.img_mod.lin.weight=1.0 double_blocks.14.img_mod.lin.bias=1.0 double_blocks.14.img_attn.qkv.weight=1.0 double_blocks.14.img_attn.qkv.bias=1.0 double_blocks.14.img_attn.proj.weight=1.0 double_blocks.14.img_attn.proj.bias=1.0 double_blocks.14.img_mlp.0.weight=1.0 double_blocks.14.img_mlp.0.bias=1.0 double_blocks.14.img_mlp.2.weight=1.0 double_blocks.14.img_mlp.2.bias=1.0 double_blocks.14.txt_mod.lin.weight=1.0 double_blocks.14.txt_mod.lin.bias=1.0 double_blocks.14.txt_attn.qkv.weight=1.0 double_blocks.14.txt_attn.qkv.bias=1.0 double_blocks.14.txt_attn.proj.weight=1.0 double_blocks.14.txt_attn.proj.bias=1.0 double_blocks.14.txt_mlp.0.weight=1.0 double_blocks.14.txt_mlp.0.bias=1.0 double_blocks.14.txt_mlp.2.weight=1.0 double_blocks.14.txt_mlp.2.bias=1.0 double_blocks.15.img_mod.lin.weight=1.0 double_blocks.15.img_mod.lin.bias=1.0 double_blocks.15.img_attn.qkv.weight=1.0 double_blocks.15.img_attn.qkv.bias=1.0 double_blocks.15.img_attn.proj.weight=1.0 double_blocks.15.img_attn.proj.bias=1.0 double_blocks.15.img_mlp.0.weight=1.0 double_blocks.15.img_mlp.0.bias=1.0 double_blocks.15.img_mlp.2.weight=1.0 double_blocks.15.img_mlp.2.bias=1.0 double_blocks.15.txt_mod.lin.weight=1.0 double_blocks.15.txt_mod.lin.bias=1.0 double_blocks.15.txt_attn.qkv.weight=1.0 double_blocks.15.txt_attn.qkv.bias=1.0 double_blocks.15.txt_attn.proj.weight=1.0 double_blocks.15.txt_attn.proj.bias=1.0 double_blocks.15.txt_mlp.0.weight=1.0 double_blocks.15.txt_mlp.0.bias=1.0 double_blocks.15.txt_mlp.2.weight=1.0 double_blocks.15.txt_mlp.2.bias=1.0 double_blocks.16.img_mod.lin.weight=1.0 double_blocks.16.img_mod.lin.bias=1.0 double_blocks.16.img_attn.qkv.weight=1.0 double_blocks.16.img_attn.qkv.bias=1.0 double_blocks.16.img_attn.proj.weight=1.0 double_blocks.16.img_attn.proj.bias=1.0 double_blocks.16.img_mlp.0.weight=1.0 double_blocks.16.img_mlp.0.bias=1.0 double_blocks.16.img_mlp.2.weight=1.0 double_blocks.16.img_mlp.2.bias=1.0 double_blocks.16.txt_mod.lin.weight=1.0 double_blocks.16.txt_mod.lin.bias=1.0 double_blocks.16.txt_attn.qkv.weight=1.0 double_blocks.16.txt_attn.qkv.bias=1.0 double_blocks.16.txt_attn.proj.weight=1.0 double_blocks.16.txt_attn.proj.bias=1.0 double_blocks.16.txt_mlp.0.weight=1.0 double_blocks.16.txt_mlp.0.bias=1.0 double_blocks.16.txt_mlp.2.weight=1.0 double_blocks.16.txt_mlp.2.bias=1.0 double_blocks.17.img_mod.lin.weight=1.0 double_blocks.17.img_mod.lin.bias=1.0 double_blocks.17.img_attn.qkv.weight=1.0 double_blocks.17.img_attn.qkv.bias=1.0 double_blocks.17.img_attn.proj.weight=1.0 double_blocks.17.img_attn.proj.bias=1.0 double_blocks.17.img_mlp.0.weight=1.0 double_blocks.17.img_mlp.0.bias=1.0 double_blocks.17.img_mlp.2.weight=1.0 double_blocks.17.img_mlp.2.bias=1.0 double_blocks.17.txt_mod.lin.weight=1.0 double_blocks.17.txt_mod.lin.bias=1.0 double_blocks.17.txt_attn.qkv.weight=1.0 double_blocks.17.txt_attn.qkv.bias=1.0 double_blocks.17.txt_attn.proj.weight=1.0 double_blocks.17.txt_attn.proj.bias=1.0 double_blocks.17.txt_mlp.0.weight=1.0 double_blocks.17.txt_mlp.0.bias=1.0 double_blocks.17.txt_mlp.2.weight=1.0 double_blocks.17.txt_mlp.2.bias=1.0 double_blocks.18.img_mod.lin.weight=1.0 double_blocks.18.img_mod.lin.bias=1.0 double_blocks.18.img_attn.qkv.weight=1.0 double_blocks.18.img_attn.qkv.bias=1.0 double_blocks.18.img_attn.proj.weight=1.0 double_blocks.18.img_attn.proj.bias=1.0 double_blocks.18.img_mlp.0.weight=1.0 double_blocks.18.img_mlp.0.bias=1.0 double_blocks.18.img_mlp.2.weight=1.0 
double_blocks.18.img_mlp.2.bias=1.0 double_blocks.18.txt_mod.lin.weight=1.0 double_blocks.18.txt_mod.lin.bias=1.0 double_blocks.18.txt_attn.qkv.weight=1.0 double_blocks.18.txt_attn.qkv.bias=1.0 double_blocks.18.txt_attn.proj.weight=1.0 double_blocks.18.txt_attn.proj.bias=1.0 double_blocks.18.txt_mlp.0.weight=1.0 double_blocks.18.txt_mlp.0.bias=1.0 double_blocks.18.txt_mlp.2.weight=1.0 double_blocks.18.txt_mlp.2.bias=1.0 single_blocks.0.linear1.weight=1.0 single_blocks.0.linear1.bias=1.0 single_blocks.0.linear2.weight=1.0 single_blocks.0.linear2.bias=1.0 single_blocks.0.modulation.lin.weight=1.0 single_blocks.0.modulation.lin.bias=1.0 single_blocks.1.linear1.weight=1.0 single_blocks.1.linear1.bias=1.0 single_blocks.1.linear2.weight=1.0 single_blocks.1.linear2.bias=1.0 single_blocks.1.modulation.lin.weight=1.0 single_blocks.1.modulation.lin.bias=1.0 single_blocks.2.linear1.weight=1.0 single_blocks.2.linear1.bias=1.0 single_blocks.2.linear2.weight=1.0 single_blocks.2.linear2.bias=1.0 single_blocks.2.modulation.lin.weight=1.0 single_blocks.2.modulation.lin.bias=1.0 single_blocks.3.linear1.weight=1.0 single_blocks.3.linear1.bias=1.0 single_blocks.3.linear2.weight=1.0 single_blocks.3.linear2.bias=1.0 single_blocks.3.modulation.lin.weight=1.0 single_blocks.3.modulation.lin.bias=1.0 single_blocks.4.linear1.weight=1.0 single_blocks.4.linear1.bias=1.0 single_blocks.4.linear2.weight=1.0 single_blocks.4.linear2.bias=1.0 single_blocks.4.modulation.lin.weight=1.0 single_blocks.4.modulation.lin.bias=1.0 single_blocks.5.linear1.weight=1.0 single_blocks.5.linear1.bias=1.0 single_blocks.5.linear2.weight=1.0 single_blocks.5.linear2.bias=1.0 single_blocks.5.modulation.lin.weight=1.0 single_blocks.5.modulation.lin.bias=1.0 single_blocks.6.linear1.weight=1.0 single_blocks.6.linear1.bias=1.0 single_blocks.6.linear2.weight=1.0 single_blocks.6.linear2.bias=1.0 single_blocks.6.modulation.lin.weight=1.0 single_blocks.6.modulation.lin.bias=1.0 single_blocks.7.linear1.weight=1.0 single_blocks.7.linear1.bias=1.0 single_blocks.7.linear2.weight=1.0 single_blocks.7.linear2.bias=1.0 single_blocks.7.modulation.lin.weight=1.0 single_blocks.7.modulation.lin.bias=1.0 single_blocks.8.linear1.weight=1.0 single_blocks.8.linear1.bias=1.0 single_blocks.8.linear2.weight=1.0 single_blocks.8.linear2.bias=1.0 single_blocks.8.modulation.lin.weight=1.0 single_blocks.8.modulation.lin.bias=1.0 single_blocks.9.linear1.weight=1.0 single_blocks.9.linear1.bias=1.0 single_blocks.9.linear2.weight=1.0 single_blocks.9.linear2.bias=1.0 single_blocks.9.modulation.lin.weight=1.0 single_blocks.9.modulation.lin.bias=1.0 single_blocks.10.linear1.weight=1.0 single_blocks.10.linear1.bias=1.0 single_blocks.10.linear2.weight=1.0 single_blocks.10.linear2.bias=1.0 single_blocks.10.modulation.lin.weight=1.0 single_blocks.10.modulation.lin.bias=1.0 single_blocks.11.linear1.weight=1.0 single_blocks.11.linear1.bias=1.0 single_blocks.11.linear2.weight=1.0 single_blocks.11.linear2.bias=1.0 single_blocks.11.modulation.lin.weight=1.0 single_blocks.11.modulation.lin.bias=1.0 single_blocks.12.linear1.weight=1.0 single_blocks.12.linear1.bias=1.0 single_blocks.12.linear2.weight=1.0 single_blocks.12.linear2.bias=1.0 single_blocks.12.modulation.lin.weight=1.0 single_blocks.12.modulation.lin.bias=1.0 single_blocks.13.linear1.weight=1.0 single_blocks.13.linear1.bias=1.0 single_blocks.13.linear2.weight=1.0 single_blocks.13.linear2.bias=1.0 single_blocks.13.modulation.lin.weight=1.0 single_blocks.13.modulation.lin.bias=1.0 single_blocks.14.linear1.weight=1.0 
single_blocks.14.linear1.bias=1.0 single_blocks.14.linear2.weight=1.0 single_blocks.14.linear2.bias=1.0 single_blocks.14.modulation.lin.weight=1.0 single_blocks.14.modulation.lin.bias=1.0 single_blocks.15.linear1.weight=1.0 single_blocks.15.linear1.bias=1.0 single_blocks.15.linear2.weight=1.0 single_blocks.15.linear2.bias=1.0 single_blocks.15.modulation.lin.weight=1.0 single_blocks.15.modulation.lin.bias=1.0 single_blocks.16.linear1.weight=1.0 single_blocks.16.linear1.bias=1.0 single_blocks.16.linear2.weight=1.0 single_blocks.16.linear2.bias=1.0 single_blocks.16.modulation.lin.weight=1.0 single_blocks.16.modulation.lin.bias=1.0 single_blocks.17.linear1.weight=1.0 single_blocks.17.linear1.bias=1.0 single_blocks.17.linear2.weight=1.0 single_blocks.17.linear2.bias=1.0 single_blocks.17.modulation.lin.weight=1.0 single_blocks.17.modulation.lin.bias=1.0 single_blocks.18.linear1.weight=1.0 single_blocks.18.linear1.bias=1.0 single_blocks.18.linear2.weight=1.0 single_blocks.18.linear2.bias=1.0 single_blocks.18.modulation.lin.weight=1.0 single_blocks.18.modulation.lin.bias=1.0 single_blocks.19.linear1.weight=1.0 single_blocks.19.linear1.bias=1.0 single_blocks.19.linear2.weight=1.0 single_blocks.19.linear2.bias=1.0 single_blocks.19.modulation.lin.weight=1.0 single_blocks.19.modulation.lin.bias=1.0 single_blocks.20.linear1.weight=1.0 single_blocks.20.linear1.bias=1.0 single_blocks.20.linear2.weight=1.0 single_blocks.20.linear2.bias=1.0 single_blocks.20.modulation.lin.weight=1.0 single_blocks.20.modulation.lin.bias=1.0 single_blocks.21.linear1.weight=1.0 single_blocks.21.linear1.bias=1.0 single_blocks.21.linear2.weight=1.0 single_blocks.21.linear2.bias=1.0 single_blocks.21.modulation.lin.weight=1.0 single_blocks.21.modulation.lin.bias=1.0 single_blocks.22.linear1.weight=1.0 single_blocks.22.linear1.bias=1.0 single_blocks.22.linear2.weight=1.0 single_blocks.22.linear2.bias=1.0 single_blocks.22.modulation.lin.weight=1.0 single_blocks.22.modulation.lin.bias=1.0 single_blocks.23.linear1.weight=1.0 single_blocks.23.linear1.bias=1.0 single_blocks.23.linear2.weight=1.0 single_blocks.23.linear2.bias=1.0 single_blocks.23.modulation.lin.weight=1.0 single_blocks.23.modulation.lin.bias=1.0 single_blocks.24.linear1.weight=1.0 single_blocks.24.linear1.bias=1.0 single_blocks.24.linear2.weight=1.0 single_blocks.24.linear2.bias=1.0 single_blocks.24.modulation.lin.weight=1.0 single_blocks.24.modulation.lin.bias=1.0 single_blocks.25.linear1.weight=1.0 single_blocks.25.linear1.bias=1.0 single_blocks.25.linear2.weight=1.0 single_blocks.25.linear2.bias=1.0 single_blocks.25.modulation.lin.weight=1.0 single_blocks.25.modulation.lin.bias=1.0 single_blocks.26.linear1.weight=1.0 single_blocks.26.linear1.bias=1.0 single_blocks.26.linear2.weight=1.0 single_blocks.26.linear2.bias=1.0 single_blocks.26.modulation.lin.weight=1.0 single_blocks.26.modulation.lin.bias=1.0 single_blocks.27.linear1.weight=1.0 single_blocks.27.linear1.bias=1.0 single_blocks.27.linear2.weight=1.0 single_blocks.27.linear2.bias=1.0 single_blocks.27.modulation.lin.weight=1.0 single_blocks.27.modulation.lin.bias=1.0 single_blocks.28.linear1.weight=1.0 single_blocks.28.linear1.bias=1.0 single_blocks.28.linear2.weight=1.0 single_blocks.28.linear2.bias=1.0 single_blocks.28.modulation.lin.weight=1.0 single_blocks.28.modulation.lin.bias=1.0 single_blocks.29.linear1.weight=1.0 single_blocks.29.linear1.bias=1.0 single_blocks.29.linear2.weight=1.0 single_blocks.29.linear2.bias=1.0 single_blocks.29.modulation.lin.weight=1.0 single_blocks.29.modulation.lin.bias=1.0 
single_blocks.30.linear1.weight=1.0 single_blocks.30.linear1.bias=1.0 single_blocks.30.linear2.weight=1.0 single_blocks.30.linear2.bias=1.0 single_blocks.30.modulation.lin.weight=1.0 single_blocks.30.modulation.lin.bias=1.0 single_blocks.31.linear1.weight=1.0 single_blocks.31.linear1.bias=1.0 single_blocks.31.linear2.weight=1.0 single_blocks.31.linear2.bias=1.0 single_blocks.31.modulation.lin.weight=1.0 single_blocks.31.modulation.lin.bias=1.0 single_blocks.32.linear1.weight=1.0 single_blocks.32.linear1.bias=1.0 single_blocks.32.linear2.weight=1.0 single_blocks.32.linear2.bias=1.0 single_blocks.32.modulation.lin.weight=1.0 single_blocks.32.modulation.lin.bias=1.0 single_blocks.33.linear1.weight=1.0 single_blocks.33.linear1.bias=1.0 single_blocks.33.linear2.weight=1.0 single_blocks.33.linear2.bias=1.0 single_blocks.33.modulation.lin.weight=1.0 single_blocks.33.modulation.lin.bias=1.0 single_blocks.34.linear1.weight=1.0 single_blocks.34.linear1.bias=1.0 single_blocks.34.linear2.weight=1.0 single_blocks.34.linear2.bias=1.0 single_blocks.34.modulation.lin.weight=1.0 single_blocks.34.modulation.lin.bias=1.0 single_blocks.35.linear1.weight=1.0 single_blocks.35.linear1.bias=1.0 single_blocks.35.linear2.weight=1.0 single_blocks.35.linear2.bias=1.0 single_blocks.35.modulation.lin.weight=1.0 single_blocks.35.modulation.lin.bias=1.0 single_blocks.36.linear1.weight=1.0 single_blocks.36.linear1.bias=1.0 single_blocks.36.linear2.weight=1.0 single_blocks.36.linear2.bias=1.0 single_blocks.36.modulation.lin.weight=1.0 single_blocks.36.modulation.lin.bias=1.0 single_blocks.37.linear1.weight=1.0 single_blocks.37.linear1.bias=1.0 single_blocks.37.linear2.weight=1.0 single_blocks.37.linear2.bias=1.0 single_blocks.37.modulation.lin.weight=1.0 single_blocks.37.modulation.lin.bias
None
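
The default value above suggests the patch format: whitespace-separated `layer.name=value` pairs covering every double_blocks and single_blocks tensor in the Flux transformer. A minimal parsing sketch, assuming that format (the semantics of the value are covered under the fields below):

```python
def parse_patch_string(patch: str) -> dict[str, float]:
    """Parse whitespace-separated `layer.name=value` pairs into a mapping
    from Flux layer name to its patch value."""
    values: dict[str, float] = {}
    for pair in patch.split():
        name, _, raw_value = pair.partition("=")
        values[name] = float(raw_value)
    return values

# Every entry in the default string maps to 1.0, i.e. the layer is unchanged:
# parse_patch_string("double_blocks.0.img_mod.lin.weight=1.0 ...")
```
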
flux_model_layers_to_patch
string (enum)
all

Options:

all, custom (use a regular expression), double_block layers, double_block text layers, double_block image layers, single_block layers, attn (attention) layers, mlp (multi-layer perceptron) layers, mod (modulation) layers, proj (projection) layers, qkv (query, key, value) layers, lin (linear) layers, double_block 0, double_block 1, double_block 2, double_block 3, double_block 4, double_block 5, double_block 6, double_block 7, double_block 8, double_block 9, double_block 10, double_block 11, double_block 12, double_block 13, double_block 14, double_block 15, double_block 16, double_block 17, double_block 18, single_block 0, single_block 1, single_block 2, single_block 3, single_block 4, single_block 5, single_block 6, single_block 7, single_block 8, single_block 9, single_block 10, single_block 11, single_block 12, single_block 13, single_block 14, single_block 15, single_block 16, single_block 17, single_block 18, single_block 19, single_block 20, single_block 21, single_block 22, single_block 23, single_block 24, single_block 25, single_block 26, single_block 27, single_block 28, single_block 29, single_block 30, single_block 31, single_block 32, single_block 33, single_block 34, single_block 35, single_block 36, single_block 37

An enumeration.
flux_model_layers_value
number
1.01

Max: 2

The new value for the selected layers.
weights_and_biases_to_patch
string (enum)
weights and biases

Options:

weights and biases, just weights, just biases

An enumeration.
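
The three fields above (flux_model_layers_to_patch, flux_model_layers_value, and weights_and_biases_to_patch) work together: a preset selects a subset of the Flux layers, and the value is applied to the selected weights and/or biases. The sketch below assumes each preset selects layer names containing a given substring and that the value acts as a multiplicative scale on the selected tensors; both are assumptions inferred from the option names and the all-1.0 defaults, not stated by the schema.

```python
import torch

# Hypothetical mapping for a few of the presets, inferred from the option
# names and the layer names in the default patch string above.
PRESET_SUBSTRINGS = {
    "attn (attention) layers": "attn",
    "mlp (multi-layer perceptron) layers": "mlp",
    "mod (modulation) layers": "mod",
    "double_block image layers": ".img_",
    "double_block text layers": ".txt_",
    "single_block layers": "single_blocks.",
}


def patch_selected_layers(state_dict: dict[str, torch.Tensor],
                          preset: str,
                          value: float,
                          weights_and_biases: str = "weights and biases") -> int:
    """Scale the tensors selected by `preset` in place; return how many were touched.

    Assumes `value` is a multiplier, so the 1.0 defaults leave the model unchanged.
    """
    substring = PRESET_SUBSTRINGS[preset]
    patched = 0
    for name, tensor in state_dict.items():
        if substring not in name:
            continue
        if weights_and_biases == "just weights" and not name.endswith(".weight"):
            continue
        if weights_and_biases == "just biases" and not name.endswith(".bias"):
            continue
        tensor.mul_(value)
        patched += 1
    return patched
```
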
flux_model_layers_regular_expressions
array
[]
A set of regular expressions and their target values. Each entry should be a regular expression followed by `=` and a float target value, for example: `double_blocks\.([0-9]+)\.(img|txt)_(mod|attn|mlp)\.(lin|qkv|proj|0|2)\.(weight|bias)=1.01`
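
For the "custom (use a regular expression)" option, entries of this form can be resolved against the full list of layer names. A sketch under the assumption that each pattern must match a whole layer name and that later entries win when patterns overlap (the schema does not state the matching or precedence rules):

```python
import re


def resolve_regex_patches(layer_names: list[str],
                          expressions: list[str]) -> dict[str, float]:
    """Expand `pattern=value` entries into concrete layer-name/value pairs."""
    resolved: dict[str, float] = {}
    for entry in expressions:
        pattern, _, raw_value = entry.rpartition("=")  # the value follows the last '='
        compiled = re.compile(pattern)
        for name in layer_names:
            if compiled.fullmatch(name):
                resolved[name] = float(raw_value)
    return resolved


# Using the example pattern from the description above:
# resolve_regex_patches(
#     ["double_blocks.0.img_mod.lin.weight", "single_blocks.0.linear1.weight"],
#     [r"double_blocks\.([0-9]+)\.(img|txt)_(mod|attn|mlp)\.(lin|qkv|proj|0|2)\.(weight|bias)=1.01"],
# )  # -> {"double_blocks.0.img_mod.lin.weight": 1.01}
```
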
aspect_ratio
string (enum)
1:1

Options:

1:1, 16:9, 21:9, 3:2, 2:3, 4:5, 5:4, 3:4, 4:3, 9:16, 9:21

Aspect ratio for the generated image. The size will always be 1 megapixel, i.e. 1024x1024 if the aspect ratio is 1:1.
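
Since the output is always roughly one megapixel, the width and height follow from the aspect ratio. A sketch of that arithmetic, assuming dimensions are rounded to a multiple of 16 (typical for Flux-based models; the exact rounding this model uses is not documented here):

```python
import math


def dimensions_for_aspect_ratio(aspect_ratio: str,
                                target_pixels: int = 1024 * 1024,
                                multiple: int = 16) -> tuple[int, int]:
    """Approximate (width, height) for ~1 megapixel at the given aspect ratio."""
    w_part, h_part = (int(part) for part in aspect_ratio.split(":"))
    width = math.sqrt(target_pixels * w_part / h_part)
    height = target_pixels / width

    def snap(value: float) -> int:
        return max(multiple, round(value / multiple) * multiple)

    return snap(width), snap(height)


# dimensions_for_aspect_ratio("1:1")  -> (1024, 1024)
# dimensions_for_aspect_ratio("16:9") -> (1360, 768)
```
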
num_outputs
integer
1

Min: 1

Max: 4

Number of images to output.
num_inference_steps
integer
28

Min: 1

Max: 50

Number of inference steps. More steps can give more detailed images, but take longer.
guidance_scale
number
3

Max: 10

Guidance scale for the diffusion process. Lower values can give more realistic images. Good values to try are 2, 2.5, 3, and 3.5.
max_shift
number
1.15

Max: 10

Maximum shift for the timestep schedule.
base_shift
number
0.5

Max: 10

Base shift for the timestep schedule.
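
The defaults of max_shift and base_shift match the schedule-shift parameters in the reference Flux sampler, where a shift value mu is interpolated between base_shift (at 256 image tokens) and max_shift (at 4096 image tokens) and then reshapes the sigma schedule. Assuming this model follows that reference behaviour, the shift looks roughly like this:

```python
import math


def shift_sigma(sigma: float, num_image_tokens: int,
                base_shift: float = 0.5, max_shift: float = 1.15) -> float:
    """Apply a Flux-style schedule shift to a single sigma value.

    mu is interpolated linearly between (256 tokens, base_shift) and
    (4096 tokens, max_shift), so larger images get a stronger shift.
    """
    slope = (max_shift - base_shift) / (4096 - 256)
    mu = base_shift + slope * (num_image_tokens - 256)
    return math.exp(mu) / (math.exp(mu) + (1 / sigma - 1))


# A 1024x1024 image corresponds to 4096 latent tokens (64x64 patches), so mu == max_shift:
# shift_sigma(0.5, 4096) ≈ 0.76, i.e. more sampling steps are spent at high noise levels.
```
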
output_format
string (enum)
webp

Options:

webp, jpg, png

Format of the output images.
output_quality
integer
95

Max: 100

Quality of the output images, from 0 to 100. 100 is best quality, 0 is lowest quality.
seed
integer
Set a seed for reproducibility. Random by default.
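
Putting the inputs together, a minimal call through the Replicate Python client might look like the sketch below. The version ID shown at the top of this page is shortened, so substitute the full version hash; the prompt and trigger word are placeholders.

```python
import replicate  # pip install replicate; requires REPLICATE_API_TOKEN in the environment

output = replicate.run(
    "fofr/sandbox:f6800331...",  # replace with the full version ID from this page
    input={
        "prompt": "a photo of TOK in a forest",  # TOK is a placeholder trigger word
        "flux_model_layers_to_patch": "attn (attention) layers",
        "flux_model_layers_value": 1.01,
        "weights_and_biases_to_patch": "weights and biases",
        "aspect_ratio": "1:1",
        "num_inference_steps": 28,
        "guidance_scale": 3,
        "output_format": "webp",
    },
)
print(output)  # a list of image URLs, as described by the output schema below
```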

Output schema

The shape of the response you’ll get when you run this model with an API.

Schema
{
  "type": "array",
  "items": {
    "type": "string",
    "format": "uri"
  },
  "title": "Output"
}
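
The output is an array of image URIs, so a caller can iterate over it and download each file. A small sketch (the filename handling is illustrative only):

```python
import urllib.request


def save_outputs(urls: list[str], prefix: str = "output") -> None:
    """Download each image URI from the output array to a local file."""
    for index, url in enumerate(urls):
        extension = url.rsplit(".", 1)[-1]  # webp, jpg or png, per output_format
        with urllib.request.urlopen(url) as response:
            with open(f"{prefix}_{index}.{extension}", "wb") as handle:
                handle.write(response.read())
```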