fofr / prompt-classifier
Determines the toxicity of text-to-image prompts. A llama-13b fine-tune that returns a [SAFETY_RANKING] between 0 (safe) and 10 (toxic).
- Public
- 1.9M runs
- Latest version: 1ffac777
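
A minimal usage sketch, assuming the Replicate Python client. The `prompt` input field name, the `[PROMPT] … [SAFETY_RANKING]` wrapping, and the output handling are assumptions rather than confirmed details of this model, and the version identifier shown on this page ("1ffac777") is only a short prefix, so the latest version is used implicitly here.

```python
import replicate

# Sketch: ask the classifier to rate a text-to-image prompt.
# Assumptions: the model accepts a "prompt" input and expects the text
# wrapped as "[PROMPT] ... [SAFETY_RANKING]" so the fine-tune completes
# the ranking token with a number from 0 (safe) to 10 (toxic).
output = replicate.run(
    "fofr/prompt-classifier",  # latest version; page shows only the prefix 1ffac777
    input={"prompt": "[PROMPT] a photo of a cat wearing sunglasses [SAFETY_RANKING]"},
)

# Language models on Replicate typically stream tokens; join them into one string.
ranking = "".join(output).strip()
print(f"Safety ranking: {ranking}")  # expected: a value between 0 and 10
```

A low ranking (near 0) would indicate a safe prompt, while values toward 10 flag toxic content that a downstream text-to-image pipeline might reject.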