zsxkib/qwen2-1.5b-instruct
Qwen 2: a 1.5-billion-parameter language model from Alibaba Cloud, fine-tuned for chat completions.
Prediction
Model: zsxkib/qwen2-1.5b-instruct
Version: 18d7fe65057b30e9ba64aa311fa839dd14c43831afcbaa51155625bb0b1e07f9
ID: y4038rxajsrgj0cga1e9hysd2g
Status: Succeeded
Source: Web
Hardware: A40 (Large)
Input
- top_k: 1
- top_p: 1
- prompt: Tell me a joke about only having 1.5 billion parameters
- model_type: Qwen2-1.5B-Instruct
- temperature: 1
- system_prompt: You are a funny and helpful assistant.
- max_new_tokens: 512
- repetition_penalty: 1
Output
Why did the neural network have only 1.5 billion parameters? Because it was a tiny little network!
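The prediction above can be reproduced with the Replicate Python client. A minimal sketch, assuming the `replicate` package is installed and a `REPLICATE_API_TOKEN` environment variable is set (the input dict mirrors the parameters listed above):

```python
import os

# Input parameters as shown in the prediction above.
model_input = {
    "top_k": 1,
    "top_p": 1,
    "prompt": "Tell me a joke about only having 1.5 billion parameters",
    "model_type": "Qwen2-1.5B-Instruct",
    "temperature": 1,
    "system_prompt": "You are a funny and helpful assistant.",
    "max_new_tokens": 512,
    "repetition_penalty": 1,
}

# Only attempt the API call when a token is available.
if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate

    # replicate.run streams output tokens for language models,
    # so the result is joined into a single string.
    output = replicate.run(
        "zsxkib/qwen2-1.5b-instruct:18d7fe65057b30e9ba64aa311fa839dd14c43831afcbaa51155625bb0b1e07f9",
        input=model_input,
    )
    print("".join(output))
```

Sampling parameters such as `temperature`, `top_k`, and `top_p` can be adjusted in `model_input` to trade off determinism against variety in the generated text.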