zsxkib/qwen2-1.5b-instruct

Qwen 2: a 1.5 billion parameter language model from Alibaba Cloud, fine-tuned for chat completions.

Public
52.6K runs
Prediction

zsxkib/qwen2-1.5b-instruct:18d7fe65057b30e9ba64aa311fa839dd14c43831afcbaa51155625bb0b1e07f9
ID: y4038rxajsrgj0cga1e9hysd2g
Status: Succeeded
Source: Web
Hardware: A40 (Large)

    Input

top_k: 1
top_p: 1
prompt: Tell me a joke about only having 1.5 billion parameters
model_type: Qwen2-1.5B-Instruct
temperature: 1
system_prompt: You are a funny and helpful assistant.
max_new_tokens: 512
repetition_penalty: 1

    Output

    Why did the neural network have only 1.5 billion parameters? Because it was a tiny little network!
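The prediction above can be reproduced programmatically with the `replicate` Python client. A minimal sketch, assuming the package is installed (`pip install replicate`) and a `REPLICATE_API_TOKEN` environment variable is available; the generated text will vary between runs with these sampling settings:

```python
import os

# Input fields matching the example prediction on this page.
model_input = {
    "top_k": 1,
    "top_p": 1,
    "prompt": "Tell me a joke about only having 1.5 billion parameters",
    "model_type": "Qwen2-1.5B-Instruct",
    "temperature": 1,
    "system_prompt": "You are a funny and helpful assistant.",
    "max_new_tokens": 512,
    "repetition_penalty": 1,
}

# The API call needs a REPLICATE_API_TOKEN; skip it when none is configured.
if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate

    output = replicate.run(
        "zsxkib/qwen2-1.5b-instruct:18d7fe65057b30e9ba64aa311fa839dd14c43831afcbaa51155625bb0b1e07f9",
        input=model_input,
    )
    # Language models on Replicate stream output as a list of text
    # chunks; join them to get the full completion.
    print("".join(output))
```

Pinning the full version hash (the part after the colon) keeps results tied to this exact build of the model rather than whatever version is latest.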
