zsxkib / qwen2-7b-instruct

Qwen 2: a 7-billion-parameter language model from Alibaba Cloud, fine-tuned for chat completions

  • Public
  • 1.7K runs
  • L40S
  • GitHub
  • Paper
  • License
Run with an API
  • Prediction

    zsxkib/qwen2-7b-instruct:5324178307f5ec0239326b429d6b64ae338cd6b51fbe234402a55537a9998ac4
    ID: x29zpf3pyhrgm0cga1hsgaqfhr
    Status: Succeeded
    Source: Web
    Hardware: A40 (Large)
    Total duration:
    Created:

    Input

    top_k: 1
    top_p: 1
    prompt: Tell me a joke about only having 7 billion parameters
    model_type: Qwen2-7B-Instruct
    temperature: 1
    system_prompt: You are a funny and helpful assistant.
    max_new_tokens: 512
    repetition_penalty: 1

    Output

    Why did the AI only have 7 billion parameters? Because it couldn't find a way to compress itself below the world population!
    Generated in
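
A prediction like the one above can be reproduced with the Replicate Python client. This is a minimal sketch, not the page's own code: it assumes the `replicate` package is installed and that a `REPLICATE_API_TOKEN` is set in the environment (the network call is skipped when it is not).

```python
# Sketch: running zsxkib/qwen2-7b-instruct through the Replicate Python client.
# Assumption: `pip install replicate` and REPLICATE_API_TOKEN exported.
import os

MODEL = (
    "zsxkib/qwen2-7b-instruct:"
    "5324178307f5ec0239326b429d6b64ae338cd6b51fbe234402a55537a9998ac4"
)

# Mirror the example prediction's input parameters.
model_input = {
    "prompt": "Tell me a joke about only having 7 billion parameters",
    "system_prompt": "You are a funny and helpful assistant.",
    "model_type": "Qwen2-7B-Instruct",
    "temperature": 1,
    "top_k": 1,
    "top_p": 1,
    "max_new_tokens": 512,
    "repetition_penalty": 1,
}

if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate

    # For language models, `replicate.run` typically yields text chunks;
    # join them to get the full completion.
    output = replicate.run(MODEL, input=model_input)
    print("".join(output))
```

The same parameters can be sent directly to the Replicate HTTP API; the client is just a convenience wrapper around it.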
