nateraw / mixtral-8x7b-32kseqlen

The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture-of-Experts model. (Updated 1 year, 6 months ago)

  • Public
  • 15.1K runs
  • GitHub
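Models hosted on Replicate can be invoked from Python with the official `replicate` client. The sketch below shows how a call to this model might look; the input field names (`prompt`, `max_new_tokens`) are assumptions and should be checked against the model's published schema before use.

```python
# Minimal sketch of calling nateraw/mixtral-8x7b-32kseqlen via the Replicate
# Python client. Requires `pip install replicate` and a REPLICATE_API_TOKEN
# environment variable to actually run.
import os

MODEL = "nateraw/mixtral-8x7b-32kseqlen"

def build_input(prompt: str, max_new_tokens: int = 256) -> dict:
    # Assumed input schema; verify the field names on the model page.
    return {"prompt": prompt, "max_new_tokens": max_new_tokens}

if __name__ == "__main__" and os.environ.get("REPLICATE_API_TOKEN"):
    import replicate
    # replicate.run streams back generated text as an iterator of strings.
    output = replicate.run(
        MODEL,
        input=build_input("Explain mixture-of-experts routing in one sentence."),
    )
    print("".join(output))
```

The guard on `REPLICATE_API_TOKEN` keeps the script importable without credentials; only the input-building helper runs unconditionally.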