nateraw / mixtral-8x7b-32kseqlen

The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts model.

  • Public
  • 15K runs
  • GitHub


Run this model
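Below is a minimal sketch of running this model through the Replicate Python client. It assumes a REPLICATE_API_TOKEN is set in the environment; the input field names ("prompt", "max_new_tokens") and the shape of the output are assumptions about this model's schema, not details confirmed on this page.

```python
# Minimal sketch: call nateraw/mixtral-8x7b-32kseqlen via the Replicate Python client.
# Assumes the REPLICATE_API_TOKEN environment variable is set.
# The input keys ("prompt", "max_new_tokens") are assumed, not taken from this page.
import replicate

output = replicate.run(
    "nateraw/mixtral-8x7b-32kseqlen",
    input={
        "prompt": "Explain what a sparse mixture-of-experts model is.",
        "max_new_tokens": 256,
    },
)

# Text models on Replicate typically return the generation as an iterable of
# string chunks, so join them before printing.
print("".join(output))
```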