
nateraw / mixtral-8x7b-32kseqlen
The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts model.
- Public
- 15.1K runs
- GitHub
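A minimal sketch of invoking this model through the Replicate Python client. This assumes the `replicate` package is installed and a `REPLICATE_API_TOKEN` environment variable is set; the input parameter names (`prompt`, `max_new_tokens`) are assumptions, not confirmed from this page.

```python
# Hypothetical sketch: calling nateraw/mixtral-8x7b-32kseqlen on Replicate.
# The input keys ("prompt", "max_new_tokens") are assumed, not verified.
import os

MODEL = "nateraw/mixtral-8x7b-32kseqlen"
inputs = {
    "prompt": "Explain mixture-of-experts in one sentence.",
    "max_new_tokens": 128,  # assumed parameter name
}

if os.environ.get("REPLICATE_API_TOKEN"):
    import replicate  # third-party client, only needed when actually calling the API
    # replicate.run streams output chunks for language models; join them into text.
    output = replicate.run(MODEL, input=inputs)
    print("".join(output))
else:
    print(f"Set REPLICATE_API_TOKEN to run {MODEL}")
```

Guarding the API call behind the token check lets the script degrade gracefully when no credentials are configured.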