nateraw/mixtral-8x7b-32kseqlen

The Mixtral-8x7B Large Language Model (LLM) is a pretrained generative Sparse Mixture-of-Experts model.

Public · 15.1K runs

