lucataco / phixtral-2x2_8

phixtral-2x2_8 is the first Mixture of Experts (MoE) model made with two microsoft/phi-2 models, inspired by the mistralai/Mixtral-8x7B-v0.1 architecture.

  • Public
  • 250 runs
  • GitHub
  • License

