lucataco / phixtral-2x2_8

phixtral-2x2_8 is the first Mixture of Experts (MoE) model built from two microsoft/phi-2 models, inspired by the mistralai/Mixtral-8x7B-v0.1 architecture. (Updated 1 year, 4 months ago)

  • Public
  • 1.5K runs
  • GitHub
  • License