nateraw / wizard-mega-13b-awq
wizard-mega-13b quantized with AWQ and served with vLLM (Updated 1 year, 8 months ago)
- Public
- 5.5K runs
- GitHub