nateraw/llama-2-70b-chat-awq
llama-2-70b-chat quantized with AWQ and served with vLLM
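The tagline above says the model is quantized with AWQ and served with vLLM. A minimal launch sketch using vLLM's OpenAI-compatible server might look like this; note that the Hugging Face checkpoint name and the GPU count are assumptions, not details taken from this page:

```shell
# Serve an AWQ-quantized Llama-2-70B-chat checkpoint with vLLM.
# The repo name below is an assumed example -- substitute the AWQ
# checkpoint you actually use. A 70B model typically needs multiple
# GPUs, hence the tensor-parallel setting.
python -m vllm.entrypoints.openai.api_server \
  --model TheBloke/Llama-2-70B-chat-AWQ \
  --quantization awq \
  --tensor-parallel-size 4
```

Once running, the server accepts OpenAI-style `/v1/completions` and `/v1/chat/completions` requests on port 8000 by default.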