Readme
See the official model card for more information: https://huggingface.co/CausalLM/14B
The model served here is TheBloke's AWQ quantization: https://huggingface.co/TheBloke/CausalLM-14B-AWQ
This is the CausalLM/14B model with AWQ quantization. According to its model card, it is perhaps better than all existing models under 70B parameters in most quantitative evaluations.
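For reference, here is a minimal sketch of querying the AWQ checkpoint locally with vLLM, one common way to serve AWQ-quantized weights. It assumes vLLM is installed with AWQ support and a GPU with enough VRAM for the 14B AWQ weights; the prompt and sampling settings are illustrative only.

```python
# Minimal sketch: serving TheBloke/CausalLM-14B-AWQ locally with vLLM.
# Assumptions: vLLM installed with AWQ support, sufficient GPU memory;
# prompt and sampling parameters are placeholders.
from vllm import LLM, SamplingParams

# Load the AWQ-quantized weights in half precision.
llm = LLM(model="TheBloke/CausalLM-14B-AWQ", quantization="awq", dtype="float16")

# Illustrative sampling settings.
sampling = SamplingParams(temperature=0.7, top_p=0.95, max_tokens=256)

outputs = llm.generate(["Explain AWQ quantization in one paragraph."], sampling)
print(outputs[0].outputs[0].text)
```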