cjwbw / c4ai-command-r-v01

CohereForAI c4ai-command-r-v01, quantized with bitsandbytes at 8-bit precision

Model Summary

Model card: https://huggingface.co/CohereForAI/c4ai-command-r-v01

This version is quantized with bitsandbytes at 8-bit precision.
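
As a rough illustration, the snippet below sketches how the base checkpoint could be loaded in 8-bit with bitsandbytes through the transformers library. The exact settings used for this Replicate build are not published here, so treat the configuration as an assumption.

```python
# Sketch: load CohereForAI/c4ai-command-r-v01 with 8-bit bitsandbytes quantization.
# The quantization settings are illustrative assumptions, not the exact build config.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "CohereForAI/c4ai-command-r-v01"

# load_in_8bit=True stores the linear-layer weights as int8 via bitsandbytes.
quant_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",
)
```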

C4AI Command-R is a research release of a 35-billion-parameter, highly performant generative model. Command-R is a large language model with open weights, optimized for a variety of use cases including reasoning, summarization, and question answering. It supports multilingual generation, evaluated in 10 languages, and offers highly performant RAG capabilities.

Developed by: Cohere and Cohere For AI

Model Details

Input: The model takes text input only.

Output: The model generates text only.

Model Architecture: This is an auto-regressive language model that uses an optimized transformer architecture. After pretraining, this model uses supervised fine-tuning (SFT) and preference training to align model behavior to human preferences for helpfulness and safety.

Languages covered: The model is optimized to perform well in the following languages: English, French, Spanish, Italian, German, Brazilian Portuguese, Japanese, Korean, Simplified Chinese, and Arabic.

Pre-training data additionally included the following 13 languages: Russian, Polish, Turkish, Vietnamese, Dutch, Czech, Indonesian, Ukrainian, Romanian, Greek, Hindi, Hebrew, Persian.

Context length: Command-R supports a context length of 128K tokens.
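
As a usage sketch of this text-in, text-out interface, the snippet below reuses the model and tokenizer from the loading sketch above and formats a conversation with the tokenizer's chat template; the prompt and sampling parameters are illustrative assumptions, not recommended settings.

```python
# Sketch: text-in, text-out generation with the model's chat template.
# Assumes `model` and `tokenizer` from the 8-bit loading sketch above;
# the prompt and sampling parameters are illustrative.
messages = [{"role": "user", "content": "Summarize the benefits of retrieval-augmented generation."}]

# apply_chat_template renders the conversation into the prompt format the model expects.
input_ids = tokenizer.apply_chat_template(
    messages, tokenize=True, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

gen_tokens = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.3,
)
print(tokenizer.decode(gen_tokens[0], skip_special_tokens=True))
```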