A 6B parameter open bilingual chat LLM | Open-source bilingual dialogue language model
ChatGLM2-6B: An Open Bilingual Chat LLM | Open-source bilingual dialogue language model
ChatGLM2-6B: An Open Bilingual Chat LLM | Open-source bilingual dialogue language model (int4)
A 6B parameter open bilingual chat LLM (optimized for 8k+ context) | Open-source bilingual dialogue language model
An embedding model trained on Jina AI's Linnaeus-Clean dataset.
An 8k sequence length text embedding set trained by Jina AI
Qwen-14B-Chat is a Transformer-based large language model pretrained on a large volume of data, including web texts, books, code, etc.
Qwen-VL-Chat, but with a raw ChatML prompt interface and streaming support