A 6B parameter open bilingual chat LLM (optimized for 8k+ context) | 开源双语对话语言模型
ChatGLM2-6B: An Open Bilingual Chat LLM | 开源双语对话语言模型
ChatGLM2-6B: An Open Bilingual Chat LLM | 开源双语对话语言模型 (int4)
Embedding models trained using Jina AI's Linnaeus-Clean dataset.
Qwen-14B-Chat is a Transformer-based large language model pretrained on a large volume of data, including web texts, books, and code.
Qwen-VL-Chat, but with a raw ChatML prompt interface and streaming support.
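A raw prompt interface means the caller assembles the ChatML turns directly instead of passing structured chat messages. The sketch below shows one plausible single-turn prompt; the <img>...</img> image-tag convention and the <|im_end|> stop token are assumptions based on common Qwen conventions, not details confirmed by this listing.

```python
# Minimal sketch of a ChatML-style prompt for a raw-prompt, streaming chat model.
# The image-tag convention and stop token are assumptions, not documented here.

def chatml_prompt(system: str, user: str) -> str:
    """Build a single-turn ChatML prompt ending with an open assistant turn."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = chatml_prompt(
    system="You are a helpful assistant.",
    user="Describe this picture: <img>https://example.com/cat.jpg</img>",
)
print(prompt)
# When streaming, generation is typically cut off at the <|im_end|> token.
```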
An 8k sequence-length text embedding set trained by Jina AI.
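For context, a minimal sketch of encoding long text with such a model is shown below. It assumes the jinaai/jina-embeddings-v2-base-en checkpoint on Hugging Face and the encode() helper exposed by that checkpoint's remote code; neither is confirmed by the listing above.

```python
# Sketch: encoding text with an 8k-context Jina embedding model.
# Model name, encode() helper, and output size are assumptions tied to the
# jinaai/jina-embeddings-v2-base-en checkpoint, not to this listing.
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-base-en", trust_remote_code=True
)
embeddings = model.encode(
    ["A query about long documents", "A passage from an 8k-token report"],
    max_length=8192,  # the long-context selling point of this embedding set
)
print(embeddings.shape)  # e.g. (2, 768) for a base-size model
```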
A 6B parameter open bilingual chat LLM | 开源双语对话语言模型
This model is not yet booted but is ready for API calls. Your first API call will boot the model and may take longer; subsequent responses will be fast.
This model runs on an Nvidia L40S GPU.
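In practice, the cold start just means the first request should be given a generous timeout. The sketch below illustrates one way to handle that with a polling loop; the endpoint URL, auth header, and response fields are hypothetical placeholders, not a documented API for any model listed above.

```python
# Sketch: tolerating a cold start on the first API call.
# API_URL, the auth token, and the response shape are hypothetical placeholders.
import time
import requests

API_URL = "https://api.example.com/v1/predictions"   # hypothetical endpoint
HEADERS = {"Authorization": "Token YOUR_API_TOKEN"}   # hypothetical auth

def predict(prompt: str, timeout_s: float = 300.0) -> dict:
    """Send a prediction request and poll until it finishes.

    The first call may spend most of its time booting the model, so the
    polling loop uses a generous overall timeout.
    """
    resp = requests.post(API_URL, json={"input": {"prompt": prompt}}, headers=HEADERS)
    resp.raise_for_status()
    prediction = resp.json()
    deadline = time.monotonic() + timeout_s
    while prediction.get("status") not in ("succeeded", "failed"):
        if time.monotonic() > deadline:
            raise TimeoutError("model did not boot and respond in time")
        time.sleep(2)
        prediction = requests.get(f"{API_URL}/{prediction['id']}", headers=HEADERS).json()
    return prediction

result = predict("Hello!")  # first call is slow (boot); later calls are fast
```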