nomagick / chatglm3-6b
A 6B-parameter open bilingual chat LLM
15.3K runs
Public
nomagick / chatglm3-6b-32k
A 6B-parameter open bilingual chat LLM, optimized for 8K+ context
327 runs
Public
nomagick / jina-embeddings-v2
An 8K sequence-length text embedding set trained by Jina AI
105 runs
Public
nomagick / qwen-vl-chat
Qwen-VL-Chat, with a raw ChatML prompt interface and streaming
1K runs
Public
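The "raw ChatML prompt interface" above means the caller assembles the turn-delimited prompt directly rather than passing structured messages. A minimal sketch of that format, assuming the standard ChatML `<|im_start|>`/`<|im_end|>` delimiters; the helper name and example messages are illustrative, not from the model card:

```python
# Sketch of a ChatML prompt builder (assumed convention, not the
# model's official SDK). Each turn is wrapped in <|im_start|>role
# ... <|im_end|>, and the prompt ends with an open assistant turn
# for the model to complete.

def build_chatml(messages):
    """Render a list of {role, content} dicts as a ChatML prompt."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Leave the assistant turn open so generation continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Describe this image."},
])
print(prompt)
```

Streaming then simply emits tokens until the model produces the closing `<|im_end|>` delimiter.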
nomagick / qwen-14b-chat
Qwen-14B-Chat is a Transformer-based large language model pretrained on a large volume of data, including web text, books, and code.
5.3K runs
Public
nomagick / jina-embeddings
Embedding models trained on Jina AI's Linnaeus-Clean dataset.
33 runs
Public
nomagick / chatglm2-6b
ChatGLM2-6B: an open bilingual chat LLM
10.6K runs
Public
nomagick / chatglm2-6b-int4
ChatGLM2-6B: an open bilingual chat LLM (int4 quantized)
207 runs
Public