![](https://tjzk.replicate.delivery/models_models_cover_image/e6a761e7-49c9-4012-bd3e-6e5e8fea9352/replicate-prediction-gtkjtzlbv5kz.png)
nomagick/chatglm3-6b
A 6B-parameter open-source bilingual chat LLM
14.7K runs
Public
![](https://tjzk.replicate.delivery/models_models_cover_image/09b6cb10-8543-4bc2-bfc4-bd21a49e53c2/replicate-prediction-ctmgindbbbr6.png)
nomagick/chatglm3-6b-32k
A 6B-parameter open-source bilingual chat LLM, optimized for 8k+ context
327 runs
Public
![](https://tjzk.replicate.delivery/models_models_cover_image/3743fc17-19e7-4228-b729-ce716046627e/replicate-prediction-fwafuhtb7dcl.png)
nomagick/jina-embeddings-v2
An 8k-sequence-length text embedding set trained by Jina AI
105 runs
Public
![](https://tjzk.replicate.delivery/models_models_cover_image/33841b6b-e21e-4746-9483-d38a9e4f54e6/replicate-prediction-dvht5blbrfyi.png)
nomagick/qwen-vl-chat
Qwen-VL-Chat but with raw ChatML prompt interface and streaming
735 runs
Public
![](https://tjzk.replicate.delivery/models_models_cover_image/3d482cbd-3da2-474f-8347-5d2621e6d85b/replicate-prediction-enqaaudbhjo5.png)
nomagick/qwen-14b-chat
Qwen-14B-Chat is a Transformer-based large language model pretrained on a large volume of data, including web text, books, and code.
4.7K runs
Public
![](https://tjzk.replicate.delivery/models_models_cover_image/379a45b5-1964-48e2-8804-f29f1b8c0e45/replicate-prediction-pbuteetbgmdt.png)
nomagick/jina-embeddings
Embedding models trained on Jina AI's Linnaeus-Clean dataset.
33 runs
Public
![](https://tjzk.replicate.delivery/models_models_cover_image/4228bfbc-bcb7-404d-8586-726c31f7073c/kqm9ddydl8_1689082483825.png)
nomagick/chatglm2-6b
ChatGLM2-6B: an open-source bilingual chat LLM
10.6K runs
Public
![](https://tjzk.replicate.delivery/models_models_cover_image/ca04cf6a-f6d1-4bf0-b07b-c68fa3d55bfe/kqm9ddydl8_1689082483825.png)
nomagick/chatglm2-6b-int4
ChatGLM2-6B: an open-source bilingual chat LLM (int4)
207 runs
Public