
nomagick

Yanlong Wang

GitHub
https://github.com/nomagick

nomagick / chatglm3-6b

A 6B parameter open-source bilingual chat LLM

15.3K runs
Public

nomagick / chatglm3-6b-32k

A 6B parameter open-source bilingual chat LLM (optimized for 8k+ context)

328 runs
Public

nomagick / jina-embeddings-v2

An 8k sequence length text embedding model set trained by Jina AI

107 runs
Public

nomagick / qwen-vl-chat

Qwen-VL-Chat but with raw ChatML prompt interface and streaming

1.1K runs
Public

nomagick / qwen-14b-chat

Qwen-14B-Chat is a Transformer-based large language model, pretrained on a large volume of data including web text, books, and code.

5.4K runs
Public

nomagick / jina-embeddings

Embedding models trained on Jina AI's Linnaeus-Clean dataset.

36 runs
Public

nomagick / chatglm2-6b

ChatGLM2-6B: An open-source bilingual chat LLM

10.7K runs
Public

nomagick / chatglm2-6b-int4

ChatGLM2-6B: An open-source bilingual chat LLM (int4 quantization)

209 runs
Public