Models
bart-large-cnn
Beta: BART is a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
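The encoder-decoder split described above can be illustrated with a toy sketch: the "encoder" sees the whole input at once (bidirectional context), while the "decoder" emits one token at a time, conditioned on its own previous outputs. The functions below are illustrative stand-ins, not a real transformer.

```python
from collections import Counter

def encode(tokens):
    """Toy bidirectional 'encoder': every state is computed with
    access to the whole input at once (here, global token counts)."""
    counts = Counter(tokens)
    return [(tok, counts[tok]) for tok in tokens]

def generate(tokens, max_len=3):
    """Toy autoregressive 'decoder': emits one token per step, each
    step conditioned on the encoder states AND on everything it has
    already generated (it never re-emits a token here)."""
    states = encode(tokens)
    out = []
    for _ in range(max_len):
        candidates = [(count, tok) for tok, count in states if tok not in out]
        if not candidates:
            break
        out.append(max(candidates)[1])  # greedy pick of the "best" next token
    return out
```

A real seq2seq model replaces both functions with learned attention layers, but the data flow (encode once, decode step by step) is the same.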
bge-base-en-v1.5
BAAI general embedding (Base) model that transforms any given text into a 768-dimensional vector.
bge-large-en-v1.5
BAAI general embedding (Large) model that transforms any given text into a 1024-dimensional vector.
bge-m3
Multi-Functionality, Multi-Linguality, and Multi-Granularity embeddings model.
bge-small-en-v1.5
BAAI general embedding (Small) model that transforms any given text into a 384-dimensional vector.
bge-reranker-base
Unlike an embedding model, the reranker takes a question and a document together as input and directly outputs a similarity score rather than an embedding. You can use it to re-rank retrieved documents.
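The two retrieval styles above can be contrasted with a toy sketch: embedding retrieval compares independently computed vectors (e.g. via cosine similarity), while a reranker scores the (question, document) pair directly. The vectors and the word-overlap scorer below are illustrative stand-ins for real model outputs.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Embedding retrieval: query and documents are embedded INDEPENDENTLY,
# then similarity is computed between the resulting vectors.
query_vec = [0.9, 0.1, 0.3]            # toy stand-in for a model embedding
doc_vecs = {
    "doc_a": [0.8, 0.2, 0.4],
    "doc_b": [0.1, 0.9, 0.2],
}
retrieved = sorted(doc_vecs, key=lambda d: cosine(query_vec, doc_vecs[d]),
                   reverse=True)

def rerank_score(question, document):
    """Toy reranker: scores the (question, document) PAIR directly,
    instead of comparing two separately computed embeddings.
    Here a crude word-overlap ratio; a real reranker is a learned
    cross-encoder over both texts."""
    q = set(question.lower().split())
    d = set(document.lower().split())
    return len(q & d) / max(len(q), 1)
```

In practice the embedding model retrieves a candidate set cheaply, and the more expensive pairwise reranker re-orders just those candidates.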
deepseek-coder-6.7b-base-awq
Beta: Deepseek Coder is composed of a series of code language models, each trained from scratch on 2T tokens, with a composition of 87% code and 13% natural language.
deepseek-coder-6.7b-instruct-awq
Beta: Deepseek Coder is composed of a series of code language models, each trained from scratch on 2T tokens, with a composition of 87% code and 13% natural language.
deepseek-math-7b-instruct
Beta: DeepSeekMath-Instruct 7B is an instruction-tuned model for mathematics derived from DeepSeekMath-Base 7B. DeepSeekMath is initialized with DeepSeek-Coder weights.
deepseek-r1-distill-qwen-32b
DeepSeek-R1-Distill-Qwen-32B is a model distilled from DeepSeek-R1 based on Qwen2.5. It outperforms OpenAI-o1-mini across various benchmarks.
detr-resnet-50
Beta: Detection Transformer (DETR) model trained end-to-end on COCO 2017 object detection (118k annotated images).
distilbert-sst-2-int8
Distilled BERT model fine-tuned on SST-2 for sentiment classification.
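A sentiment classifier like the one above outputs one logit per class, which is turned into probabilities with a softmax. The sketch below shows only that post-processing step on hand-picked logits; the label order and values are illustrative assumptions, not the actual model's output.

```python
import math

SST2_LABELS = ["negative", "positive"]  # SST-2 is a binary sentiment task

def softmax(logits):
    """Convert raw logits into probabilities that sum to 1.
    Subtracting the max logit keeps exp() numerically stable."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Return the predicted label and its probability."""
    probs = softmax(logits)
    i = probs.index(max(probs))
    return SST2_LABELS[i], probs[i]
```

For example, `classify([-1.2, 2.3])` picks the "positive" label, since the second logit dominates after the softmax.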
discolm-german-7b-v1-awq
Beta: DiscoLM German 7B is a Mistral-based large language model with a focus on German-language applications. AWQ is an efficient, accurate quantization method.
dreamshaper-8-lcm
Beta: Stable Diffusion model that has been fine-tuned to be better at photorealism without sacrificing range.