DBRX Base

Released: March 27, 2024

Overview

DBRX Base is Databricks' open-weight Mixture-of-Experts foundation model (132B total parameters, 36B active per token) built for long-context reasoning and coding. It is the pretrained, unaligned checkpoint, ideal for domain fine-tuning, RAG, and custom alignment, while DBRX Instruct is the chat-ready version.

Description

DBRX Base is a state-of-the-art MoE language model designed by Databricks to maximize quality and efficiency at scale. A router selects a small subset of experts for each token (4 of its 16 experts, i.e. top-k routing with k=4), delivering high throughput with frontier-level accuracy while controlling latency and cost. The model supports a 32K-token context window and is trained on a carefully filtered blend of code, math, technical web, and high-quality prose, so it generalizes well to analysis and software tasks.
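
The sketch below shows the top-k routing idea in isolation. The expert count and k match DBRX's published configuration (16 experts, 4 active per token), but the layer sizes are toy values and this is an illustration of the mechanism, not the model's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Toy top-k MoE layer: each token is routed to k of n_experts FFNs."""
    def __init__(self, d_model=64, d_ff=256, n_experts=16, k=4):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts)  # per-token expert scores
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                              # x: (tokens, d_model)
        scores, idx = self.router(x).topk(self.k, -1)  # keep only the top-k experts
        gates = F.softmax(scores, dim=-1)              # normalize over the chosen k
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):      # each token runs only k experts
            hit = (idx == e)                           # (tokens, k): where expert e was picked
            rows = hit.any(-1)
            if rows.any():
                w = (gates * hit).sum(-1)[rows, None]  # gate weight for expert e per token
                out[rows] += w * expert(x[rows])
        return out

moe = TopKMoE()
print(moe(torch.randn(8, 64)).shape)  # -> torch.Size([8, 64])
```

Because only k experts run per token, compute per token stays close to a dense model a fraction of the size, which is the throughput/quality trade the paragraph above describes.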

As a base model it is unaligned by default and meant to be adapted. Teams commonly:

Fine-tune with LoRA/DPO/RL to add style, tools, or policies (see the sketch after this list)

Ground with RAG for factual work over private corpora

Constrain outputs to JSON/DSLs for agents and automations
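
A hedged sketch of the LoRA route using Hugging Face Transformers and PEFT. The repo id databricks/dbrx-base is the official checkpoint; the LoRA hyperparameters are illustrative, the target_modules names are assumptions to verify against the loaded model, and a real run needs multi-GPU sharding for the 132B-parameter weights. This only shows the wiring.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "databricks/dbrx-base"  # official Hugging Face repo
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, device_map="auto", torch_dtype="auto"
)

lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,   # illustrative values
    target_modules=["Wqkv", "out_proj"],      # assumed names; verify against the model
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()  # only the small LoRA adapters train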

DBRX integrates smoothly with common stacks (vLLM/Transformers) and Databricks Mosaic AI for training/serving, with quantization options for cheaper inference. Use DBRX Base when you want maximum control over behavior; choose DBRX Instruct when you need a ready-to-use conversational model.
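
As one concrete example of that serving path, here is a minimal vLLM sketch. The tensor_parallel_size and sampling settings are assumptions about the deployment (a node with enough GPU memory for the checkpoint, or a quantized variant), not requirements.

```python
from vllm import LLM, SamplingParams

# Shard the checkpoint across GPUs; 8 is an assumed node size.
llm = LLM(model="databricks/dbrx-base", tensor_parallel_size=8)

params = SamplingParams(temperature=0.7, max_tokens=128)
outputs = llm.generate(["def quicksort(arr):"], params)
print(outputs[0].outputs[0].text)  # base-model continuation, not a chat reply
```

Since the base model is not chat-tuned, prompts work best as raw text to continue (as above) rather than as instructions.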

About Databricks

Databricks is a global data, analytics, and AI company that pioneered the data lakehouse architecture.

Industry: Information Technology
Company Size: 1001-5000
Location: San Francisco, CA, US

Last updated: October 3, 2025