DBRX
by Databricks · open-source · Last verified 2026-03-17
Databricks' open mixture-of-experts (MoE) model with 132B total parameters, 36B of which are active per token, combining strong performance with efficient inference. Built on a fine-grained MoE architecture that routes each token through 4 of 16 experts, improving compute utilization over coarser MoE designs with fewer, larger experts.
https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm
Overall grade: C (Below Average)
Adoption: C · Quality: B+ · Freshness: C · Citations: C+ · Engagement: F
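The 132B-total / 36B-active split described above follows directly from the 4-of-16 routing. Below is a minimal back-of-the-envelope sketch in Python; the per-expert and shared parameter figures it derives are illustrative assumptions, not numbers from the Databricks announcement:

```python
# Back-of-the-envelope sketch of DBRX's parameter split. Only the 132B
# total, 36B active, and 4-of-16 routing come from the Databricks
# announcement; the per-expert and shared splits derived below are
# illustrative assumptions, not official figures.
TOTAL = 132e9   # all stored parameters
ACTIVE = 36e9   # parameters touched per token
N_EXPERTS, K_ACTIVE = 16, 4

# Model shared (non-expert) weights s and per-expert weights e as:
#   s + 16*e = TOTAL      (every expert is stored)
#   s +  4*e = ACTIVE     (only 4 experts run per token)
per_expert = (TOTAL - ACTIVE) / (N_EXPERTS - K_ACTIVE)   # -> 8B
shared = TOTAL - N_EXPERTS * per_expert                  # -> 4B

print(f"per-expert ≈ {per_expert / 1e9:.0f}B, shared ≈ {shared / 1e9:.0f}B")
print(f"active per token ≈ {(shared + K_ACTIVE * per_expert) / 1e9:.0f}B")
```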
Specifications
- License: Databricks Open Model License
- Pricing: open-source
- Capabilities: text-generation, code-generation, reasoning, mixture-of-experts, efficient-inference
- Integrations: databricks, huggingface, vllm, langchain (loading sketch after this list)
- Use Cases: enterprise-ai, code-generation, data-analysis, text-generation
- API Available: Yes
- Parameters: 132B (36B active)
- Context Window: 32K tokens
- Modalities: text
- Training Cutoff: Late 2023
- Tags: llm, open-source, mixture-of-experts, databricks, enterprise
- Added: 2026-03-17
- Completeness: 90%
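Given the Hugging Face integration listed above, a minimal loading sketch follows. It assumes the databricks/dbrx-instruct checkpoint on the Hub, transformers v4.40 or later (which added native DBRX support), license acceptance on the model page, and hardware with roughly 264 GB of GPU memory for the bf16 weights:

```python
# Minimal sketch: text generation with DBRX Instruct via Hugging Face
# Transformers. Assumes transformers >= 4.40 (native DBRX support),
# Hub access to databricks/dbrx-instruct, and multi-GPU hardware that
# can hold the 132B bf16 weights (~264 GB).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("databricks/dbrx-instruct")
model = AutoModelForCausalLM.from_pretrained(
    "databricks/dbrx-instruct",
    torch_dtype=torch.bfloat16,  # halves memory vs fp32
    device_map="auto",           # shard layers across available GPUs
)

messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

For higher-throughput serving, the vLLM integration listed above also supports DBRX, typically run with tensor parallelism across GPUs.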
Index Score: 44.9
- Adoption: 45
- Quality: 72
- Freshness: 48
- Citations: 50
- Engagement: 0