
Mixtral 8x22B

by Mistral AI · open-source · Last verified 2026-03-17

Mistral AI's largest open-source mixture-of-experts model, with 141B total parameters and 39B active per forward pass. It features native function calling (sketched below) and delivers state-of-the-art open-source performance on reasoning and coding benchmarks.

https://mistral.ai
Overall Grade: B (Above Average)
Adoption: B+ · Quality: A · Freshness: C+ · Citations: B+ · Engagement: F
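The native function calling mentioned above can be exercised through the standard transformers chat-template API. The following is a minimal, illustrative sketch, not an official example from this listing: it assumes a recent transformers release whose apply_chat_template accepts a tools argument, a machine with enough GPU memory to host the model, and a hypothetical get_weather tool defined purely for demonstration.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mistralai/Mixtral-8x22B-Instruct-v0.1"

def get_weather(city: str) -> str:
    """
    Get the current weather for a city.

    Args:
        city: Name of the city to look up.
    """
    return "sunny"  # hypothetical stub; a real tool would call a weather API

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, device_map="auto", torch_dtype=torch.bfloat16
)

messages = [{"role": "user", "content": "What's the weather in Paris right now?"}]
# Recent transformers versions render the `tools` list into the model's
# native function-calling prompt format via its chat template.
input_ids = tokenizer.apply_chat_template(
    messages, tools=[get_weather], add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))

The model responds with a structured tool call naming the function and its arguments; the caller executes the tool and appends the result as a tool-role message before generating the final answer.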

Specifications

License
Apache 2.0
Pricing
open-source
Capabilities
text-generation, code-generation, reasoning, function-calling, multilingual, math
Integrations
huggingface, vllm, together-ai, aws-bedrock
Use Cases
enterprise-deployment, complex-reasoning, code-generation, function-calling, research
API Available
Yes
Parameters
141B (39B active)
Context Window
64K tokens
Modalities
text
Training Cutoff
Not publicly disclosed (model released April 2024)
Tags
llm, open-source, moe, large-model, function-calling, mistral
Added
2026-03-17
Completeness
100%
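Given the vllm integration listed above, a self-hosted deployment can be sketched as follows. This is an illustrative example under stated assumptions, not the listing's own recipe: it assumes a multi-GPU node (the 141B weights in bf16 need roughly 280 GB of accelerator memory) and the Hugging Face model ID for the instruct variant.

from vllm import LLM, SamplingParams

# Shard the model across GPUs with tensor parallelism; adjust
# tensor_parallel_size to the number of GPUs actually available.
llm = LLM(
    model="mistralai/Mixtral-8x22B-Instruct-v0.1",
    tensor_parallel_size=8,
)
params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(
    ["Summarize mixture-of-experts routing in two sentences."],
    params,
)
print(outputs[0].outputs[0].text)

Because only 39B of the 141B parameters are active per token, per-token compute is closer to a dense ~40B model, though all expert weights must still fit in memory.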

Index Score: 64.4
Adoption: 72 · Quality: 83 · Freshness: 55 · Citations: 76 · Engagement: 0
