Model · AI for Code · v1.0

Codestral Mamba

by Mistral AI · open-source · Last verified 2026-03-17

Mistral AI's code-focused model, built on the Mamba state-space architecture for linear-time inference. It handles long code sequences well, offering a theoretically unbounded context length and constant memory usage during generation.
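The constant-memory claim comes from how the two architectures carry decode-time state: a transformer's KV cache grows with every processed token, while a state-space model keeps a fixed-size recurrent state per layer. A minimal back-of-envelope sketch of that difference (the layer count, model width, and state size below are assumed round figures, not Mistral's published configuration):

```python
def transformer_kv_cache_size(seq_len: int, n_layers: int, d_model: int) -> int:
    """Values held in a transformer's KV cache: two tensors (K and V)
    per layer, each of shape (seq_len, d_model) -- grows linearly
    with the sequence length."""
    return 2 * n_layers * seq_len * d_model


def mamba_state_size(n_layers: int, d_model: int, d_state: int = 16) -> int:
    """Values held in a Mamba-style recurrent state: one fixed-size
    (d_model, d_state) state per layer -- independent of how many
    tokens have been processed."""
    return n_layers * d_model * d_state


if __name__ == "__main__":
    layers, width = 32, 4096  # assumed round figures for a ~7B model
    for n in (1_000, 100_000):
        print(f"seq_len={n}: "
              f"kv_cache={transformer_kv_cache_size(n, layers, width):,} "
              f"ssm_state={mamba_state_size(layers, width):,}")
```

At 100x the sequence length the KV cache is 100x larger, while the SSM state is unchanged; this is why generation cost stays flat over long code sequences.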

https://mistral.ai/news/codestral-mamba
Overall grade: C (Below Average)
Adoption: C · Quality: B · Freshness: C+ · Citations: C · Engagement: F

Specifications

License
Apache 2.0
Pricing
open-source
Capabilities
code-generation, code-completion, long-context-code, linear-time-inference, code-explanation
Integrations
huggingface, mistral-api, ollama, vllm
Use Cases
code-generation, large-codebase-analysis, code-completion, repository-understanding
API Available
Yes
Parameters
7B
Context Window
256K tokens
Modalities
text
Training Cutoff
Early 2024
Tags
code-generation, open-source, mamba-architecture, state-space-model, mistral
Added
2026-03-17
Completeness
92%

Index Score

41.2
Adoption
45
Quality
66
Freshness
58
Citations
40
Engagement
0
