Models › LLMs › Mistral-Nemo-Instruct-2407

Mistral Nemo

by Mistral AI · open-source · Last verified 2026-03-17

A 12B-parameter model co-developed by Mistral AI and NVIDIA, featuring the Tekken tokenizer for improved multilingual token efficiency. Designed as a drop-in replacement for systems using Mistral 7B, with stronger benchmark performance at a similar size.

https://mistral.ai/news/mistral-nemo/
Index Grade: C+ (Average)
Adoption: B · Quality: B+ · Freshness: B · Citations: B · Engagement: F

Specifications

License
Apache 2.0
Pricing
open-source
Capabilities
text-generation, code-generation, instruction-following, multilingual, function-calling
Integrations
huggingface, ollama, vllm, nvidia-nim
Use Cases
chatbots, multilingual-applications, code-generation, classification, summarization
API Available
Yes
Parameters
12B
Context Window
128K tokens
Modalities
text
Training Cutoff
Early 2024
Tags
llm, open-source, mid-size, tekken-tokenizer, nvidia, mistral
Added
2026-03-17
Completeness
100%
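Since the model card lists an available API and a vLLM integration, here is a minimal sketch of how a chat-completion request for this model might be assembled for an OpenAI-compatible endpoint (such as a locally hosted vLLM server). The endpoint URL is an assumption for illustration, not part of the model card; the 0.3 temperature follows Mistral's published recommendation for Nemo.

```python
# Sketch: build a chat-completion request body for Mistral Nemo served
# behind an OpenAI-compatible API (e.g. a local vLLM server).
import json

# Hypothetical local endpoint -- adjust to your deployment.
BASE_URL = "http://localhost:8000/v1/chat/completions"

def build_request(user_message: str, max_tokens: int = 256) -> dict:
    """Assemble the JSON body for a single instruction-following turn."""
    return {
        "model": "mistralai/Mistral-Nemo-Instruct-2407",
        "messages": [{"role": "user", "content": user_message}],
        "max_tokens": max_tokens,
        # Mistral recommends lower temperatures for Nemo than for
        # Mistral 7B; 0.3 is the value suggested in the release notes.
        "temperature": 0.3,
    }

payload = build_request("Summarize the Apache 2.0 license in one sentence.")
print(json.dumps(payload, indent=2))
```

The same body could be POSTed to the endpoint with any HTTP client; the 128K-token context window means long documents can usually be passed in a single `messages` entry without chunking.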

Index Score

55.8
Adoption
65
Quality
74
Freshness
65
Citations
60
Engagement
0
