Model · embedding-models · v1.0

E5-Mistral-7B

by Microsoft Research · open-source · Last verified 2026-04-24

E5-Mistral-7B is a 7B-parameter embedding model from Microsoft Research that fine-tunes Mistral-7B using the E5 training recipe together with synthetic data generation. It achieves state-of-the-art results on the MTEB benchmark, demonstrating that decoder-based LLMs can serve as powerful embedding models through instruction tuning.

https://huggingface.co/intfloat/e5-mistral-7b-instruct
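The entry above notes the model is instruction-tuned and decoder-based. In practice that means two usage conventions (documented on the Hugging Face model card): queries carry a one-line task-instruction prefix, and embeddings are pooled from the last non-padding token rather than mean-pooled. A minimal sketch of both, using toy hidden states since loading the 7B weights requires a GPU; the prompt format and pooling rule follow the model card, while the helper names and sample data are illustrative:

```python
def get_detailed_instruct(task: str, query: str) -> str:
    # Per the model card, queries (but not documents) are prefixed
    # with a short task description.
    return f"Instruct: {task}\nQuery: {query}"

def last_token_pool(hidden_states, attention_mask):
    # Decoder-based embedders take the final non-padding position
    # instead of averaging; hidden_states is [batch][seq][dim].
    pooled = []
    for vectors, mask in zip(hidden_states, attention_mask):
        last = sum(mask) - 1  # index of the last real token
        pooled.append(vectors[last])
    return pooled

task = "Given a web search query, retrieve relevant passages"
query_text = get_detailed_instruct(task, "what is an embedding model")

# Toy hidden states (2 sequences, 3 positions, 2 dims) to show pooling:
hidden = [[[1.0, 0.0], [2.0, 0.0], [3.0, 0.0]],
          [[4.0, 0.0], [5.0, 0.0], [6.0, 0.0]]]
mask = [[1, 1, 1], [1, 1, 0]]  # second sequence ends in one pad token
print(last_token_pool(hidden, mask))  # [[3.0, 0.0], [5.0, 0.0]]
print(query_text)
```

With the real model, the same pooling would be applied to the last hidden state returned by `transformers`, followed by L2-normalization before computing cosine similarity between query and document vectors.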
Index grade: C (Below Average)
Adoption: C+ · Quality: B+ · Freshness: A · Citations: C · Engagement: F

Specifications

License: Open Source
Pricing: open-source
API Available: No
Parameters: 7B
Context Window: 32K tokens
Modalities: text
Training Cutoff: Mid 2023
Tags: embeddings, mistral, microsoft, open-source, mteb, instruction-tuned
Added: 2026-04-24
Completeness: 60%

Index Score: 44

Adoption: 50 · Quality: 70 · Freshness: 80 · Citations: 40 · Engagement: 0
