E5-Mistral-7B
by Microsoft Research · open-source · Last verified 2026-04-24
E5-Mistral-7B is a 7B-parameter embedding model from Microsoft Research that fine-tunes Mistral-7B using the E5 training recipe together with synthetic data generation. It achieved state-of-the-art results on the MTEB benchmark at release, demonstrating that decoder-based LLMs can serve as powerful embedding models through instruction tuning.
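Because the backbone is a decoder-only LLM, the embedding is taken from the hidden state of each sequence's last non-padding token (last-token pooling), as shown on the model's Hugging Face card. Below is a minimal illustrative sketch of that pooling step in NumPy; the function name and the toy tensors are my own, not part of the official implementation, and in practice the hidden states would come from the 7B model itself.

```python
import numpy as np

def last_token_pool(hidden_states, attention_mask):
    """Pick each sequence's last non-padded hidden state.

    hidden_states: (batch, seq_len, dim) float array
    attention_mask: (batch, seq_len) 0/1 array, 1 = real token
    """
    # Index of the last real token per sequence (assumes right padding;
    # with left padding the last position could be taken directly).
    last_idx = attention_mask.sum(axis=1) - 1
    return hidden_states[np.arange(hidden_states.shape[0]), last_idx]

# Toy example: batch of 2, seq_len 4, dim 3, right-padded.
h = np.arange(24, dtype=float).reshape(2, 4, 3)
mask = np.array([[1, 1, 1, 0],   # 3 real tokens -> pool position 2
                 [1, 1, 1, 1]])  # 4 real tokens -> pool position 3
emb = last_token_pool(h, mask)
# emb[0] == h[0, 2], emb[1] == h[1, 3]
```

The pooled vectors are then typically L2-normalized and compared by cosine similarity; queries are prefixed with a task instruction (e.g. `Instruct: {task}\nQuery: {text}`) while documents are embedded as-is.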
https://huggingface.co/intfloat/e5-mistral-7b-instruct
Overall grade: C (Below Average)
Adoption: C+ · Quality: B+ · Freshness: A · Citations: C · Engagement: F
Specifications
- License: Open Source
- Pricing: open-source
- Capabilities:
- Integrations:
- Use Cases:
- API Available: No
- Parameters: 7B
- Context Window: 32K tokens
- Modalities: text
- Training Cutoff: Mid 2023
- Tags: embeddings, mistral, microsoft, open-source, mteb, instruction-tuned
- Added: 2026-04-24
- Completeness: 60%
Index Score: 44
- Adoption: 50
- Quality: 70
- Freshness: 80
- Citations: 40
- Engagement: 0