
Phi-3.5 MoE

by Microsoft · open-source · Last verified 2026-03-17

Microsoft's mixture-of-experts small language model with 42B total parameters, of which only 6.6B are active per token. It delivers performance competitive with much larger dense models while keeping inference costs low and offering strong multilingual capabilities.

https://azure.microsoft.com/en-us/products/phi
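As a quick illustration of the efficient-inference point above, here is a minimal loading sketch via Hugging Face transformers (one of the integrations listed below). The repo id "microsoft/Phi-3.5-MoE-instruct", the dtype, and the chat-template call are assumptions for illustration, not details taken from this listing.

    # Minimal sketch, assuming the instruct checkpoint is published on Hugging Face
    # as "microsoft/Phi-3.5-MoE-instruct" (an assumption, not stated in this listing).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/Phi-3.5-MoE-instruct"  # assumed repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,   # all 42B weights are loaded; only ~6.6B are active per token
        device_map="auto",
        trust_remote_code=True,
    )

    messages = [{"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))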
Grade: C+ (Average)
Adoption: B · Quality: B+ · Freshness: B+ · Citations: B · Engagement: F

Specifications

License: MIT
Pricing: open-source
Capabilities: text-generation, reasoning, multilingual, instruction-following, efficient-inference
Integrations: azure-ai, huggingface, vllm, langchain (serving sketch below)
Use Cases: cost-efficient-inference, multilingual-applications, enterprise-chatbots, document-processing
API Available: Yes
Parameters: 42B total (6.6B active per token)
Context Window: 128K tokens
Modalities: text
Training Cutoff: Mid 2024
Tags: slm, mixture-of-experts, open-weight, efficient, multilingual
Added: 2026-03-17
Completeness: 100%
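
Given the vLLM integration and 128K-token context window listed above, the sketch below serves the model behind vLLM's OpenAI-compatible server and queries it. The `vllm serve` invocation, model id, host, and port are illustrative assumptions, not values from this listing.

    # Minimal sketch: serve the model with vLLM's OpenAI-compatible server, then
    # query it over the full context window. Model id, host, and port are assumed.
    #
    #   vllm serve microsoft/Phi-3.5-MoE-instruct --max-model-len 131072
    #
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")
    response = client.chat.completions.create(
        model="microsoft/Phi-3.5-MoE-instruct",
        messages=[{
            "role": "user",
            "content": "Translate 'mixture of experts' into German and explain the concept in one sentence.",
        }],
        max_tokens=100,
    )
    print(response.choices[0].message.content)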

Index Score: 55.4

Adoption: 62
Quality: 78
Freshness: 72
Citations: 60
Engagement: 0
