
DeepSeek-V3

by DeepSeek · open-source · Last verified 2026-03-17

DeepSeek's frontier-class Mixture-of-Experts (MoE) model with 671B total parameters, of which 37B are activated per token. Trained with FP8 mixed precision for notably low training cost, it matches or exceeds GPT-4o and Claude 3.5 Sonnet on key benchmarks.
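
Since the card lists an available API, a minimal chat-completion call is sketched below. It assumes DeepSeek's OpenAI-compatible endpoint (https://api.deepseek.com) and the "deepseek-chat" model alias, which is reported to resolve to DeepSeek-V3; verify both against the official docs before relying on them.

# Minimal sketch: querying DeepSeek-V3 through its OpenAI-compatible API.
# The endpoint URL and the "deepseek-chat" alias are assumptions to verify.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],  # your DeepSeek API key
    base_url="https://api.deepseek.com",     # OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",  # alias reported to map to DeepSeek-V3
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain MoE routing in two sentences."},
    ],
    temperature=0.7,
)
print(response.choices[0].message.content)

Because the surface is OpenAI-compatible, existing OpenAI SDK code can generally be repointed by changing only the api_key and base_url.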

https://www.deepseek.com
B+ (Good)
Adoption: A · Quality: A+ · Freshness: A · Citations: A · Engagement: F

Specifications

License
MIT
Pricing
open-source
Capabilities
text-generation, code-generation, reasoning, math, multilingual, function-calling (see the function-calling sketch below)
Integrations
huggingface, vllm, ollama, langchain, llama-index (see the vLLM serving sketch below)
Use Cases
frontier-research, code-generation, complex-reasoning, enterprise-deployment, math-problem-solving
API Available
Yes
Parameters
671B (37B active)
Context Window
128K tokens
Modalities
text
Training Cutoff
Late 2024
Tags
llm, open-source, moe, frontier, fp8-training, deepseek
Added
2026-03-17
Completeness
100%
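
The function-calling capability listed above is exposed, per DeepSeek's documentation, through the standard OpenAI tools schema. The sketch below reuses the endpoint and "deepseek-chat" alias assumed earlier; the get_weather tool is purely illustrative, not a real API.

# Sketch: OpenAI-style function calling against DeepSeek-V3.
# The "get_weather" tool is hypothetical; the tools schema itself
# follows the standard OpenAI chat-completions format.
import json
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",  # assumed endpoint, as above
)

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # illustrative tool name
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

# The model returns a structured tool call rather than free text.
call = response.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))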
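Given the vLLM integration listed above and the MIT-licensed open weights, a local-serving sketch follows. Serving the full 671B-parameter model requires a large multi-GPU (typically multi-node) deployment; the Hugging Face repo id "deepseek-ai/DeepSeek-V3" and the parallelism setting shown are assumptions to check against the vLLM and Hugging Face documentation.

# Sketch: offline inference with vLLM, assuming the repo id
# "deepseek-ai/DeepSeek-V3". The 671B weights do not fit on one GPU;
# tensor_parallel_size must match your actual hardware.
from vllm import LLM, SamplingParams

llm = LLM(
    model="deepseek-ai/DeepSeek-V3",
    trust_remote_code=True,   # DeepSeek ships custom model code
    tensor_parallel_size=8,   # adjust to your GPU count
)

params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(
    ["Write a Python function that checks whether a number is prime."],
    params,
)
print(outputs[0].outputs[0].text)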

Index Score

72.8
Adoption
82
Quality
90
Freshness
85
Citations
88
Engagement
0
