
Jamba 1.5 Large

by AI21 Labs · open-source · Last verified 2026-03-17

AI21's flagship hybrid SSM-Transformer model with 398B total parameters and 94B active per token. Delivers frontier-class performance on reasoning and long-context tasks while maintaining the memory efficiency advantages of the Mamba architecture.
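The 398B-total / 94B-active split matters for capacity planning: weight memory scales with total parameters, while per-token compute scales with the active subset. A rough back-of-envelope sketch, assuming bf16 weights (2 bytes per parameter); the figures are estimates, not AI21-published deployment numbers:

```python
def param_bytes(n_params: int, bytes_per_param: int = 2) -> int:
    """Memory needed to hold model weights at the given precision (bf16 default)."""
    return n_params * bytes_per_param

TOTAL = 398_000_000_000   # total parameters, from the spec below
ACTIVE = 94_000_000_000   # parameters active per token

total_gib = param_bytes(TOTAL) / 1024**3
active_gib = param_bytes(ACTIVE) / 1024**3
print(f"weights: ~{total_gib:.0f} GiB; active per token: ~{active_gib:.0f} GiB")
```

The gap between the two numbers is why mixture-style models of this size are served on multi-GPU nodes even though each token touches only about a quarter of the weights.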

https://www.ai21.com/jamba
Overall grade: C (Below Average)
Adoption: C · Quality: A · Freshness: A · Citations: C+ · Engagement: F

Specifications

License
Jamba Open Model License
Pricing
open-source
Capabilities
text-generation, long-context, reasoning, structured-output, function-calling, efficient-inference
Integrations
huggingface, langchain, aws-bedrock, azure-ai, nvidia-nim
Use Cases
enterprise-automation, long-document-analysis, complex-reasoning, agentic-workflows, financial-analysis
API Available
Yes
Parameters
398B (94B active)
Context Window
256K tokens
Modalities
text
Training Cutoff
Mid 2024
Tags
llm, ssm, mamba, long-context, enterprise
Added
2026-03-17
Completeness
100%
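The spec above lists a hosted API with function-calling and structured output. A minimal sketch of a chat-style request payload, assuming an OpenAI-compatible schema; the field names and the example tool are illustrative assumptions, so check AI21's API reference for the real contract:

```python
import json

def build_request(messages, tools=None, max_tokens=1024):
    """Assemble a chat-completion payload for the hosted Jamba 1.5 Large endpoint.

    Assumes an OpenAI-style schema ("model", "messages", "tools"); verify
    against AI21's API documentation before use.
    """
    payload = {
        "model": "jamba-1.5-large",
        "messages": messages,
        "max_tokens": max_tokens,
    }
    if tools:
        payload["tools"] = tools
    return payload

req = build_request(
    [{"role": "user", "content": "Summarize the attached 200K-token filing."}],
    tools=[{
        "type": "function",
        "function": {
            "name": "lookup_ticker",  # hypothetical tool for illustration
            "parameters": {
                "type": "object",
                "properties": {"symbol": {"type": "string"}},
            },
        },
    }],
)
print(json.dumps(req, indent=2))
```

With a 256K-token context window, a request like this can carry a long document inline in the message content rather than relying on retrieval chunking.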

Index Score: 48.6
Adoption: 48
Quality: 82
Freshness: 80
Citations: 52
Engagement: 0

Explore the full AI ecosystem on Agents as a Service