Jamba 1.5 Mini
by AI21 Labs · open-source · Last verified 2026-03-17
AI21's compact hybrid SSM-Transformer model combining Mamba and attention layers for efficient long-context processing. Handles up to 256K tokens with significantly lower memory usage than pure transformer models of similar quality.
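The memory advantage described above comes from the hybrid layout: only attention layers grow a per-token KV cache with context length, while Mamba (SSM) layers carry fixed-size state. A toy sketch of such a stack; the 1-attention-per-8-layers ratio follows the original Jamba paper, and the depth here is an illustrative assumption, not taken from this card:

```python
def layer_stack(n_layers: int = 32, attn_every: int = 8) -> list[str]:
    """Return the layer type at each depth of a toy hybrid stack.
    Only 'attention' layers need a context-length-proportional KV cache;
    'mamba' layers keep constant-size recurrent state."""
    return ["attention" if i % attn_every == 0 else "mamba"
            for i in range(n_layers)]

def kv_cache_fraction(n_layers: int = 32, attn_every: int = 8) -> float:
    """Fraction of layers whose memory grows with context length."""
    stack = layer_stack(n_layers, attn_every)
    return stack.count("attention") / len(stack)
```

With these assumed numbers, only 1/8 of the layers hold a KV cache that scales with the 256K-token window, which is the rough intuition behind the lower memory usage versus a pure transformer of similar quality.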
https://www.ai21.com/jamba
Overall Grade: C (Below Average)
Adoption: C+ · Quality: B+ · Freshness: B+ · Citations: C+ · Engagement: F
Specifications
- License: Jamba Open Model License
- Pricing: open-source
- Capabilities: text-generation, long-context, efficient-inference, reasoning, structured-output
- Integrations: huggingface, langchain, aws-bedrock, azure-ai
- Use Cases: long-document-analysis, summarization, enterprise-chatbots, research, data-extraction
- API Available: Yes
- Parameters: 52B total (12B active)
- Context Window: 256K tokens
- Modalities: text
- Training Cutoff: mid-2024
- Tags: llm, ssm, mamba, long-context, efficient
- Added: 2026-03-17
- Completeness: 100%
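The card lists an available API and long-document-analysis as a use case. A minimal sketch of pairing a long document with a question over a chat API; the `ai21` Python SDK usage, the `jamba-1.5-mini` model id, and the `AI21_API_KEY` variable are all assumptions to verify against AI21's own documentation:

```python
import os

# Hypothetical model id, inferred from the product name; check AI21's
# docs for the exact identifier before use.
MODEL_ID = "jamba-1.5-mini"

def build_messages(document: str, question: str) -> list[dict]:
    """Assemble a chat payload that puts a long document in context,
    matching the long-document-analysis use case listed above."""
    return [
        {"role": "system", "content": "Answer strictly from the document."},
        {"role": "user", "content": f"{document}\n\nQuestion: {question}"},
    ]

def ask(document: str, question: str) -> str:
    # Deferred import so the sketch reads without the SDK installed.
    # Assumption: AI21's official Python client exposes this interface.
    from ai21 import AI21Client
    client = AI21Client(api_key=os.environ["AI21_API_KEY"])
    resp = client.chat.completions.create(
        model=MODEL_ID,
        messages=build_messages(document, question),
    )
    return resp.choices[0].message.content
```

The 256K-token window means entire reports can go into the user message without chunking, which is the main practical draw of this model class.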
Index Score: 49.35
- Adoption: 52
- Quality: 74
- Freshness: 78
- Citations: 55
- Engagement: 0