Mamba 2
by Carnegie Mellon / Princeton · open-source · Last verified 2026-03-17
Second-generation selective state space model that matches transformer quality while processing sequences in linear time. Introduces structured state space duality (SSD), which yields 2-8x higher training throughput than the original Mamba architecture.
https://github.com/state-spaces/mamba
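The linear-time claim above comes from the model's recurrent form: instead of attending over all previous tokens, a selective SSM carries a fixed-size state forward one step at a time. The sketch below illustrates that recurrence with plain NumPy, using the scalar decay per step that Mamba-2's SSD formulation assumes; the function name and shapes are illustrative, not the library's actual API.

```python
import numpy as np

def ssd_scan(x, a, B, C):
    """Illustrative linear-time scan of a selective state space model.

    Per step:  h_t = a_t * h_{t-1} + B_t * x_t ;  y_t = <C_t, h_t>
    a_t is an input-dependent scalar decay (Mamba-2's scalar-identity A);
    B_t and C_t are input-dependent vectors of size d_state.
    Memory stays O(d_state) regardless of sequence length, which is why
    the effective context window is unbounded.
    """
    seq_len, d_state = B.shape
    h = np.zeros(d_state)          # the entire carried context
    y = np.empty(seq_len)
    for t in range(seq_len):
        h = a[t] * h + B[t] * x[t]  # state update: O(d_state) work per token
        y[t] = C[t] @ h             # readout from the current state
    return y
```

In the real architecture this scan runs per channel with hardware-aware kernels, and SSD additionally exposes a block-matrix (attention-like) form of the same computation for fast parallel training.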
Overall grade: C (Below Average)
Adoption: D · Quality: B · Freshness: C+ · Citations: C+ · Engagement: F
Specifications
- License
- Apache 2.0
- Pricing
- open-source
- Capabilities
- text-generation, linear-time-inference, efficient-training, long-sequence-processing, state-space-modeling
- Integrations
- huggingface, transformers
- Use Cases
- efficient-inference, long-sequence-modeling, research, edge-deployment
- API Available
- No
- Parameters
- 2.7B
- Context Window
- Unbounded (fixed-size recurrent state; no attention window)
- Modalities
- text
- Training Cutoff
- Early 2024
- Tags
- llm, open-source, state-space-model, linear-complexity, efficient
- Added
- 2026-03-17
- Completeness
- 82%
Index Score: 40.15
- Adoption: 35
- Quality: 62
- Freshness: 58
- Citations: 55
- Engagement: 0