
Mamba: Linear-Time Sequence Modeling with Selective State Spaces

by Carnegie Mellon University / Together AI · free · Last verified 2026-03-17

Mamba is a novel sequence modeling architecture based on structured state space models (SSMs). It introduces a selection mechanism that allows the model to selectively propagate or forget information based on the input, overcoming a key limitation of previous SSMs. This enables Mamba to achieve Transformer-level performance with linear time complexity and significantly faster inference.
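The selection mechanism described above can be sketched as a simple recurrence: the SSM projections B, C and the discretization step Δ are computed from each input, so the state update itself decides what to keep or forget. The sketch below is a minimal, illustrative NumPy version; the weights `W_B`, `W_C`, `W_dt`, the softplus step size, and the diagonal-A zero-order-hold discretization are assumptions for clarity, not the paper's exact parameterization or its hardware-aware parallel scan.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, d_state, seq_len = 4, 8, 16
x = rng.standard_normal((seq_len, d_model))

# Fixed diagonal state matrix A (negative entries for a stable decay).
A = -np.exp(rng.standard_normal(d_state))            # (d_state,)

# Hypothetical input-dependent projections: in a selective SSM, B, C and
# the step size delta are functions of the current input x_t.
W_B = rng.standard_normal((d_model, d_state)) * 0.1
W_C = rng.standard_normal((d_model, d_state)) * 0.1
W_dt = rng.standard_normal(d_model) * 0.1

def selective_scan(x):
    """Sequential (recurrent) form of a selective SSM over one sequence."""
    h = np.zeros((d_model, d_state))                 # hidden state per channel
    ys = []
    for t in range(x.shape[0]):
        xt = x[t]                                    # (d_model,)
        delta = np.log1p(np.exp(xt @ W_dt))          # softplus -> positive step
        B_t = xt @ W_B                               # input-dependent B, (d_state,)
        C_t = xt @ W_C                               # input-dependent C, (d_state,)
        A_bar = np.exp(delta * A)                    # ZOH discretization, diagonal A
        B_bar = delta * B_t
        # Selection: delta near 0 preserves the state, large delta overwrites
        # it with the current input -- propagate or forget based on x_t.
        h = A_bar * h + np.outer(xt, B_bar)          # (d_model, d_state)
        ys.append(h @ C_t)                           # (d_model,)
    return np.stack(ys)

y = selective_scan(x)
print(y.shape)                                       # (16, 4)
```

Because each step only touches a fixed-size state, inference cost is constant per token and total time is linear in sequence length, in contrast to the quadratic cost of attention.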

https://arxiv.org/abs/2312.00752
Overall Grade: B (Above Average)
Adoption: B+ · Quality: A+ · Freshness: A · Citations: B · Engagement: F

Specifications

License
Apache 2.0
Pricing
free
Capabilities
sequence-modeling, linear-time-complexity, fast-autoregressive-inference, selective-state-spaces, long-context-handling, causal-language-modeling, efficient-hardware-aware-training, attention-free-architecture, recurrent-and-parallel-computation-modes
Integrations
Use Cases
API Available
No
Tags
mamba, state-space-model, ssm, linear-time, selective-state-space, recurrence, transformer-alternative, long-context, sequence-model, efficient-inference, ai-architecture
Added
2026-03-17
Completeness
95%

Index Score: 63.8
Adoption: 72 · Quality: 90 · Freshness: 80 · Citations: 68 · Engagement: 0
