
RWKV-5 1.5B

by RWKV Foundation · open-source · Last verified 2026-03-17

RWKV-5 (Eagle) is a 1.5-billion-parameter model from the RWKV architecture family that combines the training-time parallelism of transformers with the inference-time efficiency of RNNs. Inference runs in linear time with constant memory per token, making it exceptionally efficient for long-context tasks and edge deployment without the quadratic cost of attention mechanisms.
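The constant-memory claim can be illustrated with a toy linear recurrence. This is a sketch of the general idea only, not the actual RWKV-5 WKV kernel: a fixed-size state is updated once per token (memory stays O(d) as the sequence grows), whereas naive self-attention materializes an n×n score matrix.

```python
# Toy illustration (not the real RWKV-5 kernel): a linear recurrence keeps a
# fixed-size state per token, so memory stays constant as the sequence grows,
# unlike attention, whose score matrix grows quadratically with length.
import numpy as np

def recurrent_scan(xs, decay=0.9):
    """Process a sequence with a single fixed-size state vector.

    Memory: O(d) regardless of sequence length (constant-memory inference).
    Time:   O(n * d) -- linear in sequence length.
    """
    d = xs.shape[1]
    state = np.zeros(d)          # the only per-sequence memory we keep
    outputs = []
    for x in xs:                 # one step per token; state updated in place
        state = decay * state + x
        outputs.append(state.copy())
    return np.stack(outputs)

def attention_scores(xs):
    """Naive self-attention score matrix: O(n^2) time and memory."""
    return xs @ xs.T             # n x n matrix, quadratic in sequence length

seq = np.random.default_rng(0).normal(size=(8, 4))
out = recurrent_scan(seq)
print(out.shape)                 # (8, 4): one fixed-size output per token
print(attention_scores(seq).shape)  # (8, 8): quadratic footprint
```

Doubling the sequence length doubles the recurrence's work but quadruples the attention matrix, which is why RNN-style inference suits streaming and memory-constrained deployment.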

https://huggingface.co/RWKV/v5-Eagle-World-1B5-v2-20240520-ctx4096
Index grade: C (Below Average)
Adoption: C · Quality: B · Freshness: B · Citations: C+ · Engagement: F

Specifications

License
Apache 2.0
Pricing
open-source
Capabilities
text-generation, long-context-processing, constant-memory-inference, multilingual-generation
Integrations
Hugging Face, RWKV.cpp, Ollama
Use Cases
long document processing, streaming inference on edge, multilingual assistants, memory-constrained deployment
API Available
Yes
Parameters
~1.5B
Context Window
4K (4,096 tokens)
Modalities
text
Training Cutoff
2024
Tags
rwkv, rnn, efficient, open-source, linear-attention
Added
2026-03-17
Completeness
100%

Index Score

44.7
Adoption
42
Quality
67
Freshness
68
Citations
58
Engagement
0
