
RoFormer: Enhanced Transformer with Rotary Position Embedding

by Zhuiyi Technology · free · Last verified 2026-03-17

Introduces Rotary Position Embedding (RoPE), which encodes absolute position with a rotation matrix so that self-attention naturally depends on relative position. Adopted by LLaMA, PaLM 2, and many other modern LLMs for its length-generalization properties.

https://arxiv.org/abs/2104.09864
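
To make the mechanism concrete, here is a minimal NumPy sketch of RoPE (the function name and shapes are illustrative, not from the paper): each consecutive pair of vector dimensions is rotated by an angle proportional to the token position, using the paper's default frequencies θ_i = 10000^(−2i/d). Because rotations compose, the dot product of a rotated query and key depends only on their relative offset.

```python
import numpy as np

def rope(x, pos, base=10000.0):
    """Apply Rotary Position Embedding to a vector x at position `pos`.

    Illustrative sketch: x is a (d,) array with even d; each pair
    (x[2i], x[2i+1]) is rotated by angle pos * base**(-2i/d), the
    per-dimension frequency from the paper.
    """
    d = x.shape[-1]
    freqs = base ** (-np.arange(0, d, 2) / d)   # theta_i = 10000^(-2i/d)
    angles = pos * freqs                        # one angle per 2-D pair
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin             # 2-D rotation of each pair
    out[1::2] = x1 * sin + x2 * cos
    return out

# The score <rope(q, m), rope(k, n)> depends only on the offset m - n:
rng = np.random.default_rng(0)
q, k = rng.standard_normal(8), rng.standard_normal(8)
s1 = rope(q, 5) @ rope(k, 2)       # offset 3
s2 = rope(q, 105) @ rope(k, 102)   # same offset, shifted absolute positions
assert np.isclose(s1, s2)          # relative-position property holds
```

The final assertion is the point of the method: shifting both positions by the same amount leaves the attention score unchanged, which is what allows RoPE-based models to generalize across positions.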
Overall: B+ (Good)
Adoption: A · Quality: A+ · Freshness: B · Citations: B+ · Engagement: F

Specifications

License: Open Access
Pricing: free
Capabilities: positional-encoding, long-context-generalization, relative-position-attention
Integrations: huggingface-transformers
Use Cases: long-context-language-modeling, sequence-modeling
API Available: No
Tags: rope, rotary-position-embedding, positional-encoding, transformers, long-context
Added: 2026-03-17
Completeness: 100%

Index Score: 71.4

Adoption: 86
Quality: 90
Freshness: 60
Citations: 76
Engagement: 0
