Attention Is All You Need
by Google Brain · free · Last verified 2026-03-17
Introduced the Transformer architecture, which replaces recurrent networks with self-attention for sequence-to-sequence tasks. This paper fundamentally changed the field of NLP and became the foundation for virtually all modern large language models.
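The paper's core operation is scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A minimal NumPy sketch (function name and toy shapes are illustrative, not from the paper):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the attention from the paper."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# toy self-attention: 3 tokens with dimension 4, Q = K = V
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (3, 4)
```

The 1/√d_k scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.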
https://arxiv.org/abs/1706.03762
Overall Grade: A (Great)
- Adoption: A+
- Quality: A+
- Freshness: D
- Citations: A+
- Engagement: F
Specifications
- License: Open Access
- Pricing: free
- Capabilities: sequence-modeling, attention-mechanism, machine-translation
- Integrations:
- Use Cases: machine-translation, text-generation, language-modeling
- API Available: No
- Tags: transformers, attention, nlp, foundational, architecture
- Added: 2026-03-17
- Completeness: 100%
Index Score: 84.1
- Adoption: 99
- Quality: 99
- Freshness: 35
- Citations: 99
- Engagement: 0