Language Models are Unsupervised Multitask Learners (GPT-2)
by OpenAI · free · Last verified 2026-03-17
Introduced GPT-2, demonstrating that large language models trained on diverse web text can perform zero-shot transfer across many NLP tasks without task-specific fine-tuning. Showed emergent capabilities at scale and sparked debate on responsible model release.
https://openai.com/research/language-unsupervised
Grade: B+ (Good)
Adoption: A · Quality: A · Freshness: D · Citations: A+ · Engagement: F
Specifications
- License: MIT
- Pricing: free
- Capabilities: text-generation, zero-shot-learning, summarization, question-answering
- Integrations: huggingface-transformers
- Use Cases: text-generation, summarization, zero-shot-nlp
- API Available: No
- Tags: gpt-2, language-modeling, zero-shot, generative, openai, foundational
- Added: 2026-03-17
- Completeness: 100%
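The zero-shot capabilities listed above rest on one idea from the paper: every task is cast as plain text continuation, so a single language model handles many tasks with no task-specific heads or fine-tuning. A minimal sketch of that prompt framing (template wording is illustrative except the "TL;DR:" summarization cue, which is the one reported in the paper):

```python
# Sketch of GPT-2-style zero-shot task framing: the task is specified
# inside the prompt text itself, and the model simply continues it.

def zero_shot_prompt(task: str, text: str) -> str:
    """Build a prompt that encodes the task as a text-continuation cue."""
    templates = {
        # "TL;DR:" is the summarization cue reported in the GPT-2 paper.
        "summarization": f"{text}\nTL;DR:",
        # Translation framed as a parallel-text pattern (illustrative).
        "translation-en-fr": f"English: {text}\nFrench:",
    }
    return templates[task]

# Whatever the model generates after the cue is read off as the answer.
prompt = zero_shot_prompt("summarization", "GPT-2 was trained on WebText.")
```

In practice this prompt would be fed to the model (e.g. via the huggingface-transformers integration listed above) and the continuation decoded as the task output.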
Index Score: 75.8
- Adoption: 88
- Quality: 88
- Freshness: 38
- Citations: 92
- Engagement: 0