GPT-2

by OpenAI · open-source · Last verified 2026-03-17

GPT-2 is OpenAI's 2019 autoregressive language model, the first to demonstrate that large-scale unsupervised pre-training on internet text could yield coherent, fluent long-form generation along with zero-shot task performance. OpenAI's initial staged release, which withheld the largest checkpoint, sparked a global debate about AI safety and the responsible disclosure of capable AI systems.

https://huggingface.co/openai-community/gpt2-xl
Overall grade: B+ (Good)

Adoption: A · Quality: B · Freshness: F · Citations: A+ · Engagement: F

Specifications

License: MIT
Pricing: open-source
Capabilities: text-generation, zero-shot-classification, story-generation, language-modeling
Integrations: Hugging Face, PyTorch, TensorFlow
Use Cases: text completion, creative writing assistance, language model research, educational AI demonstrations
API Available: Yes
Parameters: ~1.5B
Context Window: 1,024 tokens
Modalities: text
Training Cutoff: 2019
Tags: foundational, openai, autoregressive, text-generation, historical
Added: 2026-03-17
Completeness: 100%
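
Given the Hugging Face integration listed above, a minimal quick-start might look like the following sketch. The model ID is taken from this page's Hugging Face link; the prompt and generation settings are illustrative assumptions, not part of the listing:

```python
from transformers import pipeline, set_seed

# Sketch only: the repo ID comes from the Hugging Face link above.
# The XL checkpoint is a multi-GB download; the base "gpt2" (124M)
# checkpoint is a faster stand-in with the same interface.
generator = pipeline("text-generation", model="openai-community/gpt2-xl")
set_seed(42)  # make sampling reproducible

result = generator(
    "GPT-2 is an autoregressive language model that",
    max_new_tokens=40,  # prompt + output must fit the 1,024-token context
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```

Because all GPT-2 sizes share one architecture and tokenizer, swapping the model ID for a smaller checkpoint changes only download size and output quality, not the calling code.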

Index Score: 70.8

Adoption: 85
Quality: 65
Freshness: 20
Citations: 95
Engagement: 0
