
BERT

by Google · open-source · Last verified 2026-03-17

BERT (Bidirectional Encoder Representations from Transformers) is Google's landmark 2018 language model that introduced the bidirectional pre-training paradigm using masked language modeling and next sentence prediction. It revolutionized NLP by demonstrating that a single pre-trained model could achieve state-of-the-art results across a wide range of downstream tasks with only minimal task-specific fine-tuning.
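
The masked language modeling objective described above can be exercised directly with the Hugging Face Transformers pipeline API. A minimal sketch, assuming `transformers` and a PyTorch backend are installed; the model id comes from the Hugging Face page linked below, and the example sentence is illustrative only.

```python
# Minimal sketch: masked language modeling with bert-base-uncased via the
# Hugging Face Transformers fill-mask pipeline.
from transformers import pipeline

# Model id taken from the Hugging Face page linked in this entry.
fill_mask = pipeline("fill-mask", model="google-bert/bert-base-uncased")

# The uncased tokenizer lowercases input; [MASK] is BERT's mask token.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(f"{prediction['token_str']:>10}  score={prediction['score']:.3f}")
```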

https://huggingface.co/google-bert/bert-base-uncased
Overall grade: B+ (Good)
Adoption: A+ · Quality: B+ · Freshness: D · Citations: A+ · Engagement: F

Specifications

License: Apache 2.0
Pricing: open-source
Capabilities: text-classification, named-entity-recognition, question-answering, sentence-similarity
Integrations: Hugging Face, TensorFlow, PyTorch
Use Cases: text classification, NER, question answering, semantic search foundation, downstream fine-tuning
API Available: Yes
Parameters: ~110M
Context Window: 512 tokens
Modalities: text
Training Cutoff: 2018
Tags: foundational, google, transformer, encoder, nlp
Added: 2026-03-17
Completeness: 100%
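
The capabilities and 512-token context window above translate into the usual downstream fine-tuning setup. A minimal sketch, assuming `transformers` and `torch` are installed; the classification head, `num_labels=2`, and the example sentence are placeholders for illustration, not details from this entry.

```python
# Minimal sketch: loading BERT with a sequence-classification head for
# downstream fine-tuning, truncating inputs to the 512-token context window.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "google-bert/bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Respect BERT's maximum sequence length of 512 tokens.
inputs = tokenizer(
    "An example sentence to classify.",
    truncation=True,
    max_length=512,
    return_tensors="pt",
)

with torch.no_grad():
    logits = model(**inputs).logits

# The classification head is randomly initialized here; scores are
# meaningless until the model is fine-tuned on labeled data.
print(logits.softmax(dim=-1))
```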

Index Score

Overall: 76.3
Adoption: 92
Quality: 75
Freshness: 30
Citations: 98
Engagement: 0
