BLOOM
by BigScience Workshop · open-source · Last verified 2026-03-17
BLOOM (BigScience Large Open-science Open-access Multilingual Language Model) is a 176-billion-parameter multilingual language model created by a collaborative initiative of over 1,000 researchers from 70+ countries. Trained on 46 natural languages and 13 programming languages, it was one of the first truly open-access large language models at GPT-3 scale.
https://huggingface.co/bigscience/bloom
Grade: B (Above Average)
Adoption: B · Quality: B+ · Freshness: C · Citations: A · Engagement: F
Specifications
- License: BigScience RAIL-M
- Pricing: open-source
- Capabilities: multilingual-text-generation, code-generation, text-completion, few-shot-learning
- Integrations: Hugging Face, DeepSpeed, PyTorch
- Use Cases: multilingual NLP research, low-resource language generation, academic LLM research baseline, open-source AI development
- API Available: Yes
- Parameters: ~176B
- Context Window: 2K tokens
- Modalities: text
- Training Cutoff: 2022
- Tags: foundational, bigscience, multilingual, open-source, collaborative
- Added: 2026-03-17
- Completeness: 100%
Index Score: 61.6
- Adoption: 65
- Quality: 72
- Freshness: 40
- Citations: 85
- Engagement: 0