Falcon 180B
by Technology Innovation Institute · open-source · Last verified 2026-03-17
Technology Innovation Institute's 180-billion-parameter open model, trained on 3.5 trillion tokens of RefinedWeb data. It is one of the largest openly available language models, demonstrating that carefully curated web data can rival proprietary training sets.
https://huggingface.co/tiiuae/falcon-180B
Overall grade: C (Below Average)
- Adoption: C
- Quality: B
- Freshness: D
- Citations: C+
- Engagement: F
Specifications
- License
- Falcon-180B TII License
- Pricing
- open-source
- Capabilities
- text-generation, reasoning, code-generation, summarization, question-answering
- Integrations
- huggingface, vllm, text-generation-inference
- Use Cases
- text-generation, research, enterprise-ai, content-creation
- API Available
- No
- Parameters
- 180B
- Context Window
- 2K tokens
- Modalities
- text
- Training Cutoff
- Early 2023
- Tags
- llm, open-source, large-scale, tii, web-data
- Added
- 2026-03-17
- Completeness
- 88%
Index Score: 46.55
- Adoption: 48
- Quality: 68
- Freshness: 35
- Citations: 55
- Engagement: 0