
SmolLM 1.7B

by Hugging Face · open-source · Last verified 2026-03-17

SmolLM 1.7B is Hugging Face's compact language model, trained on SmolLM-Corpus, a curated dataset of high-quality web, code, and math data. It performs strongly for its size and is designed for efficient on-device deployment and real-time inference.

https://huggingface.co/HuggingFaceTB/SmolLM-1.7B
Overall Grade: C (Below Average)

Adoption: C+ · Quality: B+ · Freshness: A · Citations: C · Engagement: F

Specifications

License
Apache 2.0
Pricing
open-source
Capabilities
text-generation, code-generation, math-reasoning, on-device-inference
Integrations
Hugging Face, Transformers.js, Ollama
Use Cases
browser-side AI inference, mobile AI assistants, code completion on device, offline AI applications
API Available
Yes
Parameters
~1.7B
Context Window
2K
Modalities
text
Training Cutoff
2024
Tags
small, edge, huggingface, efficient, on-device
Added
2026-03-17
Completeness
100%
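
The Transformers integration and on-device text/code generation listed above can be sketched as a minimal loading snippet. The repo id comes from the Hugging Face link above; the `generate` helper and its defaults are illustrative, not part of the listing.

```python
# Minimal sketch: generating text with SmolLM 1.7B via the Hugging Face
# `transformers` library named under Integrations. The repo id matches the
# linked repository; everything else here is an illustrative assumption.

MODEL_ID = "HuggingFaceTB/SmolLM-1.7B"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Download the checkpoint and decode a continuation of `prompt`."""
    # Imported lazily so the sketch can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    # Prompt plus new tokens should stay within the 2K context window
    # listed under Specifications.
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Usage (downloads the model weights on first call):
#   print(generate("def fibonacci(n):"))
```

The same checkpoint can also run in the browser via Transformers.js or locally via Ollama, per the Integrations row above.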

Index Score

48.4
Adoption
55
Quality
72
Freshness
82
Citations
48
Engagement
0
