AI Tools & APIs · v1.48

LiteLLM

by BerriAI · open-source · Last verified 2026-03-17

Unified API proxy for calling 100+ LLM providers using the OpenAI format. Provides load balancing, fallbacks, spend tracking, and rate limiting across multiple model providers.
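
A minimal sketch of the unified calling convention, assuming `litellm` is installed (`pip install litellm`). LiteLLM accepts OpenAI-format chat payloads for every provider and selects the provider via a `provider/model` prefix in the model string; the helper names and model identifiers below are illustrative, not part of the SDK.

```python
def make_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-format chat request; LiteLLM accepts this
    same shape regardless of the underlying provider."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(model: str, prompt: str) -> str:
    """Route one prompt through LiteLLM (makes a network call;
    requires the matching provider's API key in the environment)."""
    from litellm import completion  # real SDK entry point; response is OpenAI-compatible
    response = completion(**make_payload(model, prompt))
    return response.choices[0].message.content


# Switching providers is just a different model string -- the payload
# shape does not change (model names here are illustrative):
openai_req = make_payload("openai/gpt-4o", "Hello")
anthropic_req = make_payload("anthropic/claude-3-5-sonnet-20240620", "Hello")
```

This is what "API unification" means in practice: client code is written once against the OpenAI format, and provider switching or fallback only changes the model string.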

https://litellm.ai
Overall Grade: B (Above Average)
Adoption: B+ · Quality: A · Freshness: A+ · Citations: B · Engagement: F

Specifications

License: MIT
Pricing: open-source
Capabilities: multi-provider-routing, load-balancing, spend-tracking, rate-limiting, fallback-management
Integrations: openai, anthropic, together-ai, groq, hugging-face, langchain
Use Cases: api-unification, cost-optimization, provider-switching, enterprise-gateway
API Available: Yes
SDK Languages: python
Deployment: self-hosted, docker, cloud
Rate Limits: N/A (open-source, self-managed)
Data Privacy: self-hosted, user-managed; no data sent to BerriAI
Tags: llm-proxy, multi-provider, api-gateway, load-balancing
Added: 2026-03-17
Completeness: 100%
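
For the self-hosted and Docker deployments listed above, the proxy is driven by a YAML config. A minimal sketch (model aliases, model identifiers, and settings are illustrative; the `os.environ/` syntax is how LiteLLM reads keys from the environment):

```yaml
model_list:
  - model_name: gpt-4o              # alias that clients request
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY

litellm_settings:
  num_retries: 2                    # retry failed calls before falling back

router_settings:
  routing_strategy: simple-shuffle  # load-balance across deployments
```

Clients then point any OpenAI-compatible SDK at the proxy's base URL and request models by alias, which is how the load balancing, fallback, and spend tracking listed above are applied centrally.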

Index Score

Overall: 61.45
Adoption: 72
Quality: 82
Freshness: 90
Citations: 65
Engagement: 0

Explore the full AI ecosystem on Agents as a Service