Kong AI Gateway
by Kong Inc. · freemium · Last verified 2026-03-17
AI-native API gateway that provides a unified control plane for managing, securing, and observing all LLM traffic across any provider. Kong AI Gateway adds semantic caching, prompt injection protection, token rate limiting, and cost attribution on top of the battle-tested Kong Gateway.
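The semantic-caching capability listed above can be pictured with a minimal sketch (this is illustrative, not Kong's implementation; the embedding function and similarity threshold are stand-ins): instead of matching prompts by exact text, the cache compares prompt embeddings, so a paraphrased prompt can hit the cached completion of a semantically similar earlier prompt.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class SemanticCache:
    """Toy semantic cache: serve a stored LLM response when a new prompt's
    embedding is close enough to a previously cached prompt's embedding."""

    def __init__(self, embed, threshold=0.9):
        self.embed = embed          # embedding function (stand-in here)
        self.threshold = threshold  # similarity cutoff for a cache hit
        self.entries = []           # list of (embedding, cached_response)

    def get(self, prompt):
        v = self.embed(prompt)
        best = max(self.entries, key=lambda e: cosine(v, e[0]), default=None)
        if best and cosine(v, best[0]) >= self.threshold:
            return best[1]          # cache hit: skip the upstream LLM call
        return None                 # cache miss: caller proxies to the LLM

    def put(self, prompt, response):
        self.entries.append((self.embed(prompt), response))
```

In a real gateway the embeddings come from an embedding model and the entries live in a vector store; the threshold trades cache hit rate against the risk of serving a stale or off-topic answer.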
https://konghq.com/products/kong-ai-gateway
Overall grade: C+ (Average)
Adoption: B · Quality: A · Freshness: A · Citations: C+ · Engagement: F
Specifications
- License: Apache 2.0
- Pricing: freemium
- Capabilities: llm-load-balancing, semantic-caching, prompt-injection-detection, token-rate-limiting, cost-attribution
- Integrations: openai, anthropic, azure-openai, cohere, kubernetes, prometheus, datadog
- Use Cases: multi-llm-routing, ai-cost-management, enterprise-llm-governance, semantic-cache, llm-observability
- API Available: Yes
- SDK Languages: go, lua, python
- Deployment: self-hosted, cloud, kubernetes, docker, kong-konnect
- Rate Limits: Configurable per-route token and request limits
- Data Privacy: SOC 2 Type II; self-hosted option for full data control
- Tags: api-gateway, llm-proxy, rate-limiting, semantic-caching, observability
- Added: 2026-03-17
- Completeness: 100%
Index Score: 58.1
- Adoption: 65
- Quality: 88
- Freshness: 88
- Citations: 58
- Engagement: 0