Groq
by Groq · freemium · Last verified 2026-03-17
Groq is an AI inference company that provides ultra-fast access to open-source large language models. It leverages its custom-designed Language Processing Unit (LPU) hardware to deliver industry-leading token generation speeds, significantly reducing latency for real-time applications via an OpenAI-compatible API.
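Because the API is OpenAI-compatible, a chat request to Groq has the same JSON shape as an OpenAI `/chat/completions` call. A minimal sketch of building such a request body; the model name `llama-3.1-8b-instant` and the exact base URL are illustrative assumptions, not taken from this entry:

```python
import json

# Assumed base URL, following the OpenAI-compatible convention described above.
GROQ_BASE_URL = "https://api.groq.com/openai/v1"

def build_chat_request(model: str, user_message: str, stream: bool = False) -> dict:
    """Build the JSON body for an OpenAI-style /chat/completions call."""
    return {
        "model": model,  # hypothetical model name in the usage below
        "messages": [
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": user_message},
        ],
        "stream": stream,  # streaming responses are listed as a capability
    }

body = build_chat_request("llama-3.1-8b-instant", "Why is LPU inference fast?")
print(json.dumps(body, indent=2))
```

The same body would be POSTed to `{GROQ_BASE_URL}/chat/completions` with a bearer token, whether via the official SDKs or plain HTTP.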
https://groq.com
Overall grade: B (Above Average)
Adoption: B+ · Quality: A · Freshness: A+ · Citations: B+ · Engagement: F
Specifications
- License
- Proprietary
- Pricing
- freemium
- Capabilities
- ultra-fast-inference, high-token-throughput, lpu-hardware-acceleration, openai-compatible-api, support-for-open-source-llms, streaming-responses, json-mode, function-calling-and-tool-use, deterministic-performance
- API Available
- Yes
- SDK Languages
- python, typescript
- Deployment
- cloud
- Rate Limits
- Free tier: 30 req/min; paid plans: higher limits
- Data Privacy
- SOC 2 Type II compliant; no training on user data
- Tags
- inference, lpu, low-latency, fast-inference, llm-api, hardware-acceleration, real-time-ai, token-generation, conversational-ai, open-source-models, api-platform
- Added
- 2026-03-17
- Completeness
- 90%
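Given the free-tier limit of 30 requests/min listed above, a client typically throttles itself rather than waiting for HTTP 429 responses. A minimal sliding-window limiter sketch (the class name and clock-injection design are illustrative, not part of any Groq SDK):

```python
import time
from collections import deque

class MinuteRateLimiter:
    """Client-side throttle for a per-minute request cap (e.g. the
    listed free tier of 30 req/min). Blocks until the oldest request
    in the sliding 60-second window has expired."""

    def __init__(self, max_per_minute: int = 30, clock=time.monotonic, sleep=time.sleep):
        self.max = max_per_minute
        self.clock = clock   # injectable for testing
        self.sleep = sleep
        self.sent = deque()  # timestamps of recent requests

    def _evict(self, now: float) -> None:
        # Drop timestamps older than the 60 s window.
        while self.sent and now - self.sent[0] >= 60:
            self.sent.popleft()

    def acquire(self) -> None:
        """Call before each API request; sleeps if the cap is reached."""
        now = self.clock()
        self._evict(now)
        if len(self.sent) >= self.max:
            self.sleep(60 - (now - self.sent[0]))
            now = self.clock()
            self._evict(now)
        self.sent.append(now)

limiter = MinuteRateLimiter(max_per_minute=30)
limiter.acquire()          # first call passes immediately
print(len(limiter.sent))   # → 1
```

Paid plans advertise higher limits, so `max_per_minute` would be configured per plan; a production client would also still handle 429 responses with backoff as a fallback.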
Index Score: 64.5
- Adoption: 75
- Quality: 85
- Freshness: 92
- Citations: 70
- Engagement: 0