LLM Load Testing
by AaaS · open-source · Last verified 2026-03-01
Load tests LLM API endpoints with configurable concurrency, request patterns, and duration. Measures throughput, latency percentiles (p50/p95/p99), time-to-first-token, error rates, and generates performance reports with degradation alerts.
https://aaas.blog/script/llm-load-testing
Overall: C+ (Average)
Adoption: C+ · Quality: A · Freshness: A · Citations: C · Engagement: F
Specifications
- License: MIT
- Pricing: open-source
- Capabilities: concurrent-requests, latency-profiling, ttft-measurement, error-tracking, report-generation
- Integrations: locust, aiohttp, openai, pandas
- Use Cases: capacity-planning, sla-validation, performance-regression-testing, infrastructure-sizing
- API Available: No
- Language: python
- Dependencies: locust, aiohttp, openai, pandas, matplotlib
- Environment: Python 3.11+
- Est. Runtime: 5-30 minutes depending on test duration
- Tags: script, automation, load-testing, performance, benchmarking
- Added: 2026-03-17
- Completeness: 100%
Index Score: 50.8
- Adoption: 56
- Quality: 82
- Freshness: 80
- Citations: 48
- Engagement: 0