EnergyBench
by Lannelongue et al. / EMBL-EBI · open-source · Last verified 2026-03-17
EnergyBench quantifies the energy consumption and carbon footprint of AI inference across hardware and software configurations. It correlates task accuracy with joules consumed, enabling practitioners to make informed accuracy-efficiency trade-offs for sustainable AI deployment.
https://github.com/GreenAlgorithms/GreenAlgorithms4HPC
C (Below Average)
Adoption: C · Quality: A · Freshness: B+ · Citations: C+ · Engagement: F
Specifications
- License
- MIT
- Pricing
- open-source
- Capabilities
- evaluation, energy-measurement, carbon-estimation
- Integrations
- codecarbon, mlco2
- Use Cases
- model-evaluation, sustainable-ai, efficiency-optimization
- API Available
- No
- Evaluated Models
- gpt-4o, llama-3-70b, phi-3-mini, mistral-7b
- Metrics
- joules-per-token, co2-grams-per-token, accuracy-per-joule
- Methodology
- Energy is measured via hardware power sensors (RAPL for CPUs, NVML for NVIDIA GPUs) during standardized inference runs on MMLU. Joules-per-token and CO2-grams-per-token (derived using regional grid carbon intensity) are compared against MMLU accuracy to compute accuracy-per-joule Pareto curves.
- Last Run
- 2026-02-18
- Tags
- energy, efficiency, sustainability, carbon, inference
- Added
- 2026-03-17
- Completeness
- 100%
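The methodology above can be sketched in a few lines of Python. This is an illustrative computation under assumed inputs (power sampled in watts at a fixed interval, a token count, and a regional carbon intensity in gCO2e/kWh), not EnergyBench's actual code; all function and parameter names here are hypothetical:

```python
def joules_per_token(power_samples_w, interval_s, tokens_generated):
    """Integrate sampled power (watts) over time to get total joules,
    then normalize by the number of tokens generated."""
    total_joules = sum(power_samples_w) * interval_s
    return total_joules / tokens_generated

def co2_grams_per_token(jpt, carbon_intensity_g_per_kwh):
    """Convert joules/token to grams CO2e/token using the regional
    grid carbon intensity. 1 kWh = 3.6e6 J."""
    kwh_per_token = jpt / 3.6e6
    return kwh_per_token * carbon_intensity_g_per_kwh

def accuracy_per_joule(accuracy, jpt):
    """Efficiency score: benchmark accuracy gained per joule spent
    per token (higher is better)."""
    return accuracy / jpt

def pareto_frontier(points):
    """Given (joules_per_token, accuracy) pairs, keep the models that
    are not dominated, i.e. no other model is both cheaper in energy
    and more accurate. Sort by energy ascending and keep each point
    that improves on the best accuracy seen so far."""
    best_acc = float("-inf")
    frontier = []
    for jpt, acc in sorted(points):
        if acc > best_acc:
            frontier.append((jpt, acc))
            best_acc = acc
    return frontier
```

For example, 10 samples of 250 W taken at 0.5 s intervals while generating 500 tokens yields 1250 J total, i.e. 2.5 J/token; at a grid intensity of 400 gCO2e/kWh that is roughly 2.8e-4 gCO2e per token.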
Index Score
- Overall
- 49
- Adoption
- 48
- Quality
- 80
- Freshness
- 79
- Citations
- 55
- Engagement
- 0