AMD Instinct MI325X
by AMD · paid · Last verified 2026-03-17
An enhanced refresh of the MI300X with 256 GB of HBM3e memory and higher memory bandwidth. It targets hyperscalers and enterprises seeking maximum memory capacity for inference of the largest open models, such as Llama and Mixtral.
https://www.amd.com/en/products/accelerators/instinct/mi300/mi325x.html
Overall Grade: C (Below Average)
Adoption: C · Quality: A+ · Freshness: A · Citations: C+ · Engagement: F
Specifications
- License: Proprietary
- Pricing: paid
- Capabilities: ai-training, inference, fp8-compute, high-capacity-memory
- Integrations: rocm, pytorch, tensorflow, vllm
- Use Cases: large-model-inference, llm-training, hpc
- API Available: No
- Tags: gpu, data-center, training, inference, amd, cdna3
- Added: 2026-03-17
- Completeness: 100%
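As a rough illustration of why the 256 GB HBM3e capacity noted in the description matters for single-device inference, the sketch below estimates whether a model's weights fit on one accelerator at a given precision. The model sizes and the 20% overhead factor (for KV cache and activations) are illustrative assumptions, not vendor figures.

```python
HBM_CAPACITY_GB = 256  # MI325X HBM3e capacity (from the description above)

def weights_fit(params_billion: float, bytes_per_param: float,
                overhead: float = 1.2) -> bool:
    """True if model weights, plus a rough overhead factor for KV cache
    and activations, fit within a single accelerator's HBM."""
    needed_gb = params_billion * bytes_per_param * overhead
    return needed_gb <= HBM_CAPACITY_GB

# A 70B-parameter model at fp8 (1 byte/param): ~84 GB with overhead -> fits.
print(weights_fit(70, 1))    # True
# A 405B-parameter model at fp16 (2 bytes/param): ~972 GB -> needs multiple cards.
print(weights_fit(405, 2))   # False
```

The same arithmetic explains the positioning against smaller-memory parts: at fp8, a single 256 GB card can hold weights for models well past the 100B-parameter range without tensor-parallel sharding.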
Index Score
- Overall: 49.5
- Adoption: 42
- Quality: 91
- Freshness: 88
- Citations: 58
- Engagement: 0