
NVIDIA GB200 NVL72

by NVIDIA · paid · Last verified 2026-03-17

The GB200 Grace Blackwell Superchip combines one NVIDIA Grace CPU with two Blackwell B200 GPUs on a single module. The NVL72 rack system connects 36 GB200 Superchips (72 GPUs) via NVLink Switch, forming a single 72-GPU scale-up domain for frontier model training and large-scale inference.

https://www.nvidia.com/en-us/data-center/gb200-nvl72/
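The scale-up arithmetic behind the description can be sketched in a few lines. The per-GPU NVLink bandwidth figure below is an assumption taken from NVIDIA's public GB200 NVL72 materials, not from this listing; treat it as illustrative.

```python
# Sketch of the NVL72 scale-up arithmetic described above.
# NVLINK_BW_PER_GPU_TBPS is an assumed figure from NVIDIA's public
# marketing specs (NVLink 5), not a value stated on this page.

SUPERCHIPS_PER_RACK = 36      # GB200 Superchips in one NVL72 rack
GPUS_PER_SUPERCHIP = 2        # each Superchip pairs 1 Grace CPU with 2 B200 GPUs
NVLINK_BW_PER_GPU_TBPS = 1.8  # assumed per-GPU NVLink bandwidth, TB/s

gpus = SUPERCHIPS_PER_RACK * GPUS_PER_SUPERCHIP
aggregate_nvlink_tbps = gpus * NVLINK_BW_PER_GPU_TBPS

print(f"GPUs in NVLink domain: {gpus}")  # 72 -- hence "NVL72"
print(f"Aggregate NVLink bandwidth: {aggregate_nvlink_tbps:.0f} TB/s")
```

Under these assumptions the rack presents 72 GPUs behind one NVLink fabric, which is what distinguishes it from clusters stitched together over slower scale-out networks.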
Overall grade: C+ (Average)
Adoption: C+ · Quality: A+ · Freshness: A+ · Citations: B+ · Engagement: F

Specifications

License
Proprietary
Pricing
paid
Capabilities
ai-training, inference, fp4-compute, nvlink-switch, rack-scale-compute, cpu-gpu-integration
Integrations
cuda, tensorrt, nccl, cudnn, nvlink-switch
Use Cases
frontier-model-training, large-scale-inference, hpc
API Available
No
Tags
gpu, data-center, training, inference, blackwell, grace-blackwell, rack-scale
Added
2026-03-17
Completeness
100%

Index Score

58.8
Adoption
50
Quality
100
Freshness
98
Citations
75
Engagement
0
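The composite of 58.8 is not a plain mean of the five subscores (that would give 64.6), so the index presumably weights the dimensions unequally. A minimal sketch of such a weighted composite, with the weighting treated as an unpublished unknown:

```python
# Sketch of how a composite index score could be derived from the
# per-dimension subscores listed above. The site's actual weighting is
# not published here; the equal-weight case is shown only to demonstrate
# that it does NOT reproduce the listed 58.8.

subscores = {
    "adoption": 50,
    "quality": 100,
    "freshness": 98,
    "citations": 75,
    "engagement": 0,
}

def composite(scores, weights):
    """Weighted average of subscores; weights need not sum to 1."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_weight

equal_weights = {k: 1 for k in subscores}
print(round(composite(subscores, equal_weights), 1))  # 64.6
```

Since the equal-weight mean (64.6) exceeds the listed 58.8, the index evidently weights the low-scoring dimensions (adoption, engagement) more heavily than a plain mean would.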
