Google TPU v4
by Google · paid · Last verified 2026-03-17
Google's fourth-generation TPU, used internally to train PaLM, LaMDA, and early Gemini models. Each chip carries 32 GB of HBM2, and pods use an optically circuit-switched inter-chip interconnect (ICI) whose topology can be reconfigured, enabling massive-scale distributed training.
https://cloud.google.com/tpu/docs/v4
Overall rating: C+ (Average)
Adoption: C+ · Quality: A · Freshness: B+ · Citations: B+ · Engagement: F
Specifications
- License: Proprietary
- Pricing: paid
- Capabilities: ai-training, inference, bfloat16-compute, optical-ici
- Integrations: jax, tensorflow, pytorch-xla, gcp
- Use Cases: large-model-training, research, pretraining
- API Available: Yes
- Tags: tpu, data-center, training, google, cloud
- Added: 2026-03-17
- Completeness: 100%
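The bfloat16-compute capability listed above refers to the brain floating-point format TPUs use for matrix math: the same 8-bit exponent as float32, but with the mantissa cut to 7 bits. A minimal Python sketch of the conversion, assuming simple truncation (real TPU hardware rounds to nearest even):

```python
import struct

def to_bfloat16(x: float) -> float:
    """Approximate bfloat16 by keeping only the top 16 bits of the
    float32 encoding (sign + 8-bit exponent + 7-bit mantissa).
    This truncates; TPU hardware uses round-to-nearest-even."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    return struct.unpack(">f", struct.pack(">I", bits & 0xFFFF0000))[0]

print(to_bfloat16(3.14159265))  # 3.140625 — only ~3 significant decimal digits survive
```

The reduced mantissa is why bfloat16 halves memory and interconnect traffic relative to float32 while keeping the full float32 dynamic range, which is what makes it viable for large-model training without loss scaling.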
Index Score: 58.5
- Adoption: 55
- Quality: 85
- Freshness: 70
- Citations: 78
- Engagement: 0