Sina Tech News, Beijing, May 8 morning — According to the US technology site TechCrunch, Google announced today that its second- and third-generation Cloud TPU Pods are now publicly available in beta.
These cloud-based supercomputers can be equipped with more than 1,000 of Google's custom TPU chips. The latest v3 generation is especially powerful: it is liquid-cooled, and a full pod delivers over 100 petaflops of computing power.
According to Google, on raw compute alone such a pod would rank among the world's top five supercomputers, though the comparison should be discounted somewhat because the TPUs operate at lower numerical precision.
Users do not have to rent a full TPU Pod, however; Google allows developers to rent slices of one. Even so, a full pod is powerful enough to train a standard ResNet-50 image-classification model on the ImageNet dataset in as little as two minutes.
The TPU v2 Pod tops out at 512 cores and runs somewhat slower than v3. For example, using 256 TPUs, a v2 Pod trains the ResNet-50 model in 11.3 minutes, while a v3 Pod needs only 7.1 minutes; a single TPU device takes 302 minutes. (Ding Hong)
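To put the reported training times in perspective, a quick arithmetic sketch of the implied speedups (using only the figures cited in the article; the generation of the single-TPU baseline is not specified there):

```python
# Reported ResNet-50 training times from the article, in minutes.
single_tpu_min = 302.0  # one TPU device (generation unspecified in the article)
v2_pod_min = 11.3       # v2 Pod slice using 256 TPUs
v3_pod_min = 7.1        # v3 Pod slice

# Speedup relative to a single TPU device.
v2_speedup = single_tpu_min / v2_pod_min
v3_speedup = single_tpu_min / v3_pod_min

print(f"v2 Pod speedup over one TPU: {v2_speedup:.1f}x")  # ~26.7x
print(f"v3 Pod speedup over one TPU: {v3_speedup:.1f}x")  # ~42.5x
```

So by these numbers, the v3 Pod trains the model roughly 40 times faster than a single device, and about 1.6 times faster than the v2 Pod.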