Google's Cloud TPU is ready for training and inferencing
The world of big-iron computing seems to be laser-focused on machine learning these days. Whether it's graphics-chip makers Nvidia and AMD producing silicon dedicated to machine learning, or Microsoft's Bing search engine using custom FPGAs to accelerate repetitive mathematical operations, every major technology company seems to have an AI accelerator strategy.
Google isn't resting on its laurels, either. At its I/O conference today, the company introduced a second-generation version of the Tensor Processing Unit it unveiled a year ago, called the Cloud TPU. Google can combine multiple Cloud TPUs into four-chip clusters, and the cluster pictured below offers up to a claimed 180 TFLOPS of floating-point capability ...
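For a rough sense of per-chip scale, here's a minimal back-of-envelope sketch in Python. It assumes (the article doesn't say) that the claimed 180 TFLOPS figure is split evenly across a cluster's four chips.

    # Back-of-envelope math for the four-chip Cloud TPU cluster described above.
    # Assumption (not stated in the article): the claimed 180 TFLOPS peak is
    # spread evenly across the cluster's four chips.

    CLUSTER_TFLOPS = 180.0    # claimed peak floating-point throughput per cluster
    CHIPS_PER_CLUSTER = 4

    per_chip_tflops = CLUSTER_TFLOPS / CHIPS_PER_CLUSTER
    print(f"Assumed per-chip peak: {per_chip_tflops:.0f} TFLOPS")  # prints 45 TFLOPS

That works out to roughly 45 TFLOPS per chip under the even-split assumption, a figure the article itself doesn't break out.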