In a post on the Google Cloud Platform Blog yesterday, the Alphabet company announced that it has built its own integrated circuit (IC) designed from the ground up with only one application in mind: machine learning. Developed in secret, the Tensor Processing Unit, or TPU for short, has already been deployed internally at Google for over a year, accelerating the computation behind some of its most popular products, including Search and Maps.
Application-specific integrated circuits (ASICs) are nothing new. From chips designed in the 1980s for handling computer graphics to chips designed more recently for Bitcoin mining, ASICs have evolved to the point where many are entire “systems on a chip” containing microprocessors, ROM, EEPROM, and flash memory.
While the custom design and manufacturing come at a steep cost, the benefits of having hardware tailored to the software that runs on it are often more than worth it, as Google can attest. Using fewer transistors per calculation allows the TPU to perform many more operations per second than a general-purpose IC could. As a result, Google teams are seeing more than an order of magnitude better performance per watt.
Staggeringly, Google says this about the TPU and the performance gain it provides: “This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).”
You can read Google’s full blog post here: “Google supercharges machine learning tasks with TPU custom chip”