Google is making its own Arm-based CPU to support its AI work in data centers and introducing a more powerful version of its Tensor Processing Unit (TPU) AI chips. Google’s new Arm-based CPU, dubbed Axion, will be used to support Google’s AI workloads before it rolls out to business customers of Google Cloud “later this year.”

The Axion chips are already powering YouTube ads, the Google Earth Engine, and other Google services. “We’re making it easy for customers to bring their existing workloads to Arm,” says Mark Lohmeyer, Google Cloud’s vice president and general manager of compute and machine learning infrastructure, in a statement to Reuters. “Axion is built on open foundations but customers using Arm anywhere can easily adopt Axion without re-architecting or re-writing their apps.”

Google says customers will be able to use its Axion CPU in cloud services like Google Compute Engine, Google Kubernetes Engine, Dataproc, Dataflow, Cloud Batch, and more. Reuters reports that the Axion Arm-based CPU will also offer 30 percent better performance than “general-purpose Arm chips” and 50 percent more performance than Intel’s existing processors.
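Google hasn’t published Axion machine-type names yet, but on Compute Engine the experience should look much like provisioning any Arm-based VM today. The sketch below uses the google-cloud-compute Python client with placeholder project, zone, and machine-type values (the “axion-standard-4” name is hypothetical, not an announced SKU); it illustrates the existing Compute Engine API, not Google’s Axion tooling.

```python
# Minimal sketch: creating an Arm-based Compute Engine VM with the
# google-cloud-compute Python client. The machine type below is a
# placeholder -- Google has not announced Axion machine-type names.
from google.cloud import compute_v1


def create_arm_vm(project: str, zone: str, name: str,
                  machine_type: str = "axion-standard-4"):
    client = compute_v1.InstancesClient()

    # Boot from a public Arm64 image family (Debian 12 arm64).
    boot_disk = compute_v1.AttachedDisk(
        boot=True,
        auto_delete=True,
        initialize_params=compute_v1.AttachedDiskInitializeParams(
            source_image="projects/debian-cloud/global/images/family/debian-12-arm64",
            disk_size_gb=10,
        ),
    )

    instance = compute_v1.Instance(
        name=name,
        machine_type=f"zones/{zone}/machineTypes/{machine_type}",
        disks=[boot_disk],
        network_interfaces=[
            compute_v1.NetworkInterface(network="global/networks/default")
        ],
    )

    # Returns a long-running operation the caller can poll for completion.
    return client.insert(project=project, zone=zone, instance_resource=instance)
```

If Lohmeyer’s claim holds, a workload that already runs on Arm64 images elsewhere would move to Axion by changing little more than that one machine_type string.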

Google’s Axion Arm-based CPU. Image: Google

Google is also updating its TPU AI chips, which are used as alternatives to Nvidia’s GPUs for AI acceleration tasks. “TPU v5p is a next-generation accelerator that is purpose-built to train some of the largest and most demanding generative AI models,” says Lohmeyer. A single TPU v5p pod contains 8,960 chips, more than double the 4,096 chips in a TPU v4 pod.

Google’s announcement of an Arm-based CPU comes months after Microsoft revealed its own custom silicon chips designed for its cloud infrastructure. Microsoft has built its own custom AI chip to train large language models and a custom Arm-based CPU for cloud and AI workloads. Amazon has also offered Arm-based servers for years through its own custom Graviton CPUs, with the latest workloads able to run on Graviton3 servers on AWS.

Google won’t be selling these chips to customers, instead making them available for cloud services that businesses can rent and use. “Becoming a great hardware company is very different from becoming a great cloud company or a great organizer of the world’s information,” says Amin Vahdat, the executive in charge of Google’s in-house chip operations, in a statement to The Wall Street Journal.

Google, like Microsoft and Amazon before it, can now reduce its reliance on partners like Intel and Nvidia, while also competing with them on custom chips to power AI and cloud workloads.
