Google Cloud on Tuesday joined AWS and Azure in announcing its first custom Arm processor, called Axion. Based on Arm’s Neoverse V2 designs, Google says its Axion instances offer 30% better performance than other Arm-based instances from competitors like AWS and Microsoft, and up to 50% better performance and 60% better energy efficiency than comparable x86-based instances.
Google provided no documentation to support these claims and, like us, you’d probably want to know more about these chips. We asked several questions, but Google politely declined to provide additional information: no availability dates, no pricing, no further technical details. As for those “benchmark” results? The company wouldn’t even say which x86 chip it compared Axion to.
“Technical documentation, including benchmark and architecture details, will be available later this year,” said Google spokeswoman Amanda Lam.
Maybe the chips aren’t quite ready yet? It certainly took Google a while to bring Arm chips to its cloud, especially when you consider that the company has long built its own in-house TPU AI chips and, more recently, custom Arm-based mobile chips for its Pixel phones. AWS, by comparison, launched its Graviton chips back in 2018.
To be fair, though, Microsoft only announced its own Cobalt Arm chips late last year, and those chips aren’t yet available to customers either. Microsoft Azure has, however, offered instances based on Ampere’s Arm server chips since 2022.
In a press briefing ahead of Tuesday’s announcement, Google emphasized that since Axion is built on an open foundation, Google Cloud customers will be able to bring their existing Arm workloads to Google Cloud without any modifications. This is really no surprise. Anything else would be a very foolish move on Google Cloud’s part.
“We recently contributed to the SystemReady Virtual Environment, which is Arm’s hardware and firmware interoperability standard that ensures common operating systems and software packages can run seamlessly on Arm-based systems,” explained Mark Lohmeyer, Google Cloud VP of Computing Infrastructure and AI/ML. “Through this partnership, we have access to a broad ecosystem of cloud customers who have already deployed Arm-based workloads across hundreds of ISVs and open source projects.”
More later this year.