Amazon just scored a major coup with Meta, thanks, once again, to Amazon’s homegrown chips. Meta signed an agreement to use millions of AWS Graviton chips to power its growing AI needs, Amazon announced on Friday.
Note that AWS Graviton is an ARM-based CPU (a central processing unit, the chip that handles general computing tasks), not a GPU (a graphics processing unit).
While GPUs remain the chip of choice for training large models, once those models are trained, the AI agents built on top of them cause a shift in the type of chip needed. Agents create compute-intensive workloads such as real-time reasoning, code writing, searching, and the coordination involved in managing agents through multi-step tasks. The latest version of AWS’s Graviton was designed specifically to handle computing needs related to artificial intelligence, the company says.
This deal returns more of Meta’s cash to AWS instead of competitors like Google Cloud. Last August, Meta signed a six-year, $10 billion deal with Google Cloud, although Meta, until then, was primarily an AWS customer that also used Microsoft Azure.
We couldn’t help but notice that AWS timed the announcement of this deal just as the Google Cloud Next conference was wrapping up, a virtual smile at its cloud rival. Google, of course, also makes its own custom AI chips and announced new versions of them at the show.
It’s true that Amazon also makes its own AI chip, Trainium, which, despite its name, is used for both training and inference, the stage that happens after a model is trained, when it actively processes prompts.
But Anthropic was already on board, with a deal announced earlier this month that secured much of that capacity for years to come. The Claude maker agreed to spend $100 billion over 10 years to run its workloads on AWS, with a particular focus on Trainium, while Amazon agreed to invest another $5 billion (bringing its total investment to $13 billion) in Anthropic in return.
Ultimately, the Meta deal allows Amazon to showcase a massive AI client as a proof point for its homegrown CPUs. These are chips that compete with Nvidia’s new Vera CPU, which is also ARM-based and designed to handle AI agent workloads. The difference, of course, is that Nvidia sells its AI chips and systems to enterprises and cloud providers (including AWS), while AWS only sells access to its chips through its cloud service.
Earlier this month, Amazon CEO Andy Jassy took aim at Nvidia and Intel in his annual shareholder letter, saying businesses want better price-performance ratios for artificial intelligence and that he intends to win deals on that basis. It also means the pressure couldn’t be greater on Amazon’s internal chip-making team to deliver, a team we visited last month on an exclusive tour of its lab.
