Amazon is cooking up its own AI chips to cut costs and break free from Nvidia’s grip. Its brand-new chip, Trainium 2, drops next month, with big names such as Anthropic and Databricks already hopping on to test it. Amazon claims its Inferentia chips cut AI model running costs by 40%, a saving that matters most at massive scale. The company is pouring billions into tech, but Nvidia remains the top dog: its AI data center chip sales alone rival the revenue of Amazon’s entire AWS division.
Amazon is busy developing new artificial intelligence chips, an effort meant to boost returns on its semiconductor investments and reduce its overall dependency on Nvidia. The development is led by Annapurna Labs, which Amazon acquired in 2015 for $350 million.
Amazon says this chip development aims to improve data center efficiency and offer customers tailored options in the cloud AI market. It plans to do so by optimizing chips for specific tasks, unlike Nvidia, which focuses more on general-purpose hardware. Amazon claims its Inferentia AI chips are 40% cheaper to run when generating responses from general AI models. That may not sound like much for small workloads, but it is huge for budgets in the millions. The company is also spending heavily: $75 billion on tech in 2024, a lot more than its 2023 budget, and the figure will probably climb even further next year. According to an Annapurna official, maintaining smooth operations requires developing entire systems from the ground up rather than merely designing chips.
Despite numerous attempts, Amazon has yet to match Nvidia’s dominance in AI chips. According to reports, Nvidia made $26.3 billion in sales from AI data center chips in its second fiscal quarter of 2024 alone, roughly equal to the entire revenue of Amazon’s AWS division during the same period.