The $400 billion issue with AI: Are chips aging too quickly?
The tech industry has spent nearly $400 billion this year on AI chips and data centres, but analysts are increasingly questioning whether this massive investment is sustainable.
Experts say companies are being too optimistic about how long these expensive AI chips will remain useful. Cloud companies have typically assumed their chips and servers would last around six years. But according to Mihir Kshirsagar of Princeton University, physical wear and tear and the rapid pace of technological upgrades make that estimate unrealistic.
A major issue is the pace of new chip releases. Nvidia, the industry leader, launched its Blackwell chip less than a year ago and has already announced its next-generation Rubin chip for 2026, billed as 7.5 times faster. As a result, analysts say older chips lose 85–90% of their value within three to four years.
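To put that resale figure in perspective, the short sketch below converts it into an implied yearly decline, assuming (purely for illustration) that the value drops by a constant percentage each year.

```python
# Rough illustration: what "85-90% value loss within three to four years" implies
# per year, assuming a constant percentage decline each year (an assumption made
# for illustration, not a figure quoted by analysts).

def implied_annual_decline(total_loss: float, years: int) -> float:
    """Return the constant yearly decline that produces the given total loss."""
    remaining = 1.0 - total_loss              # fraction of value left after `years`
    return 1.0 - remaining ** (1.0 / years)   # yearly decline that compounds to it

for total_loss, years in [(0.85, 3), (0.90, 4)]:
    rate = implied_annual_decline(total_loss, years)
    print(f"{total_loss:.0%} loss over {years} years ≈ {rate:.0%} decline per year")
```

On those assumptions, the hardware sheds roughly 44–47% of its remaining value every year.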
AI chips also break down more often because they run extremely hot. A Meta study of the hardware used to train its Llama AI model found an annual failure rate of about 9%.
Investor Michael Burry, of The Big Short fame, has called the way companies account for these chips “fraud”, warning of a potential AI bubble.
If companies are forced to shorten the depreciation period of AI chips to just 2–3 years, their reported profits could drop sharply, because the same hardware cost would have to be written off over fewer years, raising the annual depreciation charge. Analysts worry that companies heavily dependent on AI, such as Oracle and CoreWeave, face the biggest risks because they need to keep borrowing money to build data centres. Some loans are even backed by the chips themselves.
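As a rough illustration of that accounting effect, the sketch below compares straight-line depreciation of a hypothetical $10 billion chip fleet over six years versus three. The dollar figure and the straight-line method are assumptions chosen for clarity, not numbers from the companies involved.

```python
# Rough illustration: shortening the assumed useful life of AI hardware raises
# the annual depreciation charge, which lowers reported profit. The $10B fleet
# cost and the straight-line method are illustrative assumptions only.

FLEET_COST = 10_000_000_000  # hypothetical spend on AI chips, in dollars

def annual_depreciation(cost: float, useful_life_years: int) -> float:
    """Straight-line depreciation: spread the cost evenly over the useful life."""
    return cost / useful_life_years

six_year_charge = annual_depreciation(FLEET_COST, 6)    # ~$1.67B per year
three_year_charge = annual_depreciation(FLEET_COST, 3)  # ~$3.33B per year

print(f"6-year schedule: ${six_year_charge / 1e9:.2f}B per year")
print(f"3-year schedule: ${three_year_charge / 1e9:.2f}B per year")
print(f"Extra yearly expense hitting profits: ${(three_year_charge - six_year_charge) / 1e9:.2f}B")
```

On those assumptions, halving the depreciation window doubles the yearly expense, which falls straight through to operating profit.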
Big companies like Amazon, Google and Microsoft are expected to weather the pressure better because their businesses are more diversified.
To reduce losses, some firms are trying to resell older chips or use them for less demanding tasks.