Then, as those commitments and the scale of funding required to meet them ratcheted up, they began raising debt, some of it off balance sheet, to underwrite the trillions of dollars of spending involved.
That’s a pretty fragile industry structure, with Nvidia and OpenAI at its centre.
On Monday, OpenAI’s chief executive – in a memo to staff first reported by the technology-focused online publication The Information – declared a “code red,” accelerating efforts to improve the quality of its ChatGPT chatbot while deferring development of a range of products – advertising, shopping, health and other apps – that it had hoped would generate more revenue from ChatGPT.
That sudden note of urgency came after it became evident that OpenAI is facing serious competition for the first time in the three years since it launched ChatGPT and started the frenzy around AI.
As it happens, that competition also threatens to erode Nvidia’s dominance and the massive margins – a gross margin of around 70 per cent – that dominance has conferred.
The challenger? Alphabet’s Google.
Late last month, The Information revealed that Google was discussing plans to sell its own internally developed AI chips to Meta in a multibillion-dollar deal, weakening Nvidia’s hold on the AI hardware sector. Google had also just released Gemini 3, the latest version of its chatbot, which was widely considered to outperform OpenAI’s GPT-5.
Thus Google has emerged as a threat to both Nvidia and Nvidia’s biggest customers. Google is also one of the four hyperscalers thought to account, between them, for more than 60 per cent of Nvidia’s revenue, so the threat posed by Google’s development of its own chips has more than one layer.
It’s not just Google. Amazon – the world’s largest provider of cloud services – has only just started installing its latest chip, Trainium3, in data centres.
Neither Google’s TPUs (Tensor Processing Units) nor Amazon’s chips are as flexible or versatile as Nvidia’s GPUs (Graphics Processing Units), which is why Nvidia was able to grab dominance of the training of AI’s large language models.
The challengers’ strengths lie in inference – the stage at which an AI model actually deploys the knowledge that has been poured into it – and that is where the market for AI is shifting.
While they can’t do as much as Nvidia’s chips, what they can do, they do well – and faster, more cheaply and with less energy consumed than Nvidia’s chips. It’s also the case – as the negotiations with Meta demonstrate – that there is an appetite within the sector for competing sources of supply to dilute the risks and costs of dependence on Nvidia.
Nvidia was so unsettled by the sudden emergence of Google as a threat that it did something out of character and posted a defence of its position on X, saying it was delighted by Google’s success, but “Nvidia is a generation ahead of the industry – it’s the only platform that runs every AI model and does it everywhere computing is done.”
Google has serious firepower to fund its ambitions. Now it is firmly in the contest to be one of what’s likely to be a very small number of very large players that emerge as the dominant forces in AI.
It has no debt; indeed, it has net cash of around $US100 billion and around $US150 billion a year of cash flow to fund its ambitions. OpenAI, by contrast, has a revenue run-rate of only about $US20 billion, hence its reliance on equity raisings, vendor financing and debt.
Google also has its “stack,” or a vertically integrated suite of products – platforms, infrastructure, hardware and software – that can be deployed to boost the take-up of Gemini.
Gemini 2 was launched a year ago. By May this year, it had about 400 million monthly active users. Today it has more than 650 million. ChatGPT has about 800 million weekly active users – it remains dominant, but the gap is closing.
Google’s sudden acceleration into the AI space, coupled with the efforts of all hyperscalers to diversify their supplier base, develop their own chips and reduce their reliance on Nvidia, doesn’t mean that Nvidia and OpenAI can’t also survive and still be there when the sector has been rationalised, as it inevitably will be.
It could, however, make their pathway through the blizzard of outsized investments they have locked themselves into more difficult, particularly for OpenAI, by fragmenting their potential customer bases, reducing their potential revenues and increasing their costs of capital.
It’s most challenging for OpenAI, which will need to invest even more to maintain its sectoral leadership while being unable to pursue as urgently the new sources of revenue it had hoped would help shore up its finances. Nvidia will likely remain the dominant chip provider to the sector, but competition will put some pressure on its margins and growth rate.
It’s potentially a game-changing moment for AI, one that some have dubbed a “DeepSeek moment” – a reference to January’s unveiling of a Chinese open-source chatbot, built at a fraction of ChatGPT’s cost, that shocked the sector.