Gaming: OpenAI Reportedly Isn't Happy With Nvidia's GPUs While Nvidia's...
Are the two biggest beasts in AI falling out of love?
OpenAI reportedly isn't happy with the performance of Nvidia's GPUs. Meanwhile, Nvidia is having second thoughts about pumping $100 billion into OpenAI. These are the latest rumours around the two biggest players in AI. So, could their unholy alliance be faltering?
Last week, the Wall Street Journal claimed that Nvidia is rethinking its previously announced plans to invest $100 billion in OpenAI over concerns regarding its ability to compete with the likes of Google and Anthropic.
Then yesterday, Reuters posted a story detailing OpenAI's reported dissatisfaction with Nvidia's GPUs, specifically for the task of running inference on AI models. If the latter story looks a lot like somebody at OpenAI hitting back at the original Wall Street Journal claims, the two narratives combined feel like just the sort of tit-for-tat off-the-record briefing that occurs when an alliance is beginning to falter.
For now, none of this is official. It's all rumour. However, it is true that Nvidia's intention to invest $100 billion in OpenAI was announced in September and has yet to be finalised.
The Wall Street Journal claims that Nvidia CEO Jensen Huang has "privately criticized what he has described as a lack of discipline in OpenAI’s business approach and expressed concern about the competition it faces from the likes of Google and Anthropic."
In public, Huang has defended Nvidia's intentions when it comes to investments in OpenAI, but has stopped short of explicitly reconfirming the $100 billion deal. “We will invest a great deal of money, probably the largest investment we’ve ever made,” he said. But he also retorted, "no, no, nothing like that," when queried whether that investment would top $100 billion.
As for OpenAI, Reuters says that it is "unsatisfied with some of Nvidia’s latest artificial intelligence chips, and it has sought alternatives since last year." It's claimed that OpenAI is shifting its emphasis away from training AI in favour of inference, that is, running AI models as services for customers.
It's for that latter task, inference, that OpenAI is said to have found Nvidia's GPUs wanting. "Seven sources said that OpenAI is not satisfied with the speed at which Nvidia’s hardware can spit out answers to ChatGPT users for specific types of problems such as software development and AI communicating with other …"
Source: PC Gamer