OpenAI CEO Sam Altman and Nvidia CEO Jensen Huang seem to be going all out publicly to prove that all is well between the two companies. However, report after report says that all is not well between the two AI giants. According to a report in Reuters, OpenAI is dissatisfied with some of Nvidia’s latest artificial intelligence (AI) chips and has been seeking alternatives since last year. The report cites as many as eight sources familiar with the matter, and the friction could complicate the relationship between the two companies.

Seven of those eight sources reportedly said that OpenAI is not satisfied with the speed at which Nvidia’s hardware can return answers to ChatGPT users for specific types of problems, such as software development and AI communicating with other software. The company wants new hardware that would eventually provide about 10% of OpenAI’s inference computing needs, one of the sources told Reuters. This comes amid what analysts see as a shift in the ChatGPT-maker’s strategy: an increasing emphasis on chips used to perform specific elements of AI inference, the process in which an AI model responds to customer queries and requests.
What may be causing tension between OpenAI and Nvidia
As the Reuters report says, “Nvidia’s graphics processing chips are well-suited for massive data crunching necessary to train large AI models like ChatGPT that have underpinned the explosive growth of AI globally to date. But AI advancements increasingly focus on using trained models for inference and reasoning, which could be a new, bigger stage of AI, inspiring OpenAI’s efforts. The ChatGPT-maker’s search for GPU alternatives since last year focused on companies building chips with large amounts of memory embedded in the same piece of silicon as the rest of the chip, called SRAM. Squishing as much costly SRAM as possible onto each chip can offer speed advantages for chatbots and other AI systems as they crunch requests from millions of users.”

Last year, OpenAI struck deals with AMD and others for GPUs built to rival Nvidia’s. In fact, sources say OpenAI’s deal with AMD did not go down well with partner Nvidia. AI inference has become the new front in the competition. The decision by OpenAI and others to seek out alternatives in the inference chip market marks a significant test of Nvidia’s AI dominance and comes at a time when the two companies are in investment talks.

In September, Nvidia said it intended to pour as much as $100 billion into OpenAI as part of a deal that gave the chipmaker a stake in the startup and gave OpenAI the cash it needed to buy the advanced chips. In a joint announcement unveiling the September deal, Sam Altman, OpenAI President Greg Brockman and Jensen Huang called it “the largest computing project in history.” The deal was expected to close within weeks; however, negotiations have dragged on for months. New reports claim that Nvidia is now planning to halve its investment.

According to a recent Wall Street Journal report, Nvidia CEO Jensen Huang has privately criticized OpenAI for what he described as a lack of discipline in its business approach and expressed concern about the competition it faces from the likes of Google and Anthropic. In November, Nvidia said it was committing to invest up to $10 billion in Anthropic. In a filing the same month, Nvidia reportedly said there was no assurance that it would “enter into definitive agreements with respect to the OpenAI opportunity or other potential investments, or that any investment will be completed on expected terms, if at all.”
Sam Altman and Jensen Huang deny any rift, but ...
In the past few days, both OpenAI CEO Sam Altman and Nvidia CEO Jensen Huang have strongly denied reports of any tension between the two companies. Last week, Huang brushed off a report of tension with OpenAI, saying the idea was “nonsense” and that Nvidia planned a huge investment in OpenAI. A spokesperson for OpenAI said in a separate statement that the company relies on Nvidia to power the vast majority of its inference fleet and that Nvidia delivers the best performance per dollar for inference. Altman, too, wrote in a post on Twitter that Nvidia makes “the best AI chips in the world” and that OpenAI hoped to remain a “gigantic customer for a very long time”.

