According to a source close to the matter, OpenAI has started renting Google's AI chips to power its products, including ChatGPT. The move marks a significant shift for OpenAI, one of the largest purchasers of Nvidia's GPUs, as it expands its computing capacity to meet growing demand. Collaboration between OpenAI and Google, two prominent rivals in the AI sector, is a surprising development that underscores how fluid the industry's competitive landscape has become.
OpenAI's decision to use Google's AI chips, known as tensor processing units (TPUs), signals a move away from relying solely on Nvidia hardware and the data centers of its backer Microsoft. The shift could help establish TPUs as a more cost-effective alternative to Nvidia's GPUs, particularly for inference computing, the stage at which a trained AI model generates predictions or responses.
OpenAI hopes that using Google's TPUs, accessible through Google Cloud, will help reduce the cost of inference. However, it's important to note that Google, a direct competitor in the AI race, is not providing its most powerful TPUs to OpenAI. For Google, this deal represents an opportunity to expand the external availability of its in-house TPUs, which were previously reserved for internal use.
This strategy has already attracted customers including Apple, Anthropic, and Safe Superintelligence. The deal underscores Google's push to monetize its AI technology, from hardware to software, in order to accelerate the growth of its cloud business.
The arrangement shows that even direct competitors will work together to meet soaring demand for computing power. It may also inject more competition into the AI chip market, giving buyers more choices and potentially driving down costs.