Amazon has signed a $38B cloud computing deal with OpenAI that will see ChatGPT and other advanced AI applications run on AWS. The agreement marks one of OpenAI's most significant moves away from Microsoft, which until recently was its exclusive cloud provider. OpenAI will now have direct access to Amazon's large Nvidia GPU clusters in U.S. data centers. The deal strengthens Amazon's position in the cloud market as companies race to secure computing power for frontier AI applications.
OpenAI Expands Beyond Microsoft With AWS Infrastructure
OpenAI said the deal spans multiple years, with the company purchasing $38B of cloud capacity from Amazon Web Services. As part of the arrangement, OpenAI can reportedly begin using hundreds of thousands of Nvidia graphics processing units (GPUs) on AWS infrastructure immediately. According to executives familiar with the deal, the new capacity will let OpenAI scale its systems through 2026 and beyond as demand for generative AI continues to rise.
Until earlier this year, Microsoft was OpenAI's sole cloud partner, having invested roughly $13 billion in the company. Microsoft's right of first refusal on OpenAI's new cloud requests expired last week, giving OpenAI the freedom to work with other cloud providers. OpenAI had already struck partnerships with Oracle and Google, but AWS is the largest cloud provider by global market share.

In OpenAI's announcement, CEO Sam Altman said that scaling advanced AI requires massive, reliable computing power. He added that the AWS partnership strengthens the broader compute ecosystem and supports the goal of making next-generation AI services more widely available.
AWS Gains a High-Profile AI Customer Amid Intensifying Cloud Competition
Amazon shares rose about 4 percent on Monday after the announcement, closing at a record high. The company has drawn growing investor attention for its AI-focused cloud strategy. In its latest earnings report, AWS posted year-over-year revenue growth of more than 20 percent. Still, Microsoft and Google have been growing their cloud businesses even faster, leaving AWS with ground to defend.
Signing OpenAI as a major cloud customer lends credibility to Amazon's long-term AI infrastructure plans. Dave Brown, vice president of compute and machine learning services at AWS, said the initial capacity will come from existing AWS data centers, though Amazon will also build additional infrastructure to support OpenAI's workloads.
Brown described the capacity as separate from that serving AWS's broader customer base and said OpenAI is already using existing GPU clusters. Matt Garman, CEO of AWS, said the breadth and immediate availability of optimized compute show that AWS can handle heavy AI workloads, adding that the platform will scale quickly and reliably as customers turn to AI.

Nvidia Hardware Anchors the Deal, With Room to Expand Later
The initial focus of the deal is Nvidia hardware. OpenAI will use Nvidia's Blackwell-generation GPUs for both training and inference, the chips behind ChatGPT's real-time conversational features and the development of new frontier AI models. AWS will handle the operational availability, thermal management, and scaling of the GPU clusters across several data center regions.
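The announcement does not describe how OpenAI will actually provision this capacity. Purely as an illustration of what renting Nvidia GPU capacity on AWS looks like for an ordinary customer, the minimal boto3 sketch below launches a single GPU instance; the region, AMI ID, and instance type are placeholder assumptions, not details of the OpenAI arrangement.

```python
# Illustrative sketch only: how a typical AWS customer might launch one
# Nvidia GPU instance with the boto3 SDK. The region, AMI ID, and
# instance type are placeholder assumptions, not details of this deal.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder deep learning AMI ID
    InstanceType="p5.48xlarge",       # Nvidia GPU-backed instance type
    MinCount=1,
    MaxCount=1,
)

# Print the ID of the newly launched instance
print(response["Instances"][0]["InstanceId"])
```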
Executives said the agreement also leaves room for expansion beyond Nvidia hardware. Amazon has been designing its own AI processors, including the Trainium chip, which is already deployed at an $11 billion AWS data center campus in Indiana built for Anthropic. While no details were shared on how OpenAI might use Amazon's custom silicon, AWS leaders emphasized their goal of giving customers a choice of hardware.






