OpenAI is in discussions with Amazon about a potential investment and a separate agreement to use Amazon’s artificial intelligence chips, according to a CNBC report published Tuesday. The details are still fluid, but one person familiar with the talks said the investment could exceed $10 billion.
If the talks turn into a signed deal, it would be less about branding and more about plumbing. The generative AI boom is colliding with physical limits: chip supply, data center buildouts, and the power required to keep models training and serving at scale.
The timing is not accidental. OpenAI completed a restructuring in October and publicly outlined updated terms in its partnership with Microsoft, a move that gives OpenAI more freedom to raise capital and partner across the broader AI ecosystem. Crucially, OpenAI says Microsoft no longer has a right of first refusal to be its compute provider, meaning OpenAI can source large chunks of infrastructure elsewhere without the old gatekeeping.
That matters because OpenAI’s costs are now infrastructure-shaped. In October, OpenAI finalized a secondary share sale totaling $6.6 billion that valued the company at $500 billion, a valuation that now sets the baseline for any new mega-round rather than a distant milestone.
Amazon’s interest is easy to explain. AWS is already the market leader in cloud infrastructure, but Microsoft’s early OpenAI alignment helped Azure own the enterprise AI storyline. Bringing OpenAI closer would give AWS a marquee workload and, just as importantly, a forcing function for customers to take Amazon’s silicon seriously.
A chip agreement is the tell. AWS has been designing custom AI chips for years: it announced Inferentia in 2018 and introduced the latest generation of Trainium chips earlier this month. Getting OpenAI to adopt Amazon silicon would be a strategic win in the hardware layer that underpins cloud AI services.
OpenAI and Amazon already have a major commercial relationship. Last month, OpenAI signed a deal to buy $38 billion worth of capacity from AWS, its first contract of that scale with the world’s largest cloud provider. If Amazon adds an investment on top, it would fit the emerging pattern of “circular” AI deals, where the infrastructure provider becomes both financier and supplier.
The competitive context is equally sharp. Amazon has invested at least $8 billion in Anthropic, an OpenAI rival, while Microsoft and Nvidia have also signaled major commitments to that same company. The market is converging on a simple reality: the biggest AI outcomes are increasingly decided by who can bankroll compute, not just who can demo the best model.
For readers trying to understand what is actually shifting here, it helps to think in cloud primitives rather than product names. If you want a plain-English refresher on the platforms at stake, start with Google Cloud vs AWS vs Azure, then zoom in on what AWS is and what Azure is.
One final nuance is worth keeping in mind. An AWS partnership does not automatically mean Amazon gets to resell or freely package OpenAI’s most advanced models across its entire stack. Commercial rights, product carve-outs, and existing commitments can restrict distribution even when compute and chips diversify behind the scenes.
If this deal materializes, the headline will look like another Big Tech alliance. The deeper story is that AI is maturing into an infrastructure industry, and the leverage is shifting toward whoever owns the chips, the clusters, and the power contracts that make the models possible.