A new one-hundred billion dollar partnership between Nvidia and OpenAI, announced on Monday, represents the latest mega-deal reshaping the AI infrastructure landscape. The agreement involves non-voting shares tied to massive chip purchases and provides enough computing power for more than five million United States households. This deal deepens the relationship between two of the most powerful players in artificial intelligence.
Meanwhile, Google Cloud is placing a different bet entirely. While the industry’s biggest players cement ever-tighter partnerships, Google Cloud is focused on capturing the next generation of AI companies before they become too large to court.
Francis deSouza, the Chief Operating Officer of Google Cloud, has seen the AI revolution from multiple vantage points. As the former CEO of genomics giant Illumina, he watched machine learning transform drug discovery. As a co-founder of a two-year-old AI alignment startup called Synth Labs, he has grappled with the safety challenges of increasingly powerful models. Now, having joined the leadership team at Google Cloud in January, he is orchestrating a massive wager on AI’s second wave.
deSouza likes to tell this story in numbers. He notes that nine out of the top ten AI labs use Google’s infrastructure. He also says that nearly all generative AI unicorns run on Google Cloud, that sixty percent of all generative AI startups worldwide have chosen Google as their cloud provider, and that the company has lined up fifty-eight billion dollars in new revenue commitments over the next two years. This figure represents more than double its current annual run rate.
When asked what percentage of Google Cloud’s revenue comes from AI companies, he demurs, saying only that AI is resetting the cloud market and that Google Cloud is leading the way, especially with startups.
The Nvidia-OpenAI deal exemplifies the scale of consolidation sweeping AI infrastructure. Microsoft’s original one billion dollar investment in OpenAI has grown to nearly fourteen billion dollars. Amazon followed with an eight billion dollar investment in Anthropic, securing deep hardware customizations that essentially tailor AI training to work better with Amazon’s infrastructure. Oracle has emerged as a surprise winner, landing a thirty billion dollar cloud deal with OpenAI and then securing a three hundred billion dollar five-year commitment starting in 2027.
Even Meta, despite building its own infrastructure, signed a ten billion dollar deal with Google Cloud while planning six hundred billion dollars in United States infrastructure spending through 2028. The Trump administration’s five hundred billion dollar Stargate project, involving SoftBank, OpenAI, and Oracle, adds another layer to these interlocking partnerships.
These gigantic deals might seem threatening to Google, given the partnerships that companies like OpenAI and Nvidia appear to be cementing elsewhere. Indeed, it looks a lot like Google is being cut out of some frenzied dealmaking.
But the corporate behemoth is not sitting on its hands. Instead, Google Cloud is signing smaller companies such as Lovable and Windsurf, what deSouza calls the next generation of companies coming up, as primary computing partners, without major upfront investments.
The approach reflects both opportunity and necessity. In a market where companies can go from being a startup to being a multi-billion dollar company in a very short period of time, capturing future unicorns before they mature could prove more valuable than fighting over today’s giants.
The strategy extends beyond simple customer acquisition. Google offers AI startups three hundred and fifty thousand dollars in cloud credits, access to its technical teams, and go-to-market support through its marketplace. Google Cloud also provides what deSouza describes as a no-compromise AI stack, from chips to models to applications, with an open ethos that gives customers choice at every layer.
Companies appreciate access to the AI stack and to the teams who understand where the technologies are going. They also value access to enterprise-grade, Google-class infrastructure.
Google’s infrastructure play got even more ambitious recently with reporting that revealed the company’s behind-the-scenes maneuvering to expand its custom AI chip business. Google has struck deals to place its tensor processing units in other cloud providers’ data centers for the first time, including an agreement with London-based Fluidstack that includes up to three point two billion dollars in financial backing for a New York facility.
Competing directly with AI companies while simultaneously providing them infrastructure requires finesse. Google Cloud provides its chips to OpenAI and hosts Anthropic’s Claude model through its Vertex AI platform, even as its own Gemini models compete head-to-head with both. Google Cloud’s parent company, Alphabet, also owns a fourteen percent stake in Anthropic, according to court documents. When asked directly about the financial relationship with Anthropic, deSouza called it a multi-layered partnership and redirected to Google Cloud’s model marketplace, noting that customers can access various foundation models.
If Google is trying to be a neutral platform while advancing its own agenda, it has had plenty of practice. The approach has roots in Google’s open-source contributions, from Kubernetes to the foundational “Attention Is All You Need” paper that introduced the transformer architecture underlying most modern AI. More recently, Google published an open protocol called Agent2Agent for inter-agent communication, an attempt to demonstrate its continued commitment to openness even in competitive areas.
deSouza acknowledges that the company has made the explicit choice over the years to be open at every layer of the stack, knowing that companies can take the technology and use it to build a competitor. He says this has been happening for decades, and that the company is comfortable with it.
Google Cloud’s courtship of startups comes at a particularly interesting moment. Just this month, a federal judge delivered a nuanced ruling in the government’s five-year-old search monopoly case, attempting to curb Google’s dominance without hampering its AI ambitions.
While Google avoided the Justice Department’s most severe proposed penalties, including the forced divestment of its Chrome browser, the ruling underscored regulatory concerns about the company leveraging its search monopoly to dominate AI. Critics are understandably worried that Google’s vast trove of search data provides an unfair advantage and that the company could deploy the same monopolistic tactics that secured its search dominance.
In conversation, deSouza is focused on more positive outcomes. He outlines a vision where Google Cloud helps power research into Alzheimer’s, Parkinson’s, and climate technologies, stating they have an opportunity to fundamentally understand major diseases that are not well understood today. He says they want to work very hard to pioneer the technologies that will enable that work.
Critics may not be easily assuaged. Still, by positioning itself as an open platform that empowers the next generation of AI companies, Google Cloud can show regulators that it fosters competition rather than stifles it. The strategy also forges relationships with startups that could bolster Google’s case if regulators ramp up pressure.

