A new one hundred billion dollar partnership between Nvidia and OpenAI, announced on Monday, represents the latest mega-deal reshaping the AI infrastructure landscape. The agreement involves non-voting shares tied to massive chip purchases, and the computing systems it covers would draw enough power to supply more than five million United States households. The deal deepens the relationship between two of the most powerful players in artificial intelligence.
Meanwhile, Google Cloud is placing a different bet entirely. While the industry’s biggest players cement ever-tighter partnerships, Google is focused on capturing the next generation of AI companies before they become too large to court.
Francis deSouza, the Chief Operating Officer of Google Cloud, has seen the AI revolution from multiple vantage points. As the former CEO of genomics giant Illumina, he watched machine learning transform drug discovery. As a co-founder of a two-year-old AI alignment startup called Synth Labs, he has grappled with the safety challenges of increasingly powerful models. Now, having joined the leadership team at Google Cloud in January, he is orchestrating a massive wager on AI’s second wave.
deSouza likes to tell this story in numbers. Nine of the top ten AI labs use Google's infrastructure, he noted. Nearly all generative AI unicorns run on Google Cloud, sixty percent of generative AI startups worldwide have chosen Google as their cloud provider, and the company has lined up fifty-eight billion dollars in new revenue commitments over the next two years, more than double its current annual run rate.
AI, he said, is resetting the cloud market, and Google Cloud is leading the way, especially with startups. His understated delivery masks an ambitious strategy: focus on the upstarts while the biggest players are busy striking lucrative but non-exclusive partnerships.
The Nvidia and OpenAI deal exemplifies the scale of consolidation sweeping AI infrastructure. Microsoft’s original one billion dollar investment in OpenAI has grown to nearly fourteen billion dollars, fundamentally reshaping the cloud market. Amazon followed with an eight billion dollar investment in Anthropic, securing deep hardware customizations that essentially tailor AI training to work better with Amazon’s infrastructure.
Oracle has emerged as a surprise winner, landing a thirty billion dollar cloud deal with OpenAI and then securing a three hundred billion dollar five-year commitment starting in 2027. Even Meta, despite building its own infrastructure, signed a ten billion dollar deal with Google Cloud while planning six hundred billion dollars in United States infrastructure spending through 2028. The Trump administration’s five hundred billion dollar Stargate project, involving SoftBank, OpenAI, and Oracle, adds another layer to these interlocking partnerships.
These gigantic deals might seem threatening for Google, as companies like OpenAI and Nvidia appear to be cementing partnerships elsewhere. It looks a lot like Google is being cut out of some frenzied dealmaking. But the corporate behemoth is not sitting on its hands. Instead, Google Cloud is signing smaller companies like Lovable and Windsurf as primary computing partners without major upfront investments.
deSouza calls these the next generation of companies coming up. The approach reflects both opportunity and necessity. In a market where a startup can become a multi-billion-dollar company in a remarkably short time, capturing future unicorns before they mature could prove more valuable than fighting over today's giants.
The strategy extends beyond simple customer acquisition. Google offers AI startups three hundred and fifty thousand dollars in cloud credits, access to its technical teams, and go-to-market support through its marketplace. Google Cloud also provides what deSouza describes as a no-compromise AI stack, from chips to models to applications, with an open ethos that gives customers choice at every layer.
Companies, he said, love that they can get access to the AI stack and to technical teams who can explain where the technologies are going. They also love that they are getting access to enterprise-grade, Google-class infrastructure.
This infrastructure advantage became more apparent this month when reporting revealed Google's behind-the-scenes maneuvering to expand its custom AI chip business. Google has struck deals to place its Tensor Processing Units (TPUs) in other cloud providers' data centers for the first time, including an agreement with London-based Fluidstack that includes up to three point two billion dollars in financial backing for a New York facility.
Competing directly with AI companies while simultaneously providing them infrastructure requires finesse. Google Cloud provides its processing chips to OpenAI and hosts Anthropic's Claude model through its Vertex AI platform, even as its own Gemini models compete head-to-head with both. Alphabet, Google's parent company, also owns a fourteen percent stake in Anthropic, according to court documents. When asked directly about the financial relationship, deSouza called it a multi-layered partnership and redirected to Google Cloud's model garden, noting that customers can access various foundation models there.
If Google is trying to be a neutral platform while advancing its own agenda, it has had plenty of practice. The approach has roots in Google's open-source contributions, from Kubernetes to the foundational Attention Is All You Need paper that introduced the transformer architecture underlying most modern AI. More recently, Google published an open protocol called Agent2Agent (A2A) for inter-agent communication, an attempt to demonstrate its continued commitment to openness even in competitive areas.
deSouza acknowledged that the company has made an explicit choice over the years to be open at every layer of the stack. Google knows this means other companies can take its technology and use it to build competitors at the next layer, he said; that has been happening for decades, and the company is okay with it.
Google Cloud’s courtship of startups comes at a particularly interesting moment. This month, a federal judge delivered a nuanced ruling in the government’s five-year-old search monopoly case, attempting to curb Google’s dominance without hampering its AI ambitions. While Google avoided the most severe proposed penalties, the ruling underscored regulatory concerns about the company leveraging its search monopoly to dominate AI. Critics worry that Google’s vast trove of search data provides an unfair advantage and that the company could deploy the same monopolistic tactics that secured its search dominance.
In conversation, deSouza is focused on positive outcomes. He outlined a vision where Google Cloud helps power research into Alzheimer’s, Parkinson’s, and climate technologies. He said they want to work very hard to make sure they are pioneering the technologies that will enable that work.
Critics may not be easily assuaged. By positioning itself as an open platform that empowers the next generation of AI companies, Google Cloud may be showing regulators that it fosters competition rather than stifles it. This strategy also forges relationships with startups that might help Google’s case if regulators ramp up pressure.

