The $100 billion partnership between Nvidia and OpenAI, announced Monday, represents, for now, the most recent mega-deal reshaping the AI infrastructure landscape. The agreement includes non-voting shares tied to massive chip purchases and enough computing power for more than 5 million U.S. households, deepening the relationship between two of AI’s most powerful players.
Meanwhile, Google Cloud is placing a different bet entirely. While the industry’s largest players cement ever-tighter partnerships, Google is hellbent on capturing the next generation of AI companies before they become too big to court.
Francis deSouza, its COO, has seen the AI revolution from several vantage points. As the former CEO of genomics giant Illumina, he watched machine learning transform drug discovery. As co-founder of a two-year-old AI alignment startup, Synth Labs, he has grappled with the safety challenges of increasingly powerful models. Now, having joined the C-suite at Google Cloud in January, he’s orchestrating an enormous bet on AI’s second wave.
It’s a story deSouza likes to tell in numbers. In a conversation with this editor earlier this week, he noted several times that nine of the top 10 AI labs use Google’s infrastructure. He also said that nearly all generative AI unicorns run on Google Cloud, that 60% of all gen AI startups worldwide have chosen Google as their cloud provider, and that the company has lined up $58 billion in new revenue commitments over the next two years, more than double its current annual run rate.
Asked what percentage of Google Cloud’s revenue comes from AI companies, he offers instead that “AI is resetting the cloud market, and Google Cloud is leading the way, especially with startups.”
The Nvidia-OpenAI deal exemplifies the scale of consolidation sweeping AI infrastructure. Microsoft’s original $1 billion OpenAI investment has grown to nearly $14 billion, fundamentally reshaping the cloud market. Amazon followed with $8 billion in Anthropic investments, securing deep hardware customizations that essentially tailor AI training to work better with Amazon’s infrastructure. Oracle has emerged as a surprise winner, too, landing a $30 billion cloud deal with OpenAI and then securing a jaw-dropping $300 billion five-year commitment beginning in 2027.
Even Meta, despite building its own infrastructure, signed a $10 billion deal with Google Cloud while planning $600 billion in U.S. infrastructure spending by 2028. The Trump administration’s $500 billion “Stargate” project, involving SoftBank, OpenAI, and Oracle, adds another layer to these interlocking partnerships.
These gigantic deals might sound threatening for Google, given the partnerships that companies like OpenAI and Nvidia appear to be cementing elsewhere. In fact, it looks a lot like Google is being cut out of some frenzied dealmaking.

But the corporate behemoth isn’t exactly sitting on its hands. Instead, Google Cloud is signing smaller companies like Lovable and Windsurf, what deSouza calls the “next generation of companies coming up,” as “primary computing partners” without major upfront investments.
The approach reflects both opportunity and necessity. In a market where companies can go “from being a startup to being a multi-billion dollar company in a very short period of time,” as deSouza puts it, capturing future unicorns before they mature may prove more valuable than fighting over today’s giants.
The strategy extends beyond simple customer acquisition. Google offers AI startups $350,000 in cloud credits, access to its technical teams, and go-to-market support through its marketplace. Google Cloud also provides what deSouza describes as a “no compromise” AI stack, from chips to models to applications, with an “open ethos” that gives customers choice at every layer.
“Companies love the fact that they can get access to our AI stack, they can get access to our teams to understand where our technologies are going,” deSouza said during our interview. “They also love that they’re getting access to enterprise-grade, Google-class infrastructure.”
This infrastructure advantage became more apparent this month when reporting revealed Google’s behind-the-scenes maneuvering to expand its custom AI chip business. According to The Information, Google has struck deals to place its tensor processing units (TPUs) in other cloud providers’ data centers for the first time, including an agreement with London-based Fluidstack that includes up to $3.2 billion in financial backing for a New York facility.
Competing directly with AI companies while simultaneously providing them infrastructure requires finesse. Google Cloud supplies TPU chips to OpenAI and hosts Anthropic’s Claude model through its Vertex AI platform, even as its own Gemini models compete head-to-head with both. (Google Cloud’s parent company, Alphabet, also owns a 14% stake in Anthropic, per New York Times court documents obtained earlier this year, though when asked directly about Google’s financial relationship with Anthropic, deSouza called the relationship a “multi-layered partnership,” then quickly redirected to Google Cloud’s “model garden,” noting that customers can access numerous foundation models.)
But if Google is trying to be Switzerland while advancing its own agenda, it has had plenty of practice. The approach has roots in Google’s open-source contributions, from Kubernetes to the foundational “Attention Is All You Need” paper that enabled the transformer architecture underlying most modern AI. More recently, Google published an open-source protocol called Agent-to-Agent (A2A) for inter-agent communication in an attempt to demonstrate its continued commitment to openness even in competitive areas.
“We’ve made the explicit choice over time to be open at every layer of the stack, and we know that this means companies can absolutely take our technology and use it to build a competitor at the next layer,” deSouza acknowledged. “That’s been happening for decades. That’s something we’re okay with.”
Google Cloud’s courtship of startups comes at a particularly interesting moment. Just this month, federal judge Amit Mehta delivered a nuanced ruling in the government’s five-year-old search monopoly case, attempting to curb Google’s dominance without hampering its AI ambitions.
While Google avoided the Justice Department’s most severe proposed penalties, including the forced divestment of its Chrome browser, the ruling underscored regulatory concerns about the company leveraging its search monopoly to dominate AI. Critics worry, understandably, that Google’s vast trove of search data provides an unfair advantage in developing AI systems, and that the company could deploy the same monopolistic tactics that secured its search dominance.
In conversation, deSouza is focused on far more positive outcomes. “I think we have an opportunity to fundamentally understand some of the major diseases that today we just don’t have a good understanding of,” deSouza said, for example, outlining a vision where Google Cloud helps power research into Alzheimer’s, Parkinson’s, and climate technologies. “We want to work very hard to make sure that we’re pioneering the technologies that will enable that work.”
Critics may not be easily assuaged. By positioning itself as an open platform that empowers rather than controls the next generation of AI companies, Google Cloud may be showing regulators that it fosters competition rather than stifles it, all while forging relationships with startups that could help Google’s case if regulators ramp up pressure.
For our full conversation with deSouza, check out this week’s StrictlyVC Download podcast; a new episode comes out every Tuesday.