Oracle Cloud Infrastructure (OCI) is bringing NVIDIA’s Blackwell Ultra GPUs to its cloud platform, a move announced at the GTC 2025 AI conference. While this expands OCI’s capabilities, it also demands new infrastructure solutions, such as implementing liquid cooling in its data centres. But that comes with its own challenges.
“Most data centres aren’t ready for liquid cooling,” said Karan Batta, senior vice president at OCI, in an exclusive interview with AIM, acknowledging the complexity of managing the heat produced by the new generation of GPUs.
He added that cloud providers must choose between passive or active cooling, full-loop systems, or sidecar approaches to integrate liquid cooling effectively. Batta further noted that while server racks follow a standard design (and can be copied from NVIDIA’s setup), the real complexity lies in data centre design and networking.
Batta explained that today, every cloud provider essentially buys a rack from NVIDIA. “The differentiation comes from the data centre design—how hot you can run these GPUs and how much you can scale them,” he said, adding that ensuring the highest uptime and minimising failures is critical.
“The biggest challenge is not deploying the GPUs—anybody can do that—but actually managing and operating a massive GPU cluster,” Batta said.
Built on the Blackwell architecture introduced last year, Blackwell Ultra features the NVIDIA GB300 NVL72 rack-scale solution and the NVIDIA HGX B300 NVL16 system. The GB300 NVL72 delivers 1.5 times the AI performance of the NVIDIA GB200 NVL72.
Last year, Oracle also announced the launch of the world’s first zettascale cloud computing clusters powered by NVIDIA Blackwell GPUs. These clusters offer up to 131,072 GPUs and deliver 2.4 zettaFLOPS of peak performance.
Batta added that NVIDIA’s DGX Cloud offering is also hosted on Oracle Cloud Infrastructure. “As we launch GB200 this quarter and later GB300, DGX Cloud will continue to run on our infrastructure,” he said.
Additionally, Batta mentioned that Oracle is collaborating with other cloud service providers, such as Google and Microsoft Azure, to establish multi-cloud partnerships at the infrastructure level by deploying OCI (Oracle Cloud Infrastructure) inside their data centres.
“We’re already doing a lot with Microsoft by integrating Oracle databases and various other services. With Google, it also makes sense because their customer base is different from ours—there’s no overlap,” said Batta, adding that this leaves room for collaboration, especially since Google has a strong AI model, Gemini.
Speaking of compute needs, he said demand is not going to slow down. “It will only increase as customers find more use cases and more inferencing to do,” Batta said. Oracle’s strategy is to be an open cloud provider that offers a wide variety of AI models rather than favouring any particular one. “We’re already collaborating with OpenAI, Meta, and Cohere, and we continuously update our offerings with the latest versions.”
OCI x NVIDIA AI Enterprise
Oracle has partnered with NVIDIA AI Enterprise, allowing customers to accelerate AI adoption, including sovereign AI initiatives. This cloud-native software platform will be available across OCI’s distributed cloud and purchasable using Oracle Universal Credits.
Batta said Oracle customers can now access the NVIDIA AI Enterprise suite within Oracle Cloud. He explained that customers don’t need to purchase it separately; instead, they can use their existing Oracle Cloud credits to access it.
Unlike other NVIDIA AI Enterprise offerings, OCI will make it accessible directly through the OCI Console, enabling faster deployment, direct billing, and customer support.
Customers can use over 160 AI tools, including NVIDIA NIM microservices, to streamline generative AI model deployment. The integration allows enterprises to build applications and manage data across multiple deployment environments.
Batta said that for Oracle, ‘distributed cloud’ refers not just to the commercial cloud but also to environments such as OCI’s public regions, Government Clouds, sovereign clouds, OCI Dedicated Region, Oracle Alloy, OCI Compute Cloud@Customer, and OCI Roving Edge Devices.
He further added that Nomura Research Institute (NRI), one of the largest financial system integrators in Japan, uses Oracle Alloy to deliver customised cloud services with NVIDIA Hopper GPUs and plans to deploy NVIDIA AI Enterprise to support AI use cases.
“Half of the Nikkei Index runs through their books, and all of that operates on Dedicated Region—one in Tokyo and a disaster recovery site in Osaka. They’re also deploying GPUs and have access to the NVIDIA AI Enterprise suite of software as well,” Batta said.
Speaking of India, he said that Oracle already has two cloud regions in India and is building a third.
“We also have a multi-cloud strategy in India, where we’re partnering with AWS, Google, and Microsoft to interconnect our regions and provide our database services through these cloud providers,” he concluded.
The post ‘Most Data Centres Are Not Ready for Liquid Cooling’, says Oracle Exec on NVIDIA Blackwell appeared first on Analytics India Magazine.