AI’s insatiable demand for data has exposed a growing problem: storage infrastructure isn’t keeping up. From training foundation models to running real-time inference, AI workloads require high-throughput, low-latency access to vast amounts of data spread across cloud, edge, and on-prem environments. Traditional storage systems have often struggled under the weight of these demands, creating bottlenecks that can drastically delay innovation in the AI space.
Today, DDN unveiled Infinia 2.0, a major update to its AI-focused, software-defined data storage platform designed to eliminate the inefficiencies in AI storage and data management. The company says Infinia 2.0 acts as a unified, intelligent data layer that dynamically optimizes AI workflows.
“Infinia 2.0 isn’t just an upgrade—it’s a paradigm shift in AI data management,” DDN CEO Alex Bouzari says, emphasizing how Infinia builds on the company’s deep-rooted expertise in HPC storage to power the next generation of AI-driven data services.
A rendering of a large-scale Infinia 2.0 configuration from DDN's Beyond Artificial virtual event.
As AI adoption grows, the challenges of scale, speed, and efficiency become more apparent. LLMs, generative AI applications, and inference systems require not only massive datasets but the ability to access and process them faster than ever. Traditional storage solutions struggle with performance bottlenecks, making it difficult for GPUs to receive the data they need quickly enough and limiting overall training efficiency. At the same time, organizations must navigate the fragmentation of data across multiple locations, from structured databases to unstructured video and sensor data. Moving data between these environments creates inefficiencies, driving up operational costs and creating latency issues that slow AI applications.
DDN claims Infinia 2.0 solves these challenges by integrating real-time AI data pipelines, dynamic metadata-driven automation, and multi-cloud unification, all optimized specifically for AI workloads. Rather than forcing enterprises to work with disconnected data lakes, Infinia 2.0 introduces a Data Ocean, a unified global view that eliminates redundant copies and lets organizations process and analyze their data wherever it resides. This is meant to reduce storage sprawl and to allow AI models to search and retrieve relevant data more efficiently using an advanced metadata tagging system. With virtually unlimited metadata capabilities, AI applications can associate vast amounts of metadata with each object, making search and retrieval operations dramatically faster.
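DDN has not published Infinia's API, but the general idea behind metadata-driven retrieval, answering queries by filtering rich per-object tags rather than scanning the objects themselves, can be illustrated with a minimal in-memory sketch. Every name below (`MetadataStore`, `put`, `query`) is hypothetical and is not Infinia's actual interface:

```python
# Hypothetical sketch of metadata-driven object retrieval.
# Arbitrary key/value metadata is stored alongside each object, and
# queries are answered from the metadata index, never the raw payloads.

class MetadataStore:
    def __init__(self):
        self._objects = {}   # object_id -> payload (bytes, path, etc.)
        self._metadata = {}  # object_id -> dict of tags

    def put(self, object_id, payload, **tags):
        """Store an object together with any number of metadata tags."""
        self._objects[object_id] = payload
        self._metadata[object_id] = dict(tags)

    def query(self, **criteria):
        """Return IDs of objects whose metadata matches all criteria."""
        return [
            oid for oid, tags in self._metadata.items()
            if all(tags.get(k) == v for k, v in criteria.items())
        ]

store = MetadataStore()
store.put("img-001", b"...", modality="image", label="cat", split="train")
store.put("img-002", b"...", modality="image", label="dog", split="val")
store.put("txt-001", b"...", modality="text", split="train")

print(store.query(split="train"))                  # ['img-001', 'txt-001']
print(store.query(modality="image", label="cat"))  # ['img-001']
```

In a real system the metadata would of course live in a distributed index rather than a Python dict; the point is only that selection happens without touching the objects' raw data.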
Infinia 2.0 integrates with frameworks like TensorFlow and PyTorch, which the company says eliminates the need for complex format conversions, allowing AI execution engines to interact with data directly and significantly speed up processing times. The platform is also designed for extreme scalability, supporting deployments that range from a few terabytes to exabytes of storage, making it flexible enough to meet the needs of both startups and enterprise-scale AI operations.
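To see why direct framework integration avoids format conversions, note that PyTorch's map-style dataset protocol only requires `__len__` and `__getitem__`, so a storage client can be wrapped and fed straight to a training loop with no intermediate export step. The sketch below uses a stand-in client (`FakeInfiniaClient` is invented for illustration, not a real DDN API) and needs no framework imports:

```python
# Hypothetical sketch: exposing an object store directly to a training
# framework. PyTorch map-style datasets only need __len__/__getitem__,
# so samples can be read straight from storage with no copy or export.

class FakeInfiniaClient:
    """Stand-in for a storage client; serves objects by key."""
    def __init__(self, objects):
        self._objects = objects

    def list_keys(self, prefix):
        return sorted(k for k in self._objects if k.startswith(prefix))

    def get(self, key):
        return self._objects[key]

class ObjectStoreDataset:
    """Map-style dataset reading samples directly from the store."""
    def __init__(self, client, prefix):
        self.client = client
        self.keys = client.list_keys(prefix)

    def __len__(self):
        return len(self.keys)

    def __getitem__(self, idx):
        return self.client.get(self.keys[idx])

client = FakeInfiniaClient({"train/0": [1.0], "train/1": [2.0], "val/0": [9.0]})
ds = ObjectStoreDataset(client, prefix="train/")
print(len(ds), ds[0])  # 2 [1.0]
```

An object of this shape could be handed to `torch.utils.data.DataLoader` as-is; the storage layer becomes the dataset, with no staging copy in between.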
Performance is another area where Infinia 2.0 could be a breakthrough. The platform boasts 100x faster metadata processing, reducing lookup times from over ten milliseconds to less than one. AI pipelines execute 25x faster, while the system can handle up to 600,000 object lists per second, surpassing the limits of even AWS S3. By leveraging these capabilities, DDN asserts, AI-driven organizations can ensure their models are trained, refined, and deployed with minimal lag and maximum efficiency.
(Source: DDN)
During a virtual launch event today called Beyond Artificial, DDN’s claims were bolstered by strong endorsements from industry leaders like Nvidia CEO Jensen Huang, who highlighted Infinia’s potential to redefine AI data management, emphasizing how metadata-driven architectures like Infinia transform raw data into actionable intelligence. Enterprise computing leader Lenovo also praised the platform, underscoring its ability to merge on-prem and cloud data for more efficient AI deployment.
Supermicro, another DDN partner, also endorses Infinia: “At Supermicro, we’re proud to partner with DDN to transform how organizations leverage data to drive business success,” said Charles Liang, founder, president, and CEO at Supermicro. “By combining Supermicro’s high-performance, energy-efficient hardware with DDN’s innovative Infinia platform, we empower customers to accelerate AI workloads, maximize operational efficiency, and reduce costs. Infinia’s seamless data unification across cloud, edge, and on-prem environments enables businesses to make faster, data-driven decisions and achieve measurable outcomes, aligning perfectly with our commitment to delivering optimized, sustainable infrastructure solutions.”
At the Beyond Artificial event, Bouzari and Huang sat down for a fireside chat to reflect on how an earlier idea, born from a 2017 meeting with Nvidia, evolved into the Infinia platform.
DDN had been asked to help build a reference architecture for AI computing, but Bouzari saw a much bigger opportunity. If Huang’s vision for AI was going to materialize, the world would need a fundamentally new data architecture, one that could scale AI workloads, eliminate latency, and transform raw information into actionable intelligence.
At the Beyond Artificial event, Huang and Bouzari sit down for a fireside chat about the bigger picture of storage and AI.
Infinia is more than just storage, Bouzari says; it fuels AI systems the way energy fuels a brain. And according to Huang, that distinction is critical.
“One of the most critical things people forget is the importance of data that’s essential during application, not just during training,” Huang notes. “You want to train on an enormous amount of data for pretraining, but during use, the AI has to access information, and AIs want to access information not in raw data form, but in informational flow.”
This shift from traditional storage to AI-native data intelligence has profound implications, the CEOs say. Instead of treating storage as a passive repository, DDN and Nvidia are turning it into an active layer of intelligence, enabling AI to retrieve insights instantly.
“This is the reason why the reframing of storage of objects and raw data into data intelligence is this new opportunity for DDN, providing data intelligence for all the world's enterprises as AIs run on top of this fabric of information,” Huang says, calling it “an extraordinary reframing of computing and storage.”
Reframing indeed seems necessary: as AI continues to evolve, the infrastructure supporting it must evolve as well. DDN’s Infinia 2.0 could represent a major shift in how enterprises approach AI storage, not as a passive archive but as an active intelligence layer that fuels AI systems in real time. By eliminating traditional bottlenecks, unifying distributed data, and integrating seamlessly with AI frameworks, Infinia 2.0 aims to reshape how AI applications access, process, and act on information.
With endorsements from industry leaders like Nvidia, Supermicro, and Lenovo, and with its latest funding round of $300 million at a $5 billion valuation, DDN is positioning itself as a key player in the AI landscape. Whether Infinia 2.0 delivers on its ambitious promises remains to be seen, but one thing is clear: AI's next frontier isn’t just about models and compute but about rethinking data itself. And with this launch, DDN is making the case that the future of AI hinges on new paradigms for data management.
Learn more about the technical aspects of Infinia 2.0 at this link, or watch a replay of Beyond Artificial here.