NVIDIA Pioneers Physical AI in Healthcare

NVIDIA chief Jensen Huang believes the next wave of AI is physical AI. “Everything is going to be robotic,” he said, adding that an entire ecosystem of robots will emerge in which factories orchestrate robots and those robots, in turn, build robotic products. NVIDIA is banking on Omniverse to make this happen.

Physical AI, which uses advanced simulations and learning methods, is transforming sectors like manufacturing and healthcare by enhancing the ability of robots and infrastructure to perceive, reason, and navigate.

In an interview with AIM, Kimberly Powell, vice president of healthcare at NVIDIA, said, “We are on the cusp of a breakthrough moment for physical AI—models that can understand, interact, and navigate the world. Physical AI will embody robotic systems.”

Powell added that medical devices, treatment facilities, and operating rooms will all be robotic systems, embodied by physical AI.

Speaking on the development, a Reddit user asserted, “I feel like the Omniverse in general is a huge step towards AGI if used properly. If we properly mimic the real world, AI can learn about it at an accelerated pace.”

What Does It Take to Build Physical AI?

Powell explained that building physical AI requires three computers and their software stacks: models are first trained on NVIDIA DGX, then tested and validated in NVIDIA Omniverse, and finally deployed into the real world, on robots, with NVIDIA Holoscan.
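To make the deployment step concrete, here is a minimal sketch of an edge pipeline using the Holoscan SDK’s Python API. The operator names, the placeholder sensor frames and the stand-in inference step are illustrative assumptions, not NVIDIA’s or any device maker’s actual code.

```python
from holoscan.conditions import CountCondition
from holoscan.core import Application, Operator, OperatorSpec


class SensorOp(Operator):
    """Emits a placeholder frame, standing in for a medical-device sensor feed."""

    def setup(self, spec: OperatorSpec):
        spec.output("frame")

    def compute(self, op_input, op_output, context):
        op_output.emit({"pixels": [0] * 16}, "frame")


class InferenceOp(Operator):
    """Applies a stand-in 'model' to each frame and reports the result."""

    def setup(self, spec: OperatorSpec):
        spec.input("frame")

    def compute(self, op_input, op_output, context):
        frame = op_input.receive("frame")
        print("ran inference on a frame with", len(frame["pixels"]), "pixels")


class EdgeApp(Application):
    def compose(self):
        # Limit the sensor to five ticks so the sketch terminates on its own.
        sensor = SensorOp(self, CountCondition(self, 5), name="sensor")
        infer = InferenceOp(self, name="inference")
        self.add_flow(sensor, infer, {("frame", "frame")})


if __name__ == "__main__":
    EdgeApp().run()
```

In a real deployment, the sensor operator would wrap a device’s video or imaging stream and the inference operator would run a model validated earlier in simulation.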

The NVIDIA Inception program, which supports over 3,000 healthcare and life sciences (HCLS) startups globally, plays a critical role in the growth of these fields. The free program is designed to help startups evolve faster through cutting-edge technology, connect them with venture capitalists, and give them access to NVIDIA’s latest technical resources.

According to Powell, Omniverse adoption among HCLS startups is still in its early stages, as this new industry of physical AI-embodied robotic systems is just emerging.

She went on to cite the example of NVIDIA Inception member Medical IP, which built a digital twin-based medical AI solution called MEDIP PRO using NVIDIA Omniverse.

Powell said that MEDIP PRO leverages Omniverse to let users build 3D models from medical images in both real-time and virtual spaces, expanding simulation of the human body into the metaverse without the limitations of time and space.

“Now FDA-approved, Medical IP’s use of Omniverse has enabled use cases from virtual surgery and blood flow simulation to XR augmentation of medical images,” she added.
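Omniverse scenes are described in OpenUSD, so a digital-twin workflow like the one described above ultimately authors USD geometry. The snippet below is a minimal, hypothetical sketch of writing a placeholder anatomical surface with the pxr USD Python bindings; Medical IP’s actual pipeline, prim names and meshes are not public and are assumed here.

```python
from pxr import Usd, UsdGeom

# Author a tiny USD stage holding a placeholder anatomical surface.
stage = Usd.Stage.CreateNew("patient_twin.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

UsdGeom.Xform.Define(stage, "/Patient")
mesh = UsdGeom.Mesh.Define(stage, "/Patient/OrganSurface")

# A single triangle stands in for a surface that would normally be
# reconstructed from segmented CT or MRI slices.
mesh.CreatePointsAttr([(0, 0, 0), (10, 0, 0), (0, 10, 0)])
mesh.CreateFaceVertexCountsAttr([3])
mesh.CreateFaceVertexIndicesAttr([0, 1, 2])

stage.GetRootLayer().Save()
print("wrote", stage.GetRootLayer().identifier)
```

The resulting .usda file can be opened in any USD-aware tool, which is what lets such models move between real-time and virtual spaces.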

Innovations on the Horizon

Similarly, Atlas Meditech, a brain-surgery intelligence platform, is adopting tools — including the MONAI medical imaging framework and NVIDIA Omniverse 3D development platform — to build AI-powered decision support and high-fidelity surgery rehearsal platforms.

Using NVIDIA Omniverse, the company is developing a virtual operating room that immerses surgeons in a realistic environment to rehearse upcoming procedures. In the simulation, surgeons can adjust how the patient and equipment are positioned.

The Atlas Pathfinder tool has adopted MONAI Label to create digital twins of patients’ brains. It can support radiologists by automatically annotating MRI and CT scans to segment normal structures and tumours.
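As a rough illustration of what such automatic annotation involves, here is a minimal MONAI-style segmentation sketch. The untrained UNet and the random volume are placeholders; Atlas Pathfinder’s actual models, labels and preprocessing are not public and are not reproduced here.

```python
import torch
from monai.inferers import sliding_window_inference
from monai.networks.nets import UNet

# 3D U-Net with two output channels: background vs. a structure of interest.
model = UNet(
    spatial_dims=3,
    in_channels=1,
    out_channels=2,
    channels=(16, 32, 64, 128),
    strides=(2, 2, 2),
)
model.eval()

# Stand-in for a preprocessed MRI/CT volume: batch x channel x D x H x W.
volume = torch.rand(1, 1, 96, 96, 96)

with torch.no_grad():
    # Patch-wise inference keeps memory bounded on large scans.
    logits = sliding_window_inference(
        volume, roi_size=(64, 64, 64), sw_batch_size=1, predictor=model
    )
    mask = torch.argmax(logits, dim=1)  # voxel-wise label map

print("segmentation mask shape:", tuple(mask.shape))
```

A trained model of this kind produces the label maps that a radiologist would then review and correct, rather than annotating every scan from scratch.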

Moon Surgical is designing Maestro, an accessible and adaptive surgical assistant robotic system. It is leveraging NVIDIA Clara Holoscan and Omniverse to develop and test its AI algorithm in a simulated environment.

In fact, in 2024, Microsoft and NVIDIA announced an expanded collaboration to enhance healthcare and life sciences. This partnership integrated Microsoft’s Azure platform with NVIDIA’s DGX Cloud and Clara suite.

Powell explained that the convergence of AI, cloud computing and healthcare is set to transform patient care. “Our collaboration with Microsoft will help unlock new possibilities and drive meaningful impact for patients worldwide,” she added.

NVIDIA BioNeMo, hosted on NVIDIA DGX Cloud on Azure, accelerates AI model development for drug discovery. Combining Azure’s capabilities with NVIDIA MONAI and the Nuance Precision Imaging Network (PIN), this collaboration facilitates large-scale, secure medical imaging AI model development, validation, and deployment.

Furthermore, Flywheel integrates MONAI with Azure AI to offer an optimised imaging data management solution, improving data curation and reducing research preparation and training time.

Microsoft, NVIDIA and SOPHiA GENETICS are partnering to create a scalable, comprehensive whole-genome analysis solution, integrating the SOPHiA DDM SaaS platform on Azure, powered by NVIDIA Parabricks, for advanced genomic analysis.

Furthermore, NVIDIA Parabricks v4.4 introduces new features, including accelerated pangenome graph alignment, as announced at the American Society of Human Genetics (ASHG) annual meeting.
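For a sense of how Parabricks slots into such pipelines, the sketch below launches a standard GPU-accelerated alignment step (pbrun fq2bam) from Python. The file paths are placeholders, the example does not exercise the new pangenome feature, and production workflows such as SOPHiA DDM’s will differ.

```python
import subprocess

# Illustrative invocation of Parabricks' fq2bam (GPU-accelerated BWA-MEM
# alignment plus sorting). All paths below are placeholder inputs/outputs.
cmd = [
    "pbrun", "fq2bam",
    "--ref", "GRCh38.fa",
    "--in-fq", "sample_R1.fastq.gz", "sample_R2.fastq.gz",
    "--out-bam", "sample.bam",
]
subprocess.run(cmd, check=True)
```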

“In addition to improving human-robot interaction in healthcare, we are focused on advancing digital health agents and drug discovery and designing AI factories,” said Powell.
