AI Framework DIMON Cracks Complex Equations at Supercomputer Speed—No HPC Required

Researchers at Johns Hopkins University have created a new AI framework that can quickly predict solutions to partial differential equations (PDEs) in scientific and engineering research.

Diffeomorphic Mapping Operator Learning, or DIMON, is a framework that solves partial differential equations on desktop computers thousands of times faster than supercomputer-based approaches. The research appears in the journal Nature Computational Science.

Modeling Dynamic Structures: A Computational Bottleneck

Scientists use partial differential equations to create mathematical models that describe how physical systems evolve across time and space, enabling predictions of real-world changes in objects and environments.
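For readers unfamiliar with the term, a PDE relates a quantity's rates of change in time and space. A standard textbook example, used here purely as an illustration and not taken from the paper, is the heat equation, which governs how temperature spreads through a material:

```latex
% Heat equation: u(x, t) is temperature, \alpha is the material's thermal diffusivity.
% Illustrative textbook example only -- not an equation from the DIMON paper.
\frac{\partial u}{\partial t} = \alpha \, \nabla^2 u
```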

Solving PDEs has long been central to scientific and engineering challenges, from simulating fluid dynamics to predicting heart function. But, as the researchers note in their abstract, the associated computational costs climb quickly as problem complexity grows.

When modeling complex structures like aircraft components or biological tissues, researchers typically rely on numerical methods that divide the shapes into smaller grids. This allows calculations to be performed on each section individually before combining the results into a complete solution. However, when the structure’s shape changes, the grids must be updated and the solutions recalculated, leading to higher computing costs and slower results.
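For intuition, here is a minimal sketch of that grid-based approach: a one-dimensional finite-difference solver for the heat equation, written in Python as a generic illustration rather than code from the paper. Note how the grid is tied to the domain; if the domain's shape or size changes, the grid and the entire time-stepping loop must be rebuilt and rerun.

```python
import numpy as np

# Minimal 1D finite-difference solver for the heat equation u_t = alpha * u_xx.
# Generic illustration of grid-based numerical solving -- not code from the DIMON paper.

def solve_heat_1d(length=1.0, n_points=101, alpha=0.01, dt=1e-4, n_steps=5000):
    x = np.linspace(0.0, length, n_points)        # grid tied to this particular domain
    dx = x[1] - x[0]
    u = np.exp(-100.0 * (x - length / 2) ** 2)    # initial temperature bump in the middle
    for _ in range(n_steps):
        # update interior points from their neighbors (endpoints held fixed)
        u[1:-1] += alpha * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return x, u

# Changing the geometry (e.g., length=1.2) forces a new grid and a full re-solve.
x, u = solve_heat_1d()
```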

DIMON uses AI to understand how physical systems behave across different shapes, without needing to recalculate everything from scratch for each new shape, according to a report on Johns Hopkins’ Hub. Instead of dividing shapes into grids and solving equations over and over, the AI predicts how factors such as heat, stress, or motion will behave based on patterns it has learned, making it much faster and more efficient in tasks like optimizing designs or modeling shape-specific scenarios, the report said.
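Conceptually, that replaces a fresh numerical solve for every new shape with a single forward pass through a trained model. The sketch below illustrates the general surrogate-modeling idea in Python; the model, shape parameters, and training data are placeholders for illustration, not DIMON's actual architecture or data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Generic surrogate-model sketch: learn a map from shape parameters to a solution
# field sampled at fixed points. All data below is synthetic placeholder data.

rng = np.random.default_rng(0)
shape_params = rng.uniform(0.5, 1.5, size=(500, 3))                 # e.g., lengths, curvatures
solution_fields = np.sin(shape_params @ rng.normal(size=(3, 50)))   # stand-in for solver outputs

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
surrogate.fit(shape_params, solution_fields)        # trained once, offline

new_shape = rng.uniform(0.5, 1.5, size=(1, 3))
predicted_field = surrogate.predict(new_shape)      # near-instant, no re-meshing or re-solving
```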

Speeding Up Heart Risk Assessments with DIMON

Natalia Trayanova is a Johns Hopkins University biomedical engineering and medicine professor who co-led the research. Trayanova's team studies cardiac arrhythmia, which is an irregular heartbeat caused by disruptions in the heart's electrical signaling.

A figure from the research paper showing cardiac magnetic resonance scans and network prediction of activation times and repolarization times of a propagating electrical signal. (Source: Nature Computational Science)

The team is incorporating data related to cardiac pathology that leads to arrhythmia into the DIMON framework to model human hearts and more quickly predict arrhythmia risk.

Trayanova, who also directs the Johns Hopkins Alliance for Cardiovascular Diagnostic and Treatment Innovation, says it takes about a week from when a patient’s heart is scanned to when doctors can assess their risk for sudden cardiac death and develop a treatment plan. With DIMON, this lag time is reduced to seconds and does not require supercomputing infrastructure.

“With this new AI approach, the speed at which we can have a solution is unbelievable,” she said. “The time to calculate the prediction of a heart digital twin is going to decrease from many hours to 30 seconds, and it will be done on a desktop computer rather than on a supercomputer, allowing us to make it part of the daily clinical workflow.”

DIMON is also enabling longer-term cardiac research. Trayanova's team tested the AI framework on more than 1,000 heart digital twins modeled from real patients' hearts. The platform was able to predict how electrical signals propagated through each unique heart shape, achieving high prognostic accuracy, according to the Johns Hopkins Hub report.

DIMON's Broad Potential Across Scientific Fields

One of the most exciting aspects of the new framework is its potential to generalize to many other research applications.

A figure from the paper: estimated memory usage for training data at the spatial resolution of CT (dashed) and MRI (dash-dot), comparing inefficient (red) and efficient (blue) approaches to geometry parameterization. (Source: Nature Computational Science)

“While the motivation to develop it came from our own work, this is a solution that we think will have generally a massive impact on various fields of engineering because it's very generic and scalable,” Trayanova said in the Hub report. “It can work basically on any problem, in any domain of science or engineering, to solve partial differential equations on multiple geometries, like in crash testing, orthopedics research, or other complex problems where shapes, forces, and materials change.”

Minglang Yin, a Johns Hopkins Biomedical Engineering postdoctoral fellow who developed the platform, says DIMON’s versatility allows it to be applied to shape optimization and other engineering tasks that require frequent solutions of partial differential equations on new shapes.

“For each problem, DIMON first solves the partial differential equations on a single shape and then maps the solution to multiple new shapes. This shape-shifting ability highlights its tremendous versatility,” Yin said. “We are very excited to put it to work on many problems as well as to provide it to the broader community to accelerate their engineering design solutions.”
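Yin's description points at the idea behind the framework's name: learn the solution on one reference shape and transport it to new shapes through a smooth, invertible map (a diffeomorphism). A minimal conceptual sketch of that transport step follows; the mapping and the solution field are made up for illustration, whereas in the actual framework both the geometry mapping and the operator on the reference shape are learned from data.

```python
import numpy as np

# Conceptual sketch of diffeomorphic transport -- illustrative only.
# A new domain is related to a reference domain by a smooth, invertible map phi.
# A solution known on the reference domain is evaluated at the mapped-back points
# to obtain a prediction on the new shape. The map and the field below are made up.

def phi_inverse(points_new, stretch=1.2):
    """Map points on the new (stretched) domain back to the reference domain."""
    return points_new / stretch

def reference_solution(points_ref):
    """Stand-in for a solution field learned or computed on the reference shape."""
    return np.sin(np.pi * points_ref)

points_on_new_shape = np.linspace(0.0, 1.2, 7)   # sample locations on a stretched domain
solution_on_new_shape = reference_solution(phi_inverse(points_on_new_shape))
print(solution_on_new_shape)
```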

AI’s Role in Democratizing Scientific Innovation

As AI-driven tools like DIMON continue to evolve, they promise to unlock new scientific possibilities, making research faster, more accessible, and less dependent on vast computational resources.

Traditional scientific simulations, such as those used in weather forecasting, fluid dynamics, or materials science, often depend on high-performance computing clusters and supercomputers. These simulations involve solving complex mathematical equations that demand significant computational power and time.

AI-driven approaches offer an alternative by learning patterns and relationships within scientific data to approximate the outcomes of computationally intensive simulations, reducing the need for repetitive calculations.

Leveraging AI for science allows researchers to conduct large-scale experiments and run simulations with fewer hardware requirements, which can democratize access to scientific breakthroughs across many disciplines.
