Learning Evolution Operators Across PDE Systems: Meta-Learning and Test-Time Generalization
Jiequn Han
Flatiron Institute
Learning reliable surrogate models for complex dynamical systems remains challenging, especially when the governing equations are unknown or only partially observed. In this talk, I present an end-to-end framework called DISCO for learning evolution operators directly from short trajectories. The method decouples dynamics identification from state evolution: a hypernetwork infers the parameters of a compact operator network that advances the system forward in time. Trained on a heterogeneous collection of PDE systems spanning fluid dynamics, active matter, and magnetohydrodynamics, among others, this meta-learning framework infers operators across tasks rather than specializing to a single fixed equation, enabling data-efficient multi-physics learning. Building on this formulation, I introduce a test-time generalization strategy based on neural operator splitting, in which unseen dynamics are approximated at inference time by composing pretrained operators rather than by retraining model weights. This compositional test-time computation enables zero-shot generalization to previously unseen regimes. Together, these results highlight structured operator learning and inference-time adaptation as a flexible framework for generalizable surrogate modeling of complex physical systems.
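To make the hypernetwork idea concrete, here is a minimal sketch (not the authors' DISCO implementation; all function names, shapes, and the linear parameterization are illustrative assumptions): a toy "hypernetwork" maps a short observed trajectory to the flattened parameters of a compact operator that advances the latest state one step forward.

```python
# Illustrative sketch only -- NOT the DISCO architecture. It shows the
# decoupling described in the abstract: a hypernetwork infers parameters
# theta from a short trajectory, and a separate operator network uses
# theta to advance the state in time.
import numpy as np

rng = np.random.default_rng(0)

def hypernetwork(trajectory, out_dim):
    """Toy hypernetwork: flatten the observed trajectory into a context
    vector and map it linearly to the operator's flattened weights."""
    context = trajectory.reshape(-1)
    W_hyper = rng.standard_normal((out_dim, context.size)) * 0.01
    return W_hyper @ context  # inferred operator parameters theta

def operator_step(state, theta):
    """Compact operator network: here just a linear map built from the
    hypernetwork output, applied to advance the state one time step."""
    n = state.size
    A = theta.reshape(n, n)
    return A @ state

# A short trajectory of a 4-dimensional state observed over 3 time steps.
traj = rng.standard_normal((3, 4))
theta = hypernetwork(traj, out_dim=4 * 4)    # identify the dynamics
next_state = operator_step(traj[-1], theta)  # evolve the state
print(next_state.shape)  # (4,)
```

The design point is that `theta` is inferred per trajectory at inference time, so the same pretrained hypernetwork can produce different operators for different PDE systems without retraining.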
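The test-time composition strategy can likewise be sketched in the spirit of classical Lie operator splitting (this is a hedged illustration with hand-coded stand-ins for pretrained operators, not the paper's neural operators): unseen combined dynamics are approximated by applying each pretrained single-physics operator in sequence over small time steps.

```python
# Illustrative sketch of operator splitting at inference time: two
# "pretrained" operators (stand-ins written as explicit finite-difference
# steps on a periodic 1D grid) are composed to approximate combined
# advection-diffusion dynamics without retraining any weights.
import numpy as np

def diffusion_step(u, dt=0.1):
    """Stand-in for a pretrained diffusion operator (explicit step)."""
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)
    return u + dt * lap

def advection_step(u, dt=0.1, c=1.0):
    """Stand-in for a pretrained advection operator (upwind step)."""
    return u - c * dt * (u - np.roll(u, 1))

def split_step(u, operators):
    """Lie splitting: compose the pretrained operators sequentially."""
    for op in operators:
        u = op(u)
    return u

u = np.zeros(32)
u[16] = 1.0  # initial bump
for _ in range(10):  # approximate advection-diffusion via composition
    u = split_step(u, [advection_step, diffusion_step])
print(round(float(u.sum()), 6))  # total mass is conserved: 1.0
```

Both finite-difference stand-ins conserve total mass on the periodic grid, which gives a quick sanity check that the composed evolution behaves like a single advection-diffusion step.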
This talk is based on joint work (arXiv:2504.19496; arXiv:2602.00884).