Generating synthetic data for neural operators

Rachel Ward
University of Texas at Austin
Mathematics

Numerous developments show the promising potential of deep learning for obtaining numerical solutions to partial differential equations beyond the reach of current numerical solvers. However, data-driven neural operators face a bottleneck: the data needed to train such a network must itself be generated by classical numerical solvers, such as finite difference or finite element methods. We propose a different approach to generating synthetic functional training data that does not require solving a PDE numerically. The proposed `backwards' approach to generating training data requires only derivative computations, in contrast to standard `forward' approaches, which require a numerical PDE solver; this enables us to generate many training pairs quickly and efficiently. Open research directions and limitations are discussed.
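To make the contrast concrete, here is a minimal sketch of the `backwards' idea for one illustrative operator, the 1D Poisson problem -u'' = f on [0, 1] with zero boundary conditions. The specific operator, the sine-series parameterization, and the function name `make_pair` are assumptions chosen for illustration, not the abstract's actual method: we sample a candidate solution u directly and obtain the corresponding input f by differentiation, so no PDE solve is needed.

```python
import numpy as np

def make_pair(n_modes=8, n_grid=128, rng=None):
    """Backwards data generation for -u'' = f on [0, 1], u(0) = u(1) = 0.

    Illustrative sketch (hypothetical operator and parameterization):
    instead of sampling f and solving the PDE for u (the 'forward' route),
    sample u directly and apply the differential operator to recover f.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.linspace(0.0, 1.0, n_grid)
    k = np.arange(1, n_modes + 1)
    # Sample u as a random sine series; decaying coefficients keep u smooth.
    a = rng.standard_normal(n_modes) / k**2
    modes = np.sin(np.pi * k[:, None] * x)          # shape (n_modes, n_grid)
    u = (a[:, None] * modes).sum(axis=0)
    # Apply -d^2/dx^2 analytically: each sine mode is an eigenfunction,
    # so -u'' = sum_k a_k (k*pi)^2 sin(k*pi*x). Only differentiation, no solve.
    f = (a[:, None] * (np.pi * k[:, None])**2 * modes).sum(axis=0)
    return f, u  # one training pair: input f, target solution u
```

Each call costs only a few vectorized array operations, so large synthetic datasets of (f, u) pairs can be produced far more cheaply than by running a numerical solver once per sample.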


Workshop: Sampling, Inference, and Data-Driven Physical Modeling in Scientific Machine Learning