From Numerical Simulators to Neural Emulators and Back (Talk at RISE ML Seminar)


I had the honor to be invited to the RISE ML seminar series and speak about my current research. You can find the recording here and the slides here.

In it, I build a bigger narrative arc around the results from APEBench and my recent NeurIPS 2025 paper on Neural Emulator Superiority.

Below is the abstract of the talk:

The potential for computational speedups and for tackling unsolved problems has motivated the use of neural networks (NNs) to help solve PDEs. In particular, image-like models that operate on discretized state representations and advance time autoregressively have gained popularity in recent years.
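To make the autoregressive setting concrete, here is a minimal sketch of a rollout loop: the emulator maps the current discretized state to the next one, and each prediction is fed back as input. The `toy_emulator` below is a hypothetical stand-in (a simple periodic finite-difference diffusion step), not any trained network from the talk.

```python
import numpy as np

def rollout(emulator, u0, num_steps):
    """Autoregressively advance the state: each prediction feeds back as input."""
    trajectory = [u0]
    u = u0
    for _ in range(num_steps):
        u = emulator(u)  # one learned (or here, hand-coded) time step
        trajectory.append(u)
    return np.stack(trajectory)

# Hypothetical stand-in for a trained emulator: an explicit diffusion
# step on a 1D periodic grid, acting on the discretized state array.
def toy_emulator(u, nu=0.1):
    return u + nu * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

u0 = np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False))
traj = rollout(toy_emulator, u0, num_steps=10)
print(traj.shape)  # (11, 64): initial state plus 10 autoregressive steps
```

The key property is that errors made at one step become part of the input to the next, which is why long rollouts are the hard part of training such emulators.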

In this talk, I will present a holistic perspective on learning autoregressive neural emulators from simulated data: starting with synthetic data generation using classical numerical simulators, covering the training process, and ultimately investigating how to benchmark the resulting emulators. Drawing on a wide range of experiments across different PDEs and neural architectures, I will highlight the similarities between emulators and simulators, showing how emulator architectures are inspired by classical schemes for solving the laws of nature and thereby inherit both their merits and limitations.

Moreover, I will elaborate on the impact of reference data fidelity and discuss a counterintuitive yet interesting finding: the conditions under which emulators can become better than their training data source.