# More Details

## Background
Neural emulators are networks trained to efficiently predict physical phenomena, often associated with PDEs. In the simplest case this can be a linear advection equation, all the way up to more complicated Navier-Stokes cases. If we work on uniform Cartesian grids* (which this package assumes), one can borrow plenty of architectures from image-to-image tasks in computer vision (e.g., for segmentation). This includes:
- Standard Feedforward ConvNets
- Convolutional ResNets (He et al.)
- U-Nets (Ronneberger et al.)
- Dilated ResNets (Yu et al., Stachenfeld et al.)
- Fourier Neural Operators (Li et al.)
It is interesting to note that most of these architectures resemble classical numerical methods or at least share similarities with them. For example, ConvNets (or convolutions in general) are related to finite differences, while U-Nets resemble multigrid methods. Fourier Neural Operators are related to spectral methods. The difference is that the emulators' free parameters are found by (data-driven) numerical optimization rather than by symbolic manipulation of the differential equations.
(*) This means that we essentially have a pixel or voxel grid on which space is discretized. Hence, the space can only be the scaled unit cube.
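To make the finite-difference analogy concrete, below is a small, self-contained JAX sketch (not part of this package; all names are illustrative) in which a fixed convolution stencil reproduces the classical second-order Laplacian on a periodic grid. A learned convolutional emulator generalizes this by obtaining the stencil weights from data instead of deriving them by hand.

```python
import jax.numpy as jnp

# Periodic 1D grid on the unit interval (the scaled unit cube in 1D).
n = 64
dx = 1.0 / n
x = jnp.linspace(0.0, 1.0, n, endpoint=False)
u = jnp.sin(2 * jnp.pi * x)

# Fixed convolution stencil of the second-order central finite difference
# for the Laplacian; a learned ConvNet would instead obtain such weights
# from data.
stencil = jnp.array([1.0, -2.0, 1.0]) / dx**2
u_padded = jnp.pad(u, 1, mode="wrap")  # periodic boundary handling
laplacian = jnp.convolve(u_padded, stencil, mode="valid")

# The exact Laplacian of sin(2*pi*x) is -(2*pi)**2 * sin(2*pi*x); the
# discretization error shrinks with the grid spacing at second order.
error = jnp.max(jnp.abs(laplacian - (-((2 * jnp.pi) ** 2)) * u))
```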
## Boundary Conditions
This package assumes that the boundary condition is baked into the neural emulator. Hence, most components allow setting `boundary_mode`, which can be `"dirichlet"`, `"neumann"`, or `"periodic"`. This affects what is considered a degree of freedom in the grid.
Dirichlet boundaries fully eliminate degrees of freedom on the boundary. Periodic boundaries keep only one end of the domain as a degree of freedom (this package follows the convention that the left boundary is the degree of freedom). Neumann boundaries keep both ends as degrees of freedom.
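To illustrate this convention, here is a minimal sketch (a plain Python helper, not a function provided by this package) that counts the degrees of freedom of a 1D grid whose point count includes both boundary points:

```python
def count_degrees_of_freedom(num_points: int, boundary_mode: str) -> int:
    """Degrees of freedom of a 1D grid with `num_points` points (counting
    both boundary points), following the convention described above."""
    if boundary_mode == "dirichlet":
        # Both boundary values are prescribed and hence eliminated.
        return num_points - 2
    elif boundary_mode == "periodic":
        # Left and right boundary coincide; only the left one is kept.
        return num_points - 1
    elif boundary_mode == "neumann":
        # Both boundary points remain degrees of freedom.
        return num_points
    else:
        raise ValueError(f"Unknown boundary mode: {boundary_mode}")

assert count_degrees_of_freedom(10, "dirichlet") == 8
assert count_degrees_of_freedom(10, "periodic") == 9
assert count_degrees_of_freedom(10, "neumann") == 10
```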
## Constructors
There are two primary architectural constructors for Sequential and Hierarchical networks that allow for composability with the PDEquinox blocks.
### Sequential Constructor
The sequential network constructor is defined by:
* a lifting block
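As a hedged, conceptual sketch of what such a sequential composition looks like, the following plain Equinox module lifts the input channels, applies a sequence of convolutional inner blocks at constant resolution, and projects back to the output channels. It is not the actual Sequential constructor of this package; all class and argument names are illustrative.

```python
import equinox as eqx
import jax

# Conceptual sketch of the sequential pattern (lifting -> inner blocks ->
# projection) in plain Equinox. NOT the pdequinox Sequential constructor;
# all names are illustrative.
class SequentialEmulatorSketch(eqx.Module):
    lifting: eqx.nn.Conv1d
    blocks: list
    projection: eqx.nn.Conv1d

    def __init__(self, in_channels, hidden_channels, out_channels, num_blocks, *, key):
        keys = jax.random.split(key, num_blocks + 2)
        # Lift the physical fields into a higher-dimensional feature space.
        self.lifting = eqx.nn.Conv1d(in_channels, hidden_channels, 1, key=keys[0])
        # A sequence of convolutional inner blocks at constant resolution.
        self.blocks = [
            eqx.nn.Conv1d(hidden_channels, hidden_channels, 3, padding=1, key=k)
            for k in keys[1:-1]
        ]
        # Project back to the physical output channels.
        self.projection = eqx.nn.Conv1d(hidden_channels, out_channels, 1, key=keys[-1])

    def __call__(self, u):
        u = self.lifting(u)
        for block in self.blocks:
            u = jax.nn.relu(block(u))
        return self.projection(u)
```

An instance of this sketch maps an array of shape `(in_channels, num_points)` to one of shape `(out_channels, num_points)`.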
### Hierarchical Constructor
The hierarchical network constructor is defined by:
* a lifting block
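Analogously, a hedged, conceptual sketch of the hierarchical pattern: features are processed on a fine and a coarse resolution and merged via a skip connection, as in a (one-level) U-Net. Again, this is plain Equinox and not the actual Hierarchical constructor; all names are illustrative, and an even number of grid points is assumed.

```python
import equinox as eqx
import jax

# Conceptual sketch of the hierarchical pattern: lift, process on a fine and
# a coarse level, merge via a skip connection, project. NOT the pdequinox
# Hierarchical constructor; all names are illustrative.
class HierarchicalEmulatorSketch(eqx.Module):
    lifting: eqx.nn.Conv1d
    down: eqx.nn.Conv1d
    coarse: eqx.nn.Conv1d
    up: eqx.nn.ConvTranspose1d
    projection: eqx.nn.Conv1d

    def __init__(self, in_channels, hidden_channels, out_channels, *, key):
        k1, k2, k3, k4, k5 = jax.random.split(key, 5)
        self.lifting = eqx.nn.Conv1d(in_channels, hidden_channels, 1, key=k1)
        # Strided convolution halves the spatial resolution.
        self.down = eqx.nn.Conv1d(hidden_channels, hidden_channels, 3, stride=2, padding=1, key=k2)
        self.coarse = eqx.nn.Conv1d(hidden_channels, hidden_channels, 3, padding=1, key=k3)
        # Transposed convolution restores the fine resolution (even lengths assumed).
        self.up = eqx.nn.ConvTranspose1d(hidden_channels, hidden_channels, 2, stride=2, key=k4)
        self.projection = eqx.nn.Conv1d(hidden_channels, out_channels, 1, key=k5)

    def __call__(self, u):
        fine = jax.nn.relu(self.lifting(u))
        coarse = jax.nn.relu(self.coarse(self.down(fine)))
        # Skip connection: merge the upsampled coarse features with the fine ones.
        merged = fine + self.up(coarse)
        return self.projection(merged)
```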
## Beyond Architectural Constructors
For completeness, `pdequinox.arch` also provides a `ConvNet`, which is a simple feed-forward convolutional network, as well as an `MLP`, which is a dense network that additionally requires pre-defining the number of resolution points.
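A hedged usage sketch of both: `ConvNet`, `MLP`, and `pdequinox.arch` are taken from the text above, `boundary_mode` from the boundary-condition section, and the Equinox-style `key` from the underlying framework, but the remaining argument names (`num_spatial_dims`, `in_channels`, `out_channels`, `hidden_channels`, `depth`, `num_points`) are assumptions about the constructor signatures and may differ from the actual API.

```python
import jax
import pdequinox as pdeqx

# Hedged sketch: the keyword names num_spatial_dims, in_channels,
# out_channels, hidden_channels, depth, and num_points are assumptions and
# may not match the actual constructor signatures; consult the API docs.
key = jax.random.PRNGKey(0)

conv_net = pdeqx.arch.ConvNet(
    num_spatial_dims=1,
    in_channels=1,
    out_channels=1,
    hidden_channels=32,
    depth=4,
    boundary_mode="periodic",
    key=key,
)

# The MLP is dense, so the number of resolution points must be fixed up
# front (assumed keyword name: num_points).
mlp = pdeqx.arch.MLP(
    num_spatial_dims=1,
    in_channels=1,
    out_channels=1,
    num_points=64,
    boundary_mode="periodic",
    key=key,
)
```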
## Related
Similar packages that provide a collection of emulator architectures are PDEBench and PDEArena. With a focus on Physics-informed Neural Networks and Neural Operators, there are also DeepXDE and NVIDIA Modulus.