Stephan Rave

Sven Ullmann (Uni Stuttgart/Uni Münster): Model Order Reduction using Group Convolutional Autoencoders

Wednesday, 10.06.2026, 14:15, in room M5

Mathematik und Informatik

Solving physics-based models governed by differential equations is relevant across engineering and the natural sciences, often requiring large computational resources. Model order reduction (MOR) offers a principled approach to derive low-dimensional, computationally efficient reduced-order models, and is particularly valuable when a full-order model must be solved repeatedly for varying parameter values or when fast evaluations are required. Classical projection-based MOR techniques rely on linear trial spaces, typically constructed via methods such as proper orthogonal decomposition. While effective for diffusion-dominated problems, linear subspace approaches suffer from a fundamental expressivity limitation for transport-dominated problems, where the slow decay of the Kolmogorov $n$-width renders linear approximations computationally infeasible. To overcome this limitation, recent research has shifted toward nonlinear MOR, where the reduced approximation is sought on a nonlinear trial manifold. A prominent realization of this idea is the use of convolutional autoencoders to learn nonlinear reduced manifolds directly from snapshot data.
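The classical linear construction mentioned above can be illustrated in a few lines. The following is a minimal sketch of proper orthogonal decomposition with synthetic snapshot data (the snapshot family, dimensions, and basis size are illustrative assumptions, not the speaker's setup):

```python
import numpy as np

# Synthetic snapshot matrix: each column is a full-order solution
# for one parameter value (smooth fields, so the n-width decays fast).
x = np.linspace(0.0, 1.0, 200)
params = np.linspace(1.0, 2.0, 50)
S = np.stack([np.sin(np.pi * mu * x) + 0.1 * np.sin(3 * np.pi * mu * x)
              for mu in params], axis=1)          # shape (200, 50)

# POD basis: leading left singular vectors of the snapshot matrix.
U, sigma, _ = np.linalg.svd(S, full_matrices=False)
n = 5
V = U[:, :n]                                      # linear trial space of dim n

# Project a snapshot onto the reduced space and measure the error.
s = S[:, 25]
s_reduced = V @ (V.T @ s)
rel_err = np.linalg.norm(s - s_reduced) / np.linalg.norm(s)
```

For diffusion-like data such as this, `rel_err` is tiny even for small `n`; for transport-dominated snapshots (e.g. traveling waves), the singular values `sigma` decay slowly and no small linear basis suffices, which is precisely the motivation for the nonlinear approach below.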

In such autoencoders, convolutional layers are by construction equivariant with respect to translations: convolving a shifted input yields the same result as shifting the convolved output. Interpreting translations as the action of the group $G = (\mathbb{R}^2, +)$, a convolutional layer $\Phi$ satisfies $\Phi(\mathscr{L}_y f) = \mathscr{L}_y \Phi(f)$ for all $y \in G$. Group convolutional neural networks extend this principle to broader symmetry groups, such as rotations or reflections, yielding layers that are equivariant with respect to these additional transformations. In this work, we propose a group convolutional autoencoder framework for nonlinear model order reduction. By constructing networks whose layers are equivariant with respect to a chosen group action, the learned reduced manifold generalizes to transformed, e.g. rotated, variants of the system without retraining or extensive data augmentation. We demonstrate this approach on a 2D linear wave equation parametrized by the wave speed, a problem known to exhibit a slowly decaying Kolmogorov $n$-width. Since the wave equation admits a Hamiltonian formulation, we further incorporate structure-preserving model order reduction by enforcing weak symplecticity of the decoder via a symplectic loss term during training, inducing a symplectic structure on the trial manifold and yielding reduced-order models that conserve energy.
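The equivariance identity $\Phi(\mathscr{L}_y f) = \mathscr{L}_y \Phi(f)$ can be verified numerically. Below is a small, self-contained check for the translation case in 1D, using a circular convolution as the layer $\Phi$ and a cyclic shift as the group action $\mathscr{L}_y$ (the signal, kernel, and shift are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.standard_normal(64)   # input signal
k = rng.standard_normal(64)   # convolution kernel

def conv(f, k):
    # Circular convolution via FFT: a discrete analogue of the layer Phi.
    return np.fft.ifft(np.fft.fft(f) * np.fft.fft(k)).real

y = 7                             # group element: shift by 7 samples
lhs = conv(np.roll(f, y), k)      # Phi(L_y f): convolve the shifted input
rhs = np.roll(conv(f, k), y)      # L_y Phi(f): shift the convolved output
assert np.allclose(lhs, rhs)      # equivariance holds to machine precision
```

Group convolutional layers generalize this commutation property from cyclic shifts to actions of larger groups such as rotations and reflections, which is what the proposed autoencoder exploits.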



Created on 19.02.2026 by Stephan Rave
Last modified on 05.03.2026 by Stephan Rave

Oberseminar Numerik