Data driven approximation of parametrized PDEs by Reduced Basis and Neural Networks (Simone Deparis, EPFL)

01.11.2022 14:00

We are interested in the approximation of partial differential equations with a data-driven approach based on the reduced basis method and machine learning. We suppose that the phenomenon of interest can be modeled by a parametrized partial differential equation, but that the values of the physical parameters are unknown or difficult to measure directly. Our method allows us to estimate fields of interest, for instance the temperature of a material sample or the velocity of a fluid, given data at only a handful of points in the domain. We propose to accomplish this task with a neural network that embeds a reduced basis solver as an exotic activation function in its last layer. The reduced basis solver accounts for the underlying physical phenomenon and is constructed from snapshots obtained for randomly selected values of the physical parameters during an expensive offline phase. The same full-order solutions are then employed for the training of the neural network. Indeed, the chosen architecture resembles an asymmetric autoencoder in which the decoder is the reduced basis solver and, as such, contains no trainable parameters. The resulting latent space of our autoencoder consists of the parameter-dependent quantities feeding the reduced basis solver, which, depending on the considered partial differential equation, are either the values of the physical parameters themselves or the affine decomposition coefficients of the differential operators.
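
As a rough illustration of the architecture just described, the following sketch (in PyTorch) shows a trainable encoder mapping a few pointwise measurements to the latent, parameter-dependent quantities, and a frozen reduced basis solve acting as the decoder. The class names, layer sizes and the assemble_reduced_system hook are hypothetical placeholders, not the implementation of [1].

import torch
import torch.nn as nn


class RBSolverLayer(nn.Module):
    """Non-trainable decoder: given latent quantities mu (physical parameters
    or affine decomposition coefficients), assemble and solve the reduced
    system and lift the result with the RB basis V (n_h x n_rb)."""

    def __init__(self, V, assemble_reduced_system):
        super().__init__()
        # The RB basis is stored as a fixed buffer, not a trainable parameter.
        self.register_buffer("V", V)
        self.assemble_reduced_system = assemble_reduced_system  # user-supplied hook

    def forward(self, mu):
        outputs = []
        for m in mu:  # one reduced solve per sample in the batch
            A_rb, f_rb = self.assemble_reduced_system(m)  # (n_rb x n_rb), (n_rb,)
            u_rb = torch.linalg.solve(A_rb, f_rb)
            outputs.append(self.V @ u_rb)  # lift back to the full-order space
        return torch.stack(outputs)


class RBAutoencoder(nn.Module):
    """Asymmetric autoencoder: trainable encoder, RB solver as decoder."""

    def __init__(self, n_sensors, n_latent, V, assemble_reduced_system):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_sensors, 64), nn.Tanh(),
            nn.Linear(64, 64), nn.Tanh(),
            nn.Linear(64, n_latent),
        )
        self.decoder = RBSolverLayer(V, assemble_reduced_system)

    def forward(self, measurements):
        mu = self.encoder(measurements)  # parameter-dependent latent quantities
        return self.decoder(mu)          # approximated full field

In such a setup only the encoder weights would be updated during training, for instance against the full-order snapshots mentioned above; the reduced basis solve in the decoder stays fixed.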

This method was presented in [1] in the context of steady parametrized PDEs. Its extension to time-dependent problems is, however, troublesome, since the numerical approximation of the solution via the reduced basis method may weigh down the training procedure. To circumvent this temporal complexity bottleneck, we propose to employ space-time reduced basis methods, as introduced in [2, 3]. In particular, we focus on unsteady flows modeled by the (Navier-)Stokes equations. In this framework, stability is a well-known issue and stems from the saddle-point structure of the problem at hand. To attain it, we either introduce suitable time supremizers that enrich the velocity basis along the temporal dimension, or we adopt an ad hoc reduced Petrov-Galerkin projection. Finally, we show the performance of the method on three-dimensional problems relevant to vascular flow simulations.
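
To make the space-time least-squares Petrov-Galerkin idea of [2, 3] a bit more concrete, the following sketch (in NumPy) projects a toy backward-Euler space-time system onto a reduced trial basis by minimizing the full-order residual; using the test space A_st Phi is what distinguishes this Petrov-Galerkin projection from a plain Galerkin one. The operator, forcing and random basis are illustrative assumptions and do not correspond to the (Navier-)Stokes problems of the talk.

import numpy as np

n_h, n_t, n_rb = 50, 40, 8          # spatial dofs, time steps, reduced dimension
dt = 0.01
rng = np.random.default_rng(0)

B = -np.eye(n_h) + 0.1 * rng.standard_normal((n_h, n_h))   # toy spatial operator
g = rng.standard_normal(n_h)                               # toy forcing
u0 = rng.standard_normal(n_h)                              # initial condition

# Block-bidiagonal space-time operator of backward Euler:
# (I - dt B) u^k - u^{k-1} = dt g,  k = 1..n_t
I = np.eye(n_h)
A_st = np.zeros((n_h * n_t, n_h * n_t))
f_st = np.zeros(n_h * n_t)
for k in range(n_t):
    A_st[k*n_h:(k+1)*n_h, k*n_h:(k+1)*n_h] = I - dt * B
    if k > 0:
        A_st[k*n_h:(k+1)*n_h, (k-1)*n_h:k*n_h] = -I
    f_st[k*n_h:(k+1)*n_h] = dt * g + (u0 if k == 0 else 0.0)

# Space-time trial basis (in practice built from snapshots; random here).
Phi, _ = np.linalg.qr(rng.standard_normal((n_h * n_t, n_rb)))

# Least-squares Petrov-Galerkin: minimize || A_st Phi y - f_st ||_2.
y, *_ = np.linalg.lstsq(A_st @ Phi, f_st, rcond=None)
u_st_reduced = Phi @ y   # reduced space-time approximation of the whole trajectory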

[1] N. Dal Santo, S. Deparis, and L. Pegolotti. Data driven approximation of parametrized PDEs by Reduced Basis and Neural Networks. Journal of Computational Physics, 416, 2020. https://doi.org/10.1016/j.jcp.2020.109550
[2] Y. Choi and K. Carlberg. Space-time least-squares Petrov-Galerkin projection for nonlinear model reduction. SIAM Journal on Scientific Computing, 41(1):A26–A58, 2019.
[3] Y. Choi, P. Brown, W. Arrighi, R. Anderson, and K. Huynh. Space-time reduced order model for large-scale linear dynamical systems with application to Boltzmann transport problems. Journal of Computational Physics, 424, 2021.

Location

Building: Conseil Général 7-9

Room 1-05, Numerical Analysis Seminar

Organised by

Section de mathématiques

Speakers

Simone Deparis, EPFL

Free admission

Classification

Category: Seminar

Keywords: numerical analysis