Private Homepage: https://www.uni-muenster.de/AMM/Jentzen/Mitarbeiter/included.shtml
Research Interests
Mathematics for machine learning
Numerical approximations for high-dimensional partial differential equations
Numerical approximations for stochastic differential equations
Deep Learning
Selected Publications
Hutzenthaler M, Jentzen A, Kloeden PE Strong and weak divergence in finite time of Euler's method for stochastic differential equations with non-globally Lipschitz continuous coefficients. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences Vol. 467 (2130), 2011, pp 1563-1576 online
Hairer M, Hutzenthaler M, Jentzen A Loss of regularity for Kolmogorov equations. Annals of Probability Vol. 43 (2), 2015, pp 468-527 online
Hutzenthaler M, Jentzen A Numerical approximations of stochastic differential equations with non-globally Lipschitz continuous coefficients. Memoirs of the American Mathematical Society Vol. 236 (1112), 2015, pp v+99 online
E W, Han J, Jentzen A Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations. Communications in Mathematics and Statistics Vol. 5 (4), 2017, pp 349-380 online
Han J, Jentzen A, E W Solving high-dimensional partial differential equations using deep learning. Proceedings of the National Academy of Sciences of the United States of America Vol. 115 (34), 2018, pp 8505-8510 online
Hutzenthaler M, Jentzen A, Kloeden PE Strong convergence of an explicit numerical method for SDEs with nonglobally Lipschitz continuous coefficients. Ann. Appl. Probab. Vol. 22 (4), 2012, pp 1611-1641 online
Fehrman B, Gess B, Jentzen A Convergence rates for the stochastic gradient descent method for non-convex objective functions. Journal of Machine Learning Research Vol. 21, 2020, pp Paper No. 136, 48 online
Hutzenthaler M, Jentzen A, Kruse T, Nguyen TA, von Wurstemberger P Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences Vol. 476 (2244), 2020, pp 630-654 online
Hutzenthaler M, Jentzen A On a perturbation theory and on strong convergence rates for stochastic ordinary and partial differential equations with nonglobally monotone coefficients. Annals of Probability Vol. 48 (1), 2020, pp 53-93 online
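The publications above on Euler's method and on explicit schemes for SDEs with non-globally Lipschitz continuous coefficients describe an effect that is easy to reproduce numerically. A minimal sketch, assuming the test SDE dX_t = -X_t^3 dt + dW_t with illustrative choices of step size, horizon, and initial value (none taken from the papers):

```python
import numpy as np

# Test SDE: dX_t = -X_t**3 dt + dW_t (superlinearly growing drift).
rng = np.random.default_rng(0)
h, n_steps = 1e-2, 1000
x_euler = np.float64(20.0)  # large initial value provokes Euler divergence
x_tamed = np.float64(20.0)

with np.errstate(over="ignore", invalid="ignore"):
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(h))
        # Explicit Euler: the drift increment -x**3 * h is unbounded and
        # overshoots once |x| is large, so the iterates oscillate off to infinity.
        x_euler = x_euler - x_euler**3 * h + dW
        # Tamed Euler: the drift increment is capped at magnitude 1 per step,
        # which keeps the iterates bounded.
        drift = -x_tamed**3
        x_tamed = x_tamed + drift * h / (1.0 + h * abs(drift)) + dW

euler_diverged = not np.isfinite(x_euler)   # iterates blew up to inf/nan
tamed_bounded = abs(x_tamed) < 5.0          # tamed iterates stay moderate
```

With these parameters the explicit Euler iterates overflow within a handful of steps, while the tamed iterates decay toward the stationary region of the SDE.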
Selected Projects
Mathematical Theory for Deep Learning
It is the key goal of this project to provide a rigorous mathematical analysis for deep learning algorithms and thereby to establish mathematical theorems which explain the success and the limitations of deep learning algorithms. In particular, this project aims (i) to provide a mathematical theory for the high-dimensional approximation capacities of deep neural networks, (ii) to reveal suitable regular sequences of functions which can be approximated by deep neural networks, but not by shallow neural networks, without the curse of dimensionality, and (iii) to establish dimension-independent convergence rates for stochastic gradient descent optimization algorithms when employed to train deep neural networks, with error constants which grow at most polynomially in the dimension. online
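As a point of reference for what convergence of the stochastic gradient descent (SGD) method in item (iii) means, here is a minimal sketch of plain SGD on a least-squares objective; the linear model, sample size, learning rate, and step count are illustrative choices, not taken from the project:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 1000
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true + 0.01 * rng.normal(size=n)   # noisy linear targets

# SGD on F(w) = (1/(2n)) * sum_i (x_i . w - y_i)**2:
# each step uses the gradient of a single uniformly drawn summand.
w = np.zeros(d)
lr = 0.05
for _ in range(5000):
    i = rng.integers(n)
    grad = (X[i] @ w - y[i]) * X[i]   # gradient of (1/2)(x_i . w - y_i)**2
    w -= lr * grad

err = float(np.linalg.norm(w - w_true))   # small: iterates approach w_true
```

For this convex toy objective the iterates contract toward the minimizer; the project's difficulty lies in establishing analogous statements for the non-convex objectives arising in deep network training.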
Existence, uniqueness, and regularity properties of solutions of partial differential equations
The goal of this project is to reveal existence, uniqueness, and regularity properties of solutions of partial differential equations (PDEs). In particular, we intend to study existence, uniqueness, and regularity properties of viscosity solutions of degenerate semilinear Kolmogorov PDEs of the parabolic type. We plan to investigate such PDEs by means of probabilistic representations of the Feynman-Kac type. We also intend to study the connections of such PDEs to optimal control problems. online
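The Feynman-Kac representations mentioned above can be stated concretely for the heat equation: the solution of u_t = (1/2) Δu with u(0, ·) = g satisfies u(t, x) = E[g(x + W_t)], where W is a standard d-dimensional Brownian motion. A minimal Monte Carlo sketch of this representation, using the illustrative test function g(x) = |x|², for which u(t, x) = |x|² + d·t exactly:

```python
import numpy as np

def heat_mc(g, x, t, n_samples=100_000, rng=None):
    """Monte Carlo Feynman-Kac estimator of u(t, x) = E[g(x + W_t)], the
    solution of the heat equation u_t = (1/2) * Laplacian(u), u(0, .) = g."""
    rng = rng or np.random.default_rng(0)
    w = rng.normal(0.0, np.sqrt(t), size=(n_samples, x.shape[0]))  # samples of W_t
    return float(g(x + w).mean())

# Test function with a known closed-form solution:
# g(x) = |x|**2  =>  u(t, x) = |x|**2 + d * t.
d, t = 10, 1.0
x = np.ones(d)
estimate = heat_mc(lambda z: (z ** 2).sum(axis=1), x, t)
exact = float(x @ x) + d * t   # = 20.0
```

The estimator samples the terminal value of the Brownian motion directly; for semilinear Kolmogorov PDEs the representation is more involved, which is part of what this project studies.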
Regularity properties and approximations for stochastic ordinary and partial differential equations with non-globally Lipschitz continuous nonlinearities
A number of stochastic ordinary and partial differential equations from the literature (such as, for example, the Heston and the 3/2-model from financial engineering, (overdamped) Langevin-type equations from molecular dynamics, stochastic spatially extended FitzHugh-Nagumo systems from neurobiology, stochastic Navier-Stokes equations, and Cahn-Hilliard-Cook equations) contain non-globally Lipschitz continuous nonlinearities in their drift or diffusion coefficients. A central aim of this project is to investigate, in a systematic way, regularity properties of such stochastic differential equations with respect to their initial values. A further goal of this project is to analyze the regularity of solutions of the deterministic Kolmogorov partial differential equations associated to such stochastic differential equations. Another aim of this project is to analyze weak and strong convergence and convergence rates of numerical approximations for such stochastic differential equations. online
Overcoming the curse of dimensionality: stochastic algorithms for high-dimensional partial differential equations
Partial differential equations (PDEs) are among the most universal tools used in modeling problems in nature and in man-made complex systems. The PDEs appearing in applications are often high-dimensional. Such PDEs can typically not be solved explicitly, and developing efficient numerical algorithms for high-dimensional PDEs is one of the most challenging tasks in applied mathematics. As is well known, the difficulty lies in the so-called "curse of dimensionality": the computational effort of standard approximation algorithms grows exponentially in the dimension of the considered PDE. It is the key objective of this research project to overcome this curse of dimensionality and to construct and analyze new approximation algorithms which solve high-dimensional PDEs with a computational effort that grows at most polynomially in both the dimension of the PDE and the reciprocal of the prescribed approximation precision. online
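The cost gap described above can be made concrete with a back-of-the-envelope count: a full tensor-product grid with N points per coordinate direction requires N^d function evaluations, while a Monte Carlo estimator with n samples requires on the order of n·d operations. A sketch of the count, where N = 10 and n = 10^6 are illustrative values:

```python
# Computational cost count: tensor-product grid vs. Monte Carlo estimator.
N = 10        # grid points per coordinate direction (illustrative)
n = 10 ** 6   # Monte Carlo samples (illustrative)

def grid_cost(d):
    return N ** d   # exponential in the dimension d: the curse of dimensionality

def mc_cost(d):
    return n * d    # linear in the dimension d

d = 100
grid = grid_cost(d)   # 10**100 evaluations: utterly infeasible
mc = mc_cost(d)       # 10**8 operations: routine on a laptop
```

Already at d = 100 the grid cost exceeds the number of atoms in the observable universe, while the sampling cost remains trivial; this is the sense in which stochastic algorithms can escape the curse.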
Project membership
Mathematics Münster
C: Models and Approximations
C1: Evolution and asymptotics
C4: Geometry-based modelling, approximation, and reduction
Current Publications
Jentzen A, Riekert A Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation. Journal of Mathematical Analysis and Applications Vol. 517 (2), 2023 online
E W, Han J, Jentzen A Algorithms for Solving High Dimensional PDEs: From Nonlinear Monte Carlo to Machine Learning. SIAM Review Vol. 35 (1), 2022, pp 278-310 online
Jentzen A, Riekert A A proof of convergence for the gradient descent optimization method with random initializations in the training of neural networks with ReLU activation for piecewise linear target functions. Journal of Machine Learning Research Vol. 23 (260), 2022 online
Jentzen A, Riekert A A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions. Zeitschrift für Angewandte Mathematik und Physik Vol. 73, 2022 online
Cheridito P, Jentzen A, Riekert A, Rossmannek F A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. Journal of Complexity Vol. 72, 2022 online
Ibragimov S, Jentzen A, Kröger T, Riekert A On the existence of infinitely many realization functions of non-global local minima in the training of artificial neural networks with ReLU activation. 2022 online
Eberle S, Jentzen A, Riekert A, Weiss G Normalized gradient flow optimization in the training of ReLU artificial neural networks. 2022 online
Gonon L, Grohs P, Jentzen A, Kofler D, Šiška D Uniform error estimates for artificial neural network approximations for heat equations. IMA Journal of Numerical Analysis Vol. 0, 2021 online
Hutzenthaler M, Jentzen A, Kruse T Overcoming the curse of dimensionality in the numerical approximation of parabolic partial differential equations with gradient-dependent nonlinearities. Foundations of Computational Mathematics Vol. 0, 2021 online
E-Mail: ajentzen@uni-muenster.de
Phone: +49 251 83-33793
Fax: +49 251 83-32729
Room: 120.005
Secretary: Claudia Giesbert
Phone: +49 251 83-33792
Fax: +49 251 83-32729
Room: 120.002
Address: Prof. Dr. Arnulf Jentzen
Angewandte Mathematik Münster: Institut für Analysis und Numerik
Fachbereich Mathematik und Informatik der Universität Münster
Orléans-Ring 10
48149 Münster