Arnulf Jentzen
University of Münster
The Chinese University of Hong Kong, Shenzhen

Address at the University of Münster:
Prof. Dr. Arnulf Jentzen
Institute for Analysis and Numerics
Applied Mathematics Münster
Faculty of Mathematics and Computer Science
University of Münster
Einsteinstraße 62
48149 Münster
Germany

Office: Room 120.005
Phone (Secretariat): +49 251 83-33792
Fax: +49 251 83-32729
Office hours: by appointment

E-mail: ajentzen (at) uni-muenster.de
Homepage at the University of Münster: https://www.uni-muenster.de/AMM/en/Jentzen/Mitarbeiter/Jentzen.shtml
Personal homepage: http://www.ajentzen.de
Born: November 1983 (age 37)

Links: [Profile on Google Scholar] [Profile on ResearchGate] [Profile on MathSciNet]
Last update of this homepage: April 4th, 2021

Short Curriculum Vitae

2004–2007: Diploma studies in Mathematics,
Faculty of Computer Science and Mathematics, Goethe University Frankfurt
2007–2009: PhD studies in Mathematics,
Faculty of Computer Science and Mathematics, Goethe University Frankfurt
2009–2010: Assistant Professor (Akademischer Rat a.Z.),
Faculty of Mathematics, Bielefeld University
2011–2012: Research Fellowship (German Research Foundation),
Program in Applied and Computational Mathematics, Princeton University
2012–2019: Assistant Professor for Applied Mathematics,
Department of Mathematics, ETH Zurich
2019–present: Full Professor for Applied Mathematics,
Faculty of Mathematics and Computer Science, University of Münster
2021–present: Full Professor for Data Science,
School of Data Science, The Chinese University of Hong Kong, Shenzhen

Research group

Current members of the research group

  • Christian Beck (PhD student at D-MATH, ETH Zurich, joint supervision with Prof. Dr. Norbert Hungerbühler)
  • Robin Gräber (PhD student at the Faculty of Mathematics and Computer Science, University of Münster)
  • Prof. Dr. Arnulf Jentzen (Head of the research group)
  • Shokhrukh Ibragimov (PhD student at the Faculty of Mathematics and Computer Science, University of Münster)
  • Timo Kröger (PhD student at the Faculty of Mathematics and Computer Science, University of Münster)
  • Dr. Benno Kuckuck (Postdoc at the Faculty of Mathematics and Computer Science, University of Münster)
  • Adrian Riekert (PhD student at the Faculty of Mathematics and Computer Science, University of Münster)
  • Florian Rossmannek (PhD student at D-MATH, ETH Zurich, joint supervision with Prof. Dr. Patrick Cheridito)
  • Philippe von Wurstemberger (PhD student at D-MATH, ETH Zurich, joint supervision with Prof. Dr. Patrick Cheridito)
  • Philipp Zimmermann (PhD student at D-MATH, ETH Zurich, joint supervision with Prof. Dr. Patrick Cheridito)

Former members of the research group

  • Dr. Sebastian Becker (former PhD student, joint supervision with Prof. Dr. Peter E. Kloeden, 2010-2017, now Postdoc at ETH Zurich)
  • Prof. Dr. Sonja Cox (former Postdoc/Fellow, 2012-2014, now Associate Professor at the University of Amsterdam)
  • Dr. Fabian Hornung (former Postdoc/Fellow, 2018, now at SAP)
  • Prof. Dr. Raphael Kruse (former Postdoc, 2012-2014, now Associate Professor at the Martin Luther University Halle-Wittenberg)
  • Dr. Ryan Kurniawan (former PhD student, 2014-2018, now Associate at Market Risk Analytics at Morgan Stanley UK Ltd.)
  • Prof. Dr. Ariel Neufeld (former Postdoc/Fellow, joint mentoring with Prof. Dr. Patrick Cheridito, 2018, now Assistant Professor at NTU Singapore)
  • Dr. Primoz Pusnik (former PhD Student, 2014-2020, now Quantitative Developer at Vontobel)
  • Dr. Diyora Salimova (former PhD student, 2015-2019, now Postdoc at ETH Zurich)
  • Prof. Dr. Michaela Szölgyenyi (former Postdoc/Fellow, 2017-2018, now Full Professor at the University of Klagenfurt)
  • Dr. Timo Welti (former PhD Student, 2015-2020, now Data Analytics Consultant at D ONE Solutions AG)
  • Dr. Larisa Yaroslavtseva (former Postdoc, 2018, now Interim Professor at the University of Ulm)

Research areas

  • Machine learning (mathematics for deep learning, stochastic gradient descent methods, deep neural networks, empirical risk minimization)
  • Stochastic analysis (stochastic calculus, well-posedness and regularity analysis for stochastic ordinary and partial differential equations)
  • Numerical analysis (computational stochastics/stochastic numerics, computational finance)
  • Analysis of partial differential equations (well-posedness and regularity analysis for partial differential equations)

Current editorial board affiliations

Past editorial board affiliations

Preprints

  • Riekert, A., Jentzen, A., A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions. [arXiv] (2021), 29 pages.
  • Cheridito, P., Jentzen, A., Rossmannek, F., Landscape analysis for shallow ReLU neural networks: complete classification of critical points for affine target functions. [arXiv] (2021), 19 pages.
  • Grohs, P., Ibragimov, S., Jentzen, A., Koppensteiner, S., Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality. [arXiv] (2021), 53 pages.
  • Beck, C., Hutzenthaler, M., Jentzen, A., Magnani, E., Full history recursive multilevel Picard approximations for ordinary differential equations with expectations. [arXiv] (2021), 24 pages.
  • Jentzen, A., Kröger, T., Convergence rates for gradient descent in the training of overparameterized artificial neural networks with biases. [arXiv] (2021), 38 pages.
  • Cheridito, P., Jentzen, A., Riekert, A., Rossmannek, F., A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. [arXiv] (2021), 23 pages.
  • Beck, C., Hutzenthaler, M., Jentzen, A., Kuckuck, B., An overview on deep learning-based approximation methods for partial differential equations. [arXiv] (2020), 22 pages.
  • Jentzen, A., Riekert, A., Strong overall error analysis for the training of artificial neural networks via random initializations. [arXiv] (2020), 40 pages.
  • Beneventano, P., Cheridito, P., Jentzen, A., von Wurstemberger, P., High-dimensional approximation spaces of artificial neural networks and applications to partial differential equations. [arXiv] (2020), 32 pages.
  • Beck, C., Becker, S., Cheridito, P., Jentzen, A., Neufeld, A., Deep learning based numerical approximation algorithms for stochastic partial differential equations and high-dimensional nonlinear filtering problems. [arXiv] (2020), 58 pages.
  • Beck, C., Jentzen, A., Kruse, T., Nonlinear Monte Carlo methods with polynomial runtime for high-dimensional iterated nested expectations. [arXiv] (2020), 47 pages.
  • Hutzenthaler, M., Jentzen, A., Kruse, T., Nguyen, T., Multilevel Picard approximations for high-dimensional semilinear second-order PDEs with Lipschitz nonlinearities. [arXiv] (2020), 37 pages.
  • E, W., Han, J., Jentzen, A., Algorithms for Solving High Dimensional PDEs: From Nonlinear Monte Carlo to Machine Learning. [arXiv] (2020), 40 pages.
  • Bercher, A., Gonon, L., Jentzen, A., Salimova, D., Weak error analysis for stochastic gradient descent optimization algorithms. [arXiv] (2020), 123 pages.
  • Hornung, F., Jentzen, A., Salimova, D., Space-time deep neural network approximations for high-dimensional partial differential equations. [arXiv] (2020), 52 pages.
  • Jentzen, A., Welti, T., Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation. [arXiv] (2020), 51 pages.
  • Beck, C., Gonon, L., Jentzen, A., Overcoming the curse of dimensionality in the numerical approximation of high-dimensional semilinear elliptic partial differential equations. [arXiv] (2020), 50 pages.
  • Jentzen, A., Kuckuck, B., Müller-Gronbach, T., Yaroslavtseva, L., Counterexamples to local Lipschitz and local Hölder continuity with respect to the initial values for additive noise driven SDEs with smooth drift coefficient functions with at most polynomially growing derivatives. [arXiv] (2020), 27 pages.
  • Giles, M. B., Jentzen, A., Welti, T., Generalised multilevel Picard approximations. [arXiv] (2019), 61 pages.
  • Hutzenthaler, M., Jentzen, A., Lindner, F., Pusnik, P., Strong convergence rates on the whole probability space for space-time discrete numerical approximation schemes for stochastic Burgers equations. [arXiv] (2019), 60 pages.
  • Beck, C., Jentzen, A., Kuckuck, B., Full error analysis for the training of deep neural networks. [arXiv] (2019), 43 pages.
  • Grohs, P., Hornung, F., Jentzen, A., Zimmermann, P., Space-time error estimates for deep neural network approximations for differential equations. [arXiv] (2019), 86 pages.
  • Beck, C., Becker, S., Cheridito, P., Jentzen, A., Neufeld, A., Deep splitting method for parabolic PDEs. [arXiv] (2019), 40 pages.
  • Jentzen, A., Kuckuck, B., Müller-Gronbach, T., Yaroslavtseva, L., On the strong regularity of degenerate additive noise driven stochastic differential equations with respect to their initial values. [arXiv] (2019), 59 pages.
  • Beccari, M., Hutzenthaler, M., Jentzen, A., Kurniawan, R., Lindner, F., Salimova, D., Strong and weak divergence of exponential and linear-implicit Euler approximations for stochastic partial differential equations with superlinearly growing nonlinearities. [arXiv] (2019), 65 pages.
  • Cox, S., Jentzen, A., Lindner, F., Weak convergence rates for temporal numerical approximations of stochastic wave equations with multiplicative noise. [arXiv] (2019), 51 pages.
  • Hudde, A., Hutzenthaler, M., Jentzen, A., Mazzonetto, S., On the Itô-Alekseev-Gröbner formula for stochastic differential equations. [arXiv] (2018), 29 pages.
  • Beck, C., Becker, S., Grohs, P., Jaafari, N., Jentzen, A., Solving stochastic differential equations and Kolmogorov equations by means of deep learning. [arXiv] (2018), 56 pages.
  • Becker, S., Gess, B., Jentzen, A., Kloeden, P. E., Strong convergence rates for explicit space-time discrete numerical approximations of stochastic Allen-Cahn equations. [arXiv] (2017), 104 pages.
  • Hefter, M., Jentzen, A., and Kurniawan, R., Weak convergence rates for numerical approximations of stochastic partial differential equations with nonlinear diffusion coefficients in UMD Banach spaces. [arXiv] (2016), 51 pages.
  • Hutzenthaler, M., Jentzen, A. and Noll, M., Strong convergence rates and temporal regularity for Cox-Ingersoll-Ross processes and Bessel processes with accessible boundaries. [arXiv] (2014), 32 pages.
  • Hefter, M., Jentzen, A., Kurniawan, R., Counterexamples to regularities for the derivative processes associated to stochastic evolution equations. [arXiv] (2017), 26 pages. Revision requested from Stoch. Partial Differ. Equ. Anal. Comput.

Publications and accepted research articles

  • Becker, S., Cheridito, P., Jentzen, A., Welti, T., Solving high-dimensional optimal stopping problems using deep learning. [arXiv] (2019), 42 pages. Accepted in European Journal of Applied Mathematics.
  • Jentzen, A., Lindner, F., Pusnik, P., Spatial Sobolev regularity for stochastic Burgers equations with additive trace class noise. [arXiv] (2019), 54 pages. Accepted in Nonlinear Analysis.
  • Beck, C., Gonon, L., Hutzenthaler, M., Jentzen, A., On existence and uniqueness properties for solutions of stochastic fixed point equations. [arXiv] (2019), 33 pages. Accepted in Discrete Contin. Dyn. Syst. Ser. B.
  • Grohs, P., Jentzen, A., Salimova, D., Deep neural network approximations for Monte Carlo algorithms. [arXiv] (2019), 45 pages. Accepted in SN Partial Differential Equations and Applications.
  • Beck, C., Hutzenthaler, M., Jentzen, A., On nonlinear Feynman-Kac formulas for viscosity solutions of semilinear parabolic partial differential equations. [arXiv] (2020), 54 pages. Accepted in Stochastics and Dynamics.
  • Becker, S., Braunwarth, R., Hutzenthaler, M., Jentzen, A., von Wurstemberger, P., Numerical simulations for full history recursive multilevel Picard approximations for systems of high-dimensional partial differential equations. [arXiv] (2020), 21 pages. Accepted in Communications in Computational Physics.
  • Beck, C., Hornung, F., Hutzenthaler, M., Jentzen, A., Kruse, T., Overcoming the curse of dimensionality in the numerical approximation of Allen-Cahn partial differential equations via truncated full-history recursive multilevel Picard approximations. [arXiv] (2019), 30 pages. Accepted in J. Numer. Math.
  • Berner, J., Elbraechter, D., Grohs, P., Jentzen, A., Towards a regularity theory for ReLU networks -- chain rule and global error estimates. Sampling Theory and Applications 2019. [arXiv] (2019), 5 pages.
  • Elbraechter, D., Grohs, P., Jentzen, A., Schwab, C., DNN Expression Rate Analysis of High-dimensional PDEs: Application to Option Pricing. [arXiv] (2018), 50 pages. Accepted in Constr. Approx.
  • Jentzen, A., Salimova, D., Welti, T., A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients. [arXiv] (2018), 48 pages. Accepted in Comm. Math. Sci.
  • Becker, S., Cheridito, P., Jentzen, A., Pricing and hedging American-style options with deep learning. [arXiv] (2019), 12 pages. Accepted in J. Risk Financial Manag.
  • Cheridito, P., Jentzen, A., Rossmannek, F., Efficient approximation of high-dimensional functions with deep neural networks. [arXiv] (2019), 19 pages. Accepted in IEEE Transactions on Neural Networks and Learning Systems.
  • Hutzenthaler, M., Jentzen, A., Kruse, T., Overcoming the curse of dimensionality in the numerical approximation of parabolic partial differential equations with gradient-dependent nonlinearities. [arXiv] (2019), 33 pages. Accepted in Found. Comp. Math.
  • Gonon, L., Grohs, P., Jentzen, A., Kofler, D., Siska, D., Uniform error estimates for artificial neural network approximations for heat equations. [arXiv] (2019), 70 pages. Accepted in IMA J. Num. Anal.
  • Andersson, A., Jentzen, A., and Kurniawan, R., Existence, uniqueness, and regularity for stochastic evolution equations with irregular initial values. [arXiv] (2015), 31 pages. Accepted in J. Math. Anal. Appl.
  • Berner, J., Grohs, P., Jentzen, A., Analysis of the generalization error: Empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations. [arXiv] (2018), 35 pages.
  • Hutzenthaler, M., Jentzen, A., Kruse, T., Nguyen, T. A., von Wurstemberger, P., Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations. [arXiv] (2018), 30 pages. Accepted in Proc. Roy. Soc. A.
  • E, W., Hutzenthaler, M., Jentzen, A., and Kruse, T., Multilevel Picard iterations for solving smooth semilinear parabolic heat equations. [arXiv] (2017), 18 pages. Accepted in SN Partial Differential Equations and Applications.
  • Cox, S., Hutzenthaler, M. and Jentzen, A., Local Lipschitz continuity in the initial value and strong completeness for nonlinear stochastic differential equations. [arXiv] (2013), 54 pages. Accepted in Memoirs of the American Mathematical Society.
  • Fehrman, B., Gess, B., Jentzen, A., Convergence rates for the stochastic gradient descent method for non-convex objective functions. [arXiv] (2019), 52 pages. Accepted in J. Mach. Learn. Res.
  • Hutzenthaler, M., Jentzen, A., von Wurstemberger, P., Overcoming the curse of dimensionality in the approximative pricing of financial derivatives with default risks. [arXiv] (2019), 71 pages. Accepted in Electronic Journal of Probability.
  • Jentzen, A., Lindner, F., Pusnik, P., Exponential moment bounds and strong convergence rates for tamed-truncated numerical approximations of stochastic convolutions. [arXiv] (2018), 25 pages. Accepted in Numerical Algorithms.
  • Jentzen, A., Kuckuck, B., Neufeld, A., von Wurstemberger, P., Strong error analysis for stochastic gradient descent optimization algorithms. [arXiv] (2018), 75 pages. Accepted in IMA J. Num. Anal.
  • Cox, S., Hutzenthaler, M., Jentzen, A., van Neerven, J., and Welti, T., Convergence in Hölder norms with applications to Monte Carlo methods in infinite dimensions. [arXiv] (2016), 38 pages. Accepted in IMA J. Num. Anal.
  • Jacobe de Naurois, L., Jentzen, A., and Welti, T., Weak convergence rates for spatial spectral Galerkin approximations of semilinear stochastic wave equations with multiplicative noise. [arXiv] (2015), 27 pages. Accepted in Appl. Math. Optim.
  • Grohs, P., Hornung, F., Jentzen, A., von Wurstemberger, P., A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations. [arXiv] (2018), 124 pages. Accepted in Mem. Amer. Math. Soc.
  • Cheridito, P., Jentzen, A., Rossmannek, F., Non-convergence of stochastic gradient descent in the training of deep neural networks. [arXiv] (2020), 12 pages. Accepted in J. Complexity.
  • Becker, S., Gess, B., Jentzen, A., Kloeden, P. E., Lower and upper bounds for strong approximation errors for numerical approximations of stochastic heat equations. BIT Numerical Mathematics ? (2020). [arXiv].
  • Hutzenthaler, M., Jentzen, A., Kruse, T., Nguyen, T. A., A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations. Springer Nat. Part. Diff. Equ. Appl. ? (2020). [arXiv].
  • Jentzen, A. and Kurniawan, R., Weak convergence rates for Euler-type approximations of semilinear stochastic evolution equations with nonlinear diffusion coefficients. Found. Comp. Math. ? (2020). [arXiv].
  • Hutzenthaler, M. and Jentzen, A., On a perturbation theory and on strong convergence rates for stochastic ordinary and partial differential equations with non-globally monotone coefficients. The Annals of Probability 48 (2020), 53-93. [arXiv].
  • Jentzen, A. and Pusnik, P., Strong convergence rates for an explicit numerical approximation method for stochastic evolution equations with non-globally Lipschitz continuous nonlinearities. IMA J. Numer. Anal. 40 (2020), 1005-1050. [arXiv].
  • Jentzen, A., von Wurstemberger, P., Lower error bounds for the stochastic gradient descent optimization algorithm: Sharp convergence rates for slowly and fast decaying learning rates. J. Complexity 57 (2020), 101438. [arXiv].
  • Da Prato, G., Jentzen, A. and Röckner, M., A mild Itô formula for SPDEs. Trans. Amer. Math. Soc. 372 (2019). [arXiv].
  • Beck, C., E, W., Jentzen, A., Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations. J. Nonlinear Sci. 29 (2019), 1563-1619. [arXiv].
  • Jentzen, A., Lindner, F., Pusnik, P., On the Alekseev-Gröbner formula in Banach spaces. Discrete Contin. Dyn. Syst. Ser. B 24 (2019), 4475-4511. [arXiv].
  • Becker, S., Cheridito, P., Jentzen, A., Deep optimal stopping. J. Mach. Learn. Res. 20 (2019), 1-25. [arXiv].
  • E, W., Hutzenthaler, M., Jentzen, A., Kruse, T., On multilevel Picard numerical approximations for high-dimensional nonlinear parabolic partial differential equations and high-dimensional nonlinear backward stochastic differential equations. J. Sci. Comput. 79 (2019), 1534-1571. [arXiv].
  • Andersson, A., Hefter, M., Jentzen, A., and Kurniawan, R., Regularity properties for solutions of infinite dimensional Kolmogorov equations in Hilbert spaces. Potential Analysis 50 (2019), 347-379. [arXiv].
  • Conus, D., Jentzen, A. and Kurniawan, R., Weak convergence rates of spectral Galerkin approximations for SPDEs with nonlinear diffusion coefficients. Ann. Appl. Probab. 29 (2019), 653-716. [arXiv].
  • Becker, S. and Jentzen, A., Strong convergence rates for nonlinearity-truncated Euler-type approximations of stochastic Ginzburg-Landau equations. Stochastic Process. Appl. 129 (2018), 28-69. [arXiv].
  • Hefter, M., Jentzen, A., On arbitrarily slow convergence rates for strong numerical approximations of Cox-Ingersoll-Ross processes and squared Bessel processes. Finance Stoch. 23 (2019), 139-172. [arXiv].
  • Jentzen, A., Salimova, D., Welti, T., Strong convergence for explicit space-time discrete numerical approximation methods for stochastic Burgers equations. J. Math. Anal. Appl. 469 (2019), 661-704. [arXiv].
  • Hutzenthaler, M., Jentzen, A., Salimova, D., Strong convergence of full-discrete nonlinearity-truncated accelerated exponential Euler-type approximations for stochastic Kuramoto-Sivashinsky equations. Comm. Math. Sci. 16 (2018), 1489-1529. [arXiv].
  • Cox, S., Jentzen, A., Kurniawan, R., and Pusnik, P., On the mild Itô formula in Banach spaces. Discrete Contin. Dyn. Syst. Ser. B. 23 (2018), 2217-2243. [arXiv].
  • Jentzen, A. and Pusnik, P., Exponential moments for numerical approximations of stochastic partial differential equations. Stoch. Partial Differ. Equ. Anal. Comput. 6 (2018), 565-617. [arXiv].
  • Han, J., Jentzen, A., E, W., Solving high-dimensional partial differential equations using deep learning. Proc. Natl. Acad. Sci. 115 (2018), 8505-8510. [arXiv].
  • Jacobe de Naurois, L., Jentzen, A., and Welti, T., Lower bounds for weak approximation errors for spatial spectral Galerkin approximations of stochastic wave equations. Stochastic partial differential equations and related fields, 237-248, Springer Proc. Math. Stat., 229, Springer, Cham, 2018. [arXiv].
  • Hutzenthaler, M., Jentzen, A. and Wang, X., Exponential integrability properties of numerical approximation processes for nonlinear stochastic differential equations. Math. Comp. 87 (2018), 1353-1413. [arXiv].
  • Gerencsér, M., Jentzen, A., and Salimova, D., On stochastic differential equations with arbitrarily slow convergence rates for strong approximation in two space dimensions. Proc. Roy. Soc. London A 473 (2017). [arXiv].
  • E, W., Han, J., Jentzen, A., Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations. Commun. Math. Stat. 5 (2017), 349-380. [arXiv].
  • Andersson, A., Jentzen, A., Kurniawan, R., and Welti, T., On the differentiability of solutions of stochastic evolution equations with respect to their initial values. Nonlinear Analysis 162 (2017), 128-161. [arXiv].
  • Jentzen, A., Müller-Gronbach, T., and Yaroslavtseva, L., On stochastic differential equations with arbitrary slow convergence rates for strong approximation. Commun. Math. Sci. 14 (2016), no. 6, 1477-1500. [arXiv].
  • Becker, S., Jentzen, A. and Kloeden, P. E., An exponential Wagner-Platen type scheme for SPDEs. SIAM J. Numer. Anal. 54 (2016), no. 4, 2389-2426. [arXiv].
  • E, W., Jentzen, A. and Shen, H., Renormalized powers of Ornstein-Uhlenbeck processes and well-posedness of stochastic Ginzburg-Landau equations. Nonlinear Anal. 142 (2016), 152-193. [arXiv].
  • Hutzenthaler, M. and Jentzen, A., Numerical approximations of stochastic differential equations with non-globally Lipschitz continuous coefficients. Mem. Amer. Math. Soc. 236 (2015), no. 1112, 99 pages. [arXiv].
  • Jentzen, A. and Röckner, M., A Milstein scheme for SPDEs. Found. Comput. Math. 15 (2015), no. 2, 313-362. [arXiv].
  • Hairer, M., Hutzenthaler, M. and Jentzen, A., Loss of regularity for Kolmogorov equations. Ann. Probab. 43 (2015), no. 2, 468-527. [arXiv].
  • Hutzenthaler, M., Jentzen, A. and Kloeden, P. E., Divergence of the multilevel Monte Carlo Euler method for nonlinear stochastic differential equations. Ann. Appl. Probab. 23 (2013), no. 5, 1913-1966. [arXiv].
  • Blömker, D. and Jentzen, A., Galerkin approximations for the stochastic Burgers equation. SIAM J. Numer. Anal. 51 (2013), no. 1, 694-715. [arXiv].
  • Hutzenthaler, M., Jentzen, A. and Kloeden, P. E., Strong convergence of an explicit numerical method for SDEs with non-globally Lipschitz continuous coefficients. Ann. Appl. Probab. 22 (2012), no. 4, 1611-1641. [arXiv].
  • Jentzen, A. and Röckner, M., Regularity analysis for stochastic partial differential equations with nonlinear multiplicative trace class noise. J. Differential Equations 252 (2012), no. 1, 114-136. [arXiv].
  • Hutzenthaler, M. and Jentzen, A., Convergence of the stochastic Euler scheme for locally Lipschitz coefficients. Found. Comput. Math. 11 (2011), no. 6, 657-706. [arXiv].
  • Jentzen, A. and Kloeden, P. E., Taylor Approximations for Stochastic Partial Differential Equations. CBMS-NSF Regional Conference Series in Applied Mathematics 83, Society for Industrial and Applied Mathematics (SIAM), Philadelphia, PA, 2011. xiv+211 pp.
  • Jentzen, A., Kloeden, P. E. and Winkel, G., Efficient simulation of nonlinear parabolic SPDEs with additive noise. Ann. Appl. Probab. 21 (2011), no. 3, 908-950. [arXiv].
  • Hutzenthaler, M., Jentzen, A. and Kloeden, P. E., Strong and weak divergence in finite time of Euler's method for stochastic differential equations with non-globally Lipschitz continuous coefficients. Proc. R. Soc. A 467 (2011), no. 2130, 1563-1576. [arXiv].
  • Jentzen, A., Higher order pathwise numerical approximations of SPDEs with additive noise. SIAM J. Numer. Anal. 49 (2011), no. 2, 642-667.
  • Jentzen, A., Taylor expansions of solutions of stochastic partial differential equations. Discrete Contin. Dyn. Syst. Ser. B 14 (2010), no. 2, 515-557. [arXiv].
  • Jentzen, A. and Kloeden, P. E., Taylor expansions of solutions of stochastic partial differential equations with additive noise. Ann. Probab. 38 (2010), no. 2, 532-569. [arXiv].
  • Jentzen, A., Leber, F., Schneisgen, D., Berger, A. and Siegmund, S., An improved maximum allowable transfer interval for L^p-stability of networked control systems. IEEE Trans. Automat. Control 55 (2010), no. 1, 179-184.
  • Jentzen, A. and Kloeden, P. E., A unified existence and uniqueness theorem for stochastic evolution equations. Bull. Aust. Math. Soc. 81 (2010), no. 1, 33-46.
  • Jentzen, A. and Kloeden, P. E., The numerical approximation of stochastic partial differential equations. Milan J. Math. 77 (2009), no. 1, 205-244.
  • Jentzen, A., Kloeden, P. E. and Neuenkirch, A., Pathwise convergence of numerical schemes for random and stochastic differential equations. Foundations of Computational Mathematics, Hong Kong 2008, 140-161, London Mathematical Society Lecture Note Series, 363, Cambridge University Press, Cambridge, 2009.
  • Jentzen, A., Pathwise numerical approximations of SPDEs with additive noise under non-global Lipschitz coefficients. Potential Anal. 31 (2009), no. 4, 375-404.
  • Jentzen, A. and Kloeden, P. E., Pathwise Taylor schemes for random ordinary differential equations. BIT 49 (2009), no. 1, 113-140.
  • Jentzen, A., Kloeden, P. E. and Neuenkirch, A., Pathwise approximation of stochastic differential equations on domains: higher order convergence rates without global Lipschitz coefficients. Numer. Math. 112 (2009), no. 1, 41-64.
  • Jentzen, A. and Neuenkirch, A., A random Euler scheme for Carathéodory differential equations. J. Comput. Appl. Math. 224 (2009), no. 1, 346-359.
  • Jentzen, A. and Kloeden, P. E., Overcoming the order barrier in the numerical approximation of stochastic partial differential equations with additive space-time noise. Proc. R. Soc. A 465 (2009), no. 2102, 649-667.
  • Kloeden, P. E. and Jentzen, A., Pathwise convergent higher order numerical schemes for random ordinary differential equations. Proc. R. Soc. A 463 (2007), no. 2087, 2929-2944.

Theses

  • Jentzen, A., Taylor Expansions for Stochastic Partial Differential Equations. PhD thesis (2009), Goethe University Frankfurt, Germany.
  • Jentzen, A., Numerische Verfahren hoher Ordnung für zufällige Differentialgleichungen (High-order numerical methods for random differential equations). Diploma thesis (2007), Goethe University Frankfurt, Germany.