Projects

Research area C: Models and Approximations
Unit C1: Evolution and asymptotics
Unit C4: Geometry-based modelling, approximation, and reduction

Further Projects
Dynamical systems and irregular gradient flows
Mathematical Theory for Deep Learning
Existence, uniqueness, and regularity properties of solutions of partial differential equations
Regularity properties and approximations for stochastic ordinary and partial differential equations with non-globally Lipschitz continuous nonlinearities
Overcoming the curse of dimensionality: stochastic algorithms for high-dimensional partial differential equations

Research Interests

$\bullet$ Mathematics for machine learning
$\bullet$ Numerical approximations for high-dimensional partial differential equations
$\bullet$ Numerical approximations for stochastic differential equations
$\bullet$ Deep learning

Selected Publications

Hutzenthaler M, Jentzen A, Kloeden PE. Strong and weak divergence in finite time of Euler's method for stochastic differential equations with non-globally Lipschitz continuous coefficients. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, Vol. 467 (2130), 2011, pp. 1563–1576
Hairer M, Hutzenthaler M, Jentzen A. Loss of regularity for Kolmogorov equations. Annals of Probability, Vol. 43 (2), 2015, pp. 468–527
Hutzenthaler M, Jentzen A. Numerical approximations of stochastic differential equations with non-globally Lipschitz continuous coefficients. Memoirs of the American Mathematical Society, Vol. 236 (1112), 2015, pp. v+99
E W, Han J, Jentzen A. Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations. Communications in Mathematics and Statistics, Vol. 5 (4), 2017, pp. 349–380
Han J, Jentzen A, E W. Solving high-dimensional partial differential equations using deep learning. Proceedings of the National Academy of Sciences of the United States of America, Vol. 115 (34), 2018, pp. 8505–8510
Hutzenthaler M, Jentzen A, Kloeden PE. Strong convergence of an explicit numerical method for SDEs with nonglobally Lipschitz continuous coefficients. Annals of Applied Probability, Vol. 22 (4), 2012, pp. 1611–1641
Fehrman B, Gess B, Jentzen A. Convergence rates for the stochastic gradient descent method for non-convex objective functions. Journal of Machine Learning Research, Vol. 21, 2020, Paper No. 136, 48 pp.
Hutzenthaler M, Jentzen A, Kruse T, Nguyen TA, von Wurstemberger P. Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations. Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, Vol. 476 (2244), 2020, Paper No. 20190630
Hutzenthaler M, Jentzen A. On a perturbation theory and on strong convergence rates for stochastic ordinary and partial differential equations with nonglobally monotone coefficients. Annals of Probability, Vol. 48 (1), 2020, pp. 53–93

Current Publications

$\bullet$ Steffen Dereich, Arnulf Jentzen, and Sebastian Kassing. On the existence of minimizers in shallow residual ReLU neural network optimization landscapes. arXiv e-prints, February 2023. arXiv:2302.14690.

$\bullet$ Arnulf Jentzen, Adrian Riekert, and Philippe von Wurstemberger. Algorithmically designed artificial neural networks (ADANNs): Higher order deep operator learning for parametric partial differential equations. arXiv e-prints, February 2023. arXiv:2302.03286.

$\bullet$ Arnulf Jentzen and Adrian Riekert. Convergence analysis for gradient flows in the training of artificial neural networks with ReLU activation. J. Math. Anal. Appl., 517(2):126601, January 2023. doi:10.1016/j.jmaa.2022.126601.

$\bullet$ Philipp Grohs, Fabian Hornung, Arnulf Jentzen, and Philipp Zimmermann. Space-time error estimates for deep neural network approximations for differential equations. Advances in Computational Mathematics, 49(1):4, January 2023. doi:10.1007/s10444-022-09970-2.

$\bullet$ Lukas Gonon, Robin Graeber, and Arnulf Jentzen. The necessity of depth for artificial neural networks to approximate certain classes of smooth and bounded functions without the curse of dimensionality. arXiv e-prints, January 2023. arXiv:2301.08284.

$\bullet$ Shokhrukh Ibragimov, Arnulf Jentzen, and Adrian Riekert. Convergence to good non-optimal critical points in the training of neural networks: Gradient descent optimization with one random initialization overcomes all bad non-global local minima with high probability. arXiv e-prints, December 2022. arXiv:2212.13111.

$\bullet$ Davide Gallon, Arnulf Jentzen, and Felix Lindner. Blow up phenomena for gradient descent optimization methods in the training of artificial neural networks. arXiv e-prints, November 2022. arXiv:2211.15641.

$\bullet$ Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Ariel Neufeld. An efficient Monte Carlo scheme for Zakai equations. arXiv e-prints, October 2022. arXiv:2210.13530.

$\bullet$ Arnulf Jentzen and Adrian Riekert. A proof of convergence for stochastic gradient descent in the training of artificial neural networks with ReLU activation for constant target functions. Z. Angew. Math. Phys., 73(5):188, August 2022. doi:10.1007/s00033-022-01716-w.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. Overcoming the curse of dimensionality in the numerical approximation of parabolic partial differential equations with gradient-dependent nonlinearities. Found. Comput. Math., 22(4):905–966, August 2022. doi:10.1007/s10208-021-09514-y.

$\bullet$ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Gradient descent provably escapes saddle points in the training of shallow ReLU networks. arXiv e-prints, August 2022. arXiv:2208.02083.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, and Tuan Anh Nguyen. Overcoming the curse of dimensionality in the numerical approximation of backward stochastic differential equations. Journal of Numerical Mathematics, July 2022. doi:10.1515/jnma-2021-0111.

$\bullet$ Lukas Gonon, Philipp Grohs, Arnulf Jentzen, David Kofler, and David Šiška. Uniform error estimates for artificial neural network approximations for heat equations. IMA J. Numer. Anal., 42(3):1991–2054, July 2022. doi:10.1093/imanum/drab027.

$\bullet$ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Efficient approximation of high-dimensional functions with neural networks. IEEE Trans. Neural Netw. Learn. Syst., 33(7):3079–3093, July 2022. doi:10.1109/TNNLS.2021.3049719.

$\bullet$ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Landscape analysis for shallow neural networks: Complete classification of critical points for affine target functions. J. Nonlinear Sci., 32(5):64, July 2022. doi:10.1007/s00332-022-09823-8.

$\bullet$ Arnulf Jentzen, Benno Kuckuck, Thomas Müller-Gronbach, and Larisa Yaroslavtseva. Counterexamples to local Lipschitz and local Hölder continuity with respect to the initial values for additive noise driven stochastic differential equations with smooth drift coefficient functions with at most polynomially growing derivatives. Discrete Contin. Dyn. Syst. Ser. B, 27(7):3707, July 2022. doi:10.3934/dcdsb.2021203.

$\bullet$ Simon Eberle, Arnulf Jentzen, Adrian Riekert, and Georg Weiss. Normalized gradient flow optimization in the training of ReLU artificial neural networks. arXiv e-prints, July 2022. arXiv:2207.06246.

$\bullet$ Arnulf Jentzen and Timo Kröger. On bounds for norms of reparameterized ReLU artificial neural network parameters: sums of fractional powers of the Lipschitz norm control the network parameter vector. arXiv e-prints, June 2022. arXiv:2206.13646.

$\bullet$ Christian Beck, Arnulf Jentzen, and Benno Kuckuck. Full error analysis for the training of deep neural networks. Infin. Dimens. Anal. Quantum Probab. Relat. Top., 25(02):2150020, June 2022. doi:10.1142/S021902572150020X.

$\bullet$ Philipp Grohs, Arnulf Jentzen, and Diyora Salimova. Deep neural network approximations for solutions of PDEs based on Monte Carlo algorithms. Partial Differ. Equ. Appl., 3(4):Paper No. 45, 41, June 2022. doi:10.1007/s42985-021-00100-z.

$\bullet$ Victor Boussange, Sebastian Becker, Arnulf Jentzen, Benno Kuckuck, and Loïc Pellissier. Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. arXiv e-prints, May 2022. arXiv:2205.03672.

$\bullet$ Sebastian Becker, Benjamin Gess, Arnulf Jentzen, and Peter E. Kloeden. Strong convergence rates for explicit space-time discrete numerical approximations of stochastic Allen-Cahn equations. Stochastics and Partial Differential Equations: Analysis and Computations, April 2022. doi:10.1007/s40072-021-00226-6.

$\bullet$ Sebastian Becker, Arnulf Jentzen, Marvin S. Müller, and Philippe von Wurstemberger. Learning the random variables in Monte Carlo simulations with stochastic gradient descent: Machine learning for parametric PDEs and financial derivative pricing. arXiv e-prints, February 2022. arXiv:2202.02717.

$\bullet$ Shokhrukh Ibragimov, Arnulf Jentzen, Timo Kröger, and Adrian Riekert. On the existence of infinitely many realization functions of non-global local minima in the training of artificial neural networks with ReLU activation. arXiv e-prints, February 2022. arXiv:2202.11481.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, Katharina Pohl, Adrian Riekert, and Luca Scarpa. Convergence proof for stochastic gradient descent in the training of deep neural networks with ReLU activation for constant target functions. arXiv e-prints, December 2021. arXiv:2112.07369.

$\bullet$ Weinan E, Jiequn Han, and Arnulf Jentzen. Algorithms for solving high dimensional PDEs: From nonlinear Monte Carlo to machine learning. Nonlinearity, 35(1):278–310, December 2021. doi:10.1088/1361-6544/ac337f.

$\bullet$ Arnulf Jentzen and Adrian Riekert. On the existence of global minima and convergence analyses for gradient descent methods in the training of deep neural networks. arXiv e-prints, December 2021. arXiv:2112.09684.

$\bullet$ Pierfrancesco Beneventano, Patrick Cheridito, Robin Graeber, Arnulf Jentzen, and Benno Kuckuck. Deep neural network approximation theory for high-dimensional functions. arXiv e-prints, December 2021. arXiv:2112.14523.

$\bullet$ Weinan E, Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. Multilevel Picard iterations for solving smooth semilinear parabolic heat equations. Partial Differ. Equ. Appl., 2:Paper No. 80, November 2021. doi:10.1007/s42985-021-00089-5.

$\bullet$ Ladislas Naurois, Arnulf Jentzen, and Timo Welti. Weak convergence rates for spatial spectral Galerkin approximations of semilinear stochastic wave equations with multiplicative noise. Appl. Math. Optim., 84:1187–1217, November 2021. doi:10.1007/s00245-020-09744-6.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, Benno Kuckuck, and Joshua Lee Padgett. Strong $L^p$-error analysis of nonlinear Monte Carlo approximations for high-dimensional semilinear partial differential equations. arXiv e-prints, October 2021. arXiv:2110.08297.

$\bullet$ Arnulf Jentzen, Benno Kuckuck, Thomas Müller-Gronbach, and Larisa Yaroslavtseva. On the strong regularity of degenerate additive noise driven stochastic differential equations with respect to their initial values. J. Math. Anal. Appl., 502(2):125240, October 2021. doi:10.1016/j.jmaa.2021.125240.

$\bullet$ Arnulf Jentzen, Felix Lindner, and Primož Pušnik. Spatial Sobolev regularity for stochastic Burgers equations with additive trace class noise. Nonlinear Anal., 210:112310, September 2021. doi:10.1016/j.na.2021.112310.

$\bullet$ Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Ariel Neufeld. Deep splitting method for parabolic PDEs. SIAM J. Sci. Comput., 43(5):A3135–A3154, September 2021. doi:10.1137/19M1297919.

$\bullet$ Christian Beck, Lukas Gonon, Martin Hutzenthaler, and Arnulf Jentzen. On existence and uniqueness properties for solutions of stochastic fixed point equations. Discrete Contin. Dyn. Syst. – B, 26(9):4927–4962, September 2021. doi:10.3934/dcdsb.2020320.

$\bullet$ Arnulf Jentzen and Adrian Riekert. A proof of convergence for the gradient descent optimization method with random initializations in the training of neural networks with ReLU activation for piecewise linear target functions. arXiv e-prints, August 2021. arXiv:2108.04620.

$\bullet$ Simon Eberle, Arnulf Jentzen, Adrian Riekert, and Georg S. Weiss. Existence, uniqueness, and convergence rates for gradient flows in the training of artificial neural networks with ReLU activation. arXiv e-prints, August 2021. arXiv:2108.08106.

$\bullet$ Christian Beck, Sebastian Becker, Philipp Grohs, Nor Jaafari, and Arnulf Jentzen. Solving the Kolmogorov PDE by means of deep learning. J. Sci. Comput., 88(3):Paper No. 73, 28, July 2021. doi:10.1007/s10915-021-01590-0.

$\bullet$ Arnulf Jentzen, Diyora Salimova, and Timo Welti. A proof that deep artificial neural networks overcome the curse of dimensionality in the numerical approximation of Kolmogorov partial differential equations with constant diffusion and nonlinear drift coefficients. Commun. Math. Sci., 19(5):1167–1205, July 2021. doi:10.4310/CMS.2021.v19.n5.a1.

$\bullet$ Christian Beck, Martin Hutzenthaler, and Arnulf Jentzen. On nonlinear Feynman-Kac formulas for viscosity solutions of semilinear parabolic partial differential equations. Stoch. Dyn., 21(8):2150048, July 2021. doi:10.1142/S0219493721500489.

$\bullet$ Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Timo Welti. Solving high-dimensional optimal stopping problems using deep learning. European J. Appl. Math., 32:470–514, June 2021. doi:10.1017/S0956792521000073.

$\bullet$ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Non-convergence of stochastic gradient descent in the training of deep neural networks. J. Complex., 64:101540, June 2021. doi:10.1016/j.jco.2020.101540.

$\bullet$ Dennis Elbrächter, Philipp Grohs, Arnulf Jentzen, and Christoph Schwab. DNN expression rate analysis of high-dimensional PDEs: Application to option pricing. Constr. Approx., May 2021. doi:10.1007/s00365-021-09541-6.

$\bullet$ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Landscape analysis for shallow ReLU neural networks: complete classification of critical points for affine target functions. arXiv e-prints, March 2021. arXiv:2103.10922.

$\bullet$ Philipp Grohs, Shokhrukh Ibragimov, Arnulf Jentzen, and Sarah Koppensteiner. Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality. arXiv e-prints, March 2021. arXiv:2103.04488.

$\bullet$ Christian Beck, Martin Hutzenthaler, Arnulf Jentzen, and Emilia Magnani. Full history recursive multilevel Picard approximations for ordinary differential equations with expectations. arXiv e-prints, March 2021. arXiv:2103.02350.

$\bullet$ Sonja Cox, Martin Hutzenthaler, and Arnulf Jentzen. Local Lipschitz continuity in the initial value and strong completeness for nonlinear stochastic differential equations. arXiv e-prints, February 2021. arXiv:1309.5595.

$\bullet$ Arnulf Jentzen and Timo Kröger. Convergence rates for gradient descent in the training of overparameterized artificial neural networks with biases. arXiv e-prints, February 2021. arXiv:2102.11840.

$\bullet$ Patrick Cheridito, Arnulf Jentzen, Adrian Riekert, and Florian Rossmannek. A proof of convergence for gradient descent in the training of artificial neural networks for constant target functions. J. Complex., January 2021. doi:10.1016/j.jco.2022.101646.

$\bullet$ Sonja Cox, Martin Hutzenthaler, Arnulf Jentzen, Jan van Neerven, and Timo Welti. Convergence in Hölder norms with applications to Monte Carlo methods in infinite dimensions. IMA Journal of Numerical Analysis, 41(1):493–548, January 2021. doi:10.1093/imanum/drz063.

$\bullet$ Adam Andersson, Arnulf Jentzen, and Ryan Kurniawan. Existence, uniqueness, and regularity for stochastic evolution equations with irregular initial values. Journal of Mathematical Analysis and Applications, 495(1):124558, 33, January 2021. doi:10.1016/j.jmaa.2020.124558.

$\bullet$ Arnulf Jentzen, Benno Kuckuck, Ariel Neufeld, and Philippe von Wurstemberger. Strong error analysis for stochastic gradient descent optimization algorithms. IMA J. Numer. Anal., 41(1):455–492, January 2021. doi:10.1093/imanum/drz055.

$\bullet$ Pierfrancesco Beneventano, Patrick Cheridito, Arnulf Jentzen, and Philippe von Wurstemberger. High-dimensional approximation spaces of artificial neural networks and applications to partial differential equations. arXiv e-prints, December 2020. arXiv:2012.04326.

$\bullet$ Christian Beck, Fabian Hornung, Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. Overcoming the curse of dimensionality in the numerical approximation of Allen-Cahn partial differential equations via truncated full-history recursive multilevel Picard approximations. Journal of Numerical Mathematics, 28(4):197–222, December 2020. doi:10.1515/jnma-2019-0074.

$\bullet$ Arnulf Jentzen and Adrian Riekert. Strong overall error analysis for the training of artificial neural networks via random initializations. arXiv e-prints, December 2020. arXiv:2012.08443.

$\bullet$ Arnulf Jentzen, Felix Lindner, and Primož Pušnik. Exponential moment bounds and strong convergence rates for tamed-truncated numerical approximations of stochastic convolutions. Numer. Algorithms, 85(4):1447–1473, December 2020. doi:10.1007/s11075-019-00871-y.

$\bullet$ Christian Beck, Martin Hutzenthaler, Arnulf Jentzen, and Benno Kuckuck. An overview on deep learning-based approximation methods for partial differential equations. arXiv e-prints, December 2020. arXiv:2012.12348.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, Tuan Anh Nguyen, and Philippe von Wurstemberger. Overcoming the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations. Proc. A., 476(2244):20190630, 25, December 2020. doi:10.1098/rspa.2019.0630.

$\bullet$ Christian Beck, Sebastian Becker, Patrick Cheridito, Arnulf Jentzen, and Ariel Neufeld. Deep learning based numerical approximation algorithms for stochastic partial differential equations and high-dimensional nonlinear filtering problems. arXiv e-prints, December 2020. arXiv:2012.01194.

$\bullet$ Sebastian Becker, Ramon Braunwarth, Martin Hutzenthaler, Arnulf Jentzen, and Philippe von Wurstemberger. Numerical simulations for full history recursive multilevel Picard approximations for systems of high-dimensional partial differential equations. Communications in Computational Physics, 28(5):2109–2138, November 2020. doi:10.4208/cicp.OA-2020-0130.

$\bullet$ Christian Beck, Arnulf Jentzen, and Thomas Kruse. Nonlinear Monte Carlo methods with polynomial runtime for high-dimensional iterated nested expectations. arXiv e-prints, September 2020. arXiv:2009.13989.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, and Tuan Anh Nguyen. Multilevel Picard approximations for high-dimensional semilinear second-order PDEs with Lipschitz nonlinearities. arXiv e-prints, September 2020. arXiv:2009.02484.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, and Philippe von Wurstemberger. Overcoming the curse of dimensionality in the approximative pricing of financial derivatives with default risks. Electron. J. Probab., 25:Paper No. 101, 73, August 2020. doi:10.1214/20-ejp423.

$\bullet$ Weinan E, Jiequn Han, and Arnulf Jentzen. Algorithms for solving high dimensional PDEs: from nonlinear Monte Carlo to machine learning. arXiv e-prints, August 2020. arXiv:2008.13333.

$\bullet$ Aritz Bercher, Lukas Gonon, Arnulf Jentzen, and Diyora Salimova. Weak error analysis for stochastic gradient descent optimization algorithms. arXiv e-prints, July 2020. arXiv:2007.02723.

$\bullet$ Sebastian Becker, Patrick Cheridito, and Arnulf Jentzen. Pricing and hedging American-style options with deep learning. Journal of Risk and Financial Management, July 2020. doi:10.3390/jrfm13070158.

$\bullet$ Julius Berner, Philipp Grohs, and Arnulf Jentzen. Analysis of the generalization error: empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations. SIAM Journal on Mathematics of Data Science, 2(3):631–657, July 2020. doi:10.1137/19M125649X.

$\bullet$ Sebastian Becker, Benjamin Gess, Arnulf Jentzen, and Peter E. Kloeden. Lower and upper bounds for strong approximation errors for numerical approximations of stochastic heat equations. BIT, 60(4):1057–1073, June 2020. doi:10.1007/s10543-020-00807-2.

$\bullet$ Fabian Hornung, Arnulf Jentzen, and Diyora Salimova. Space-time deep neural network approximations for high-dimensional partial differential equations. arXiv e-prints, June 2020. arXiv:2006.02199.

$\bullet$ Benjamin Fehrman, Benjamin Gess, and Arnulf Jentzen. Convergence rates for the stochastic gradient descent method for non-convex objective functions. Journal of Machine Learning Research, 21(136):1–48, June 2020.

$\bullet$ Arnulf Jentzen and Ryan Kurniawan. Weak convergence rates for Euler-type approximations of semilinear stochastic evolution equations with nonlinear diffusion coefficients. Foundations of Computational Mathematics, May 2020. doi:10.1007/s10208-020-09448-x.

$\bullet$ Arnulf Jentzen and Primož Pušnik. Strong convergence rates for an explicit numerical approximation method for stochastic evolution equations with non-globally Lipschitz continuous nonlinearities. IMA Journal of Numerical Analysis, 40(2):1005–1050, April 2020. doi:10.1093/imanum/drz009.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, Thomas Kruse, and Tuan Anh Nguyen. A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations. SN Partial Differential Equations and Applications, 1(2):10, April 2020. doi:10.1007/s42985-019-0006-9.

$\bullet$ Arnulf Jentzen and Timo Welti. Overall error analysis for the training of deep neural networks via stochastic gradient descent with random initialisation. arXiv e-prints, March 2020. arXiv:2003.01291.

$\bullet$ Martin Hutzenthaler and Arnulf Jentzen. On a perturbation theory and on strong convergence rates for stochastic ordinary and partial differential equations with nonglobally monotone coefficients. Ann. Probab., 48(1):53–93, March 2020. doi:10.1214/19-AOP1345.

$\bullet$ Christian Beck, Lukas Gonon, and Arnulf Jentzen. Overcoming the curse of dimensionality in the numerical approximation of high-dimensional semilinear elliptic partial differential equations. arXiv e-prints, March 2020. arXiv:2003.00596v1.

$\bullet$ Arnulf Jentzen, Benno Kuckuck, Thomas Müller-Gronbach, and Larisa Yaroslavtseva. Counterexamples to local Lipschitz and local Hölder continuity with respect to the initial values for additive noise driven SDEs with smooth drift coefficient functions with at most polynomially growing derivatives. arXiv e-prints, January 2020. arXiv:2001.03472.

$\bullet$ Patrick Cheridito, Arnulf Jentzen, and Florian Rossmannek. Efficient approximation of high-dimensional functions with deep neural networks. arXiv e-prints, December 2019. arXiv:1912.04310.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. Overcoming the curse of dimensionality in the numerical approximation of parabolic partial differential equations with gradient-dependent nonlinearities. arXiv e-prints, December 2019. arXiv:1912.02571.

$\bullet$ Michael B. Giles, Arnulf Jentzen, and Timo Welti. Generalised multilevel Picard approximations. arXiv e-prints, November 2019. arXiv:1911.03188.

$\bullet$ Martin Hutzenthaler, Arnulf Jentzen, Felix Lindner, and Primož Pušnik. Strong convergence rates on the whole probability space for space-time discrete numerical approximation schemes for stochastic Burgers equations. arXiv e-prints, November 2019. arXiv:1911.01870.

$\bullet$ Christian Beck, Weinan E, and Arnulf Jentzen. Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations. J. Nonlinear Sci., 29:1563–1619, August 2019. doi:10.1007/s00332-018-9525-3.

$\bullet$ Daniel Conus, Arnulf Jentzen, and Ryan Kurniawan. Weak convergence rates of spectral Galerkin approximations for SPDEs with nonlinear diffusion coefficients. Ann. Appl. Probab., 29(2):653–716, August 2019. doi:10.1214/17-AAP1352.

$\bullet$ Arnulf Jentzen, Uli Stadtmüller, and Robert Stelzer. Foreword: Special issue on “Stochastic differential equations, stochastic algorithms, and applications”. J. Math. Anal. Appl., 476(1):1, August 2019. doi:10.1016/j.jmaa.2019.03.069.

$\bullet$ Arnulf Jentzen, Felix Lindner, and Primož Pušnik. On the Alekseev-Gröbner formula in Banach spaces. Discrete Contin. Dyn. Syst. – B, 24:4475–4511, August 2019. doi:10.3934/dcdsb.2019128.

$\bullet$ Weinan E, Martin Hutzenthaler, Arnulf Jentzen, and Thomas Kruse. On multilevel Picard numerical approximations for high-dimensional nonlinear parabolic partial differential equations and high-dimensional nonlinear backward stochastic differential equations. Journal of Scientific Computing, 79(3):1534–1571, June 2019. doi:10.1007/s10915-018-00903-0.

$\bullet$ Giuseppe Da Prato, Arnulf Jentzen, and Michael Röckner. A mild Itô formula for SPDEs. Trans. Amer. Math. Soc., 372:3755–3807, June 2019. doi:10.1090/tran/7165.

$\bullet$ Adam Andersson, Mario Hefter, Arnulf Jentzen, and Ryan Kurniawan. Regularity properties for solutions of infinite dimensional Kolmogorov equations in Hilbert spaces. Potential Anal., 50:347–379, May 2019. doi:10.1007/s11118-018-9685-7.

$\bullet$ Sebastian Becker, Patrick Cheridito, and Arnulf Jentzen. Deep optimal stopping. J. Mach. Learn. Res., 20(74):1–25, April 2019. arXiv:1804.05394.

$\bullet$ Sebastian Becker and Arnulf Jentzen. Strong convergence rates for nonlinearity-truncated Euler-type approximations of stochastic Ginzburg-Landau equations. Stochastic Process. Appl., 129:28–69, January 2019. doi:10.1016/j.spa.2018.02.008.

$\bullet$ Mario Hefter and Arnulf Jentzen. On arbitrarily slow convergence rates for strong numerical approximations of Cox-Ingersoll-Ross processes and squared Bessel processes. Finance Stoch., 23:139–172, January 2019. doi:10.1007/s00780-018-0375-5.

$\bullet$ Arnulf Jentzen, Diyora Salimova, and Timo Welti. Strong convergence for explicit space-time discrete numerical approximation methods for stochastic Burgers equations. J. Math. Anal. Appl., 469:661–704, January 2019. doi:10.1016/j.jmaa.2018.09.032.