| Time, location: |
TBA
|
| Learnweb: |
Please sign up at https://sso.uni-muenster.de/LearnWeb/learnweb2/course/view.php?id=91273
|
| Content: |
Many problems arising in applications can be formulated as variational or optimization problems.
Often these also involve partial differential equations (PDEs) as constraints,
e.g. in the optimal control of biological, chemical, physical, or economic processes,
in the design of optimal devices for engineering applications,
or in inverse problems in medicine and biology.
A different type of optimization is needed for the training of artificial neural networks on large data sets.
In this seminar we treat optimization methods for both contexts.
Depending on interest and background, talks will be based on book chapters and research articles within the broad spectrum from neural networks to optimization under PDE constraints.
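As an illustration of the first class of problems (a minimal sketch in generic notation; the concrete functionals and constraints depend on the application), a prototypical PDE-constrained optimal control problem reads

\[
\min_{y,\,u}\; J(y,u) \;=\; \tfrac{1}{2}\,\|y - y_d\|^2 \;+\; \tfrac{\alpha}{2}\,\|u\|^2
\qquad \text{subject to} \qquad e(y,u) = 0,
\]

where y denotes the state, u the control, y_d a desired state, alpha > 0 a regularization parameter, and e(y,u) = 0 the governing PDE (e.g. a boundary value problem linking state and control).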
|
| Prerequisites: |
Analysis I-III; a specialization in a module in one of the fields numerics, analysis, or stochastics will be helpful.
|
| Assessment: |
A 90-minute seminar talk and a written report (a handout of ca. 7 pages, to be presented to and discussed with the lecturer ca. 10 days before the talk, to allow for improvements and further assistance).
|
| Organizational meeting: |
Wednesday, January 28, 2026, 13:30-14:15, Orleans-Ring 10, second floor, seminar room 120.030
|
| Participation: |
If you are interested, please attend the organizational meeting or contact us by e-mail.
|
| Topics: |
For optimization, talks may follow chapters of textbooks or a few selected articles on the optimization of artificial neural networks, such as the following (for orientation, the basic Adam update rule appearing in several of these references is sketched after the list):
- Hinze, Pinnau, Ulbrich, Ulbrich: Optimization with PDE Constraints
Table of contents available on the publisher's website; let us know if you would like to have a look inside.
- Nocedal, Wright: Numerical Optimization
- Boyd, Vandenberghe: Convex Optimization
- Conn, Gould, Toint: Trust-Region Methods
- Suh: Convex Optimization for Machine Learning
- Ruder: An overview of gradient descent optimization algorithms
- Kingma, Ba: Adam: A Method for Stochastic Optimization
- Bottou: Stochastic Learning
- Dereich, Jentzen, Riekert: Sharp higher order convergence rates for the Adam optimizer
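For orientation on the articles by Ruder and Kingma-Ba, the Adam update for parameters theta_t with stochastic gradients g_t reads, in the standard notation of Kingma and Ba,

\[
\begin{aligned}
m_t &= \beta_1\, m_{t-1} + (1-\beta_1)\, g_t, \qquad
v_t = \beta_2\, v_{t-1} + (1-\beta_2)\, g_t^2, \\
\hat m_t &= \frac{m_t}{1-\beta_1^t}, \qquad
\hat v_t = \frac{v_t}{1-\beta_2^t}, \qquad
\theta_t = \theta_{t-1} - \eta\, \frac{\hat m_t}{\sqrt{\hat v_t} + \varepsilon},
\end{aligned}
\]

where beta_1, beta_2 are exponential decay rates for the first and second moment estimates, eta is the step size, and epsilon a small constant for numerical stability (the original paper suggests beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-8 as defaults).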
|