AI@WWU — A Practical Introduction to AI Theory and Techniques for Interdisciplinary Research


  • The course is open to members of the CRC1450 and CRC1459 groups and to REACH incub.AI.tor participants.
  • For registration, please send an email to (in case of a CRC1450 or CRC1459 membership) indicating your name, institute and group as well as your career level (first come, first served, with fair distribution among the CRC groups).
  • The workshop will start on 25.08. Log-in details will be sent to registered users directly.


© Nina Knubel

Short Summary:

AI and, in particular, machine learning (ML) tools are becoming increasingly accessible thanks to easy-to-use programming environments (especially Python) and libraries (especially TensorFlow and PyTorch). Applying these powerful tools to a variety of research projects requires some basic understanding of data preparation, visualisation and the successful use of ML algorithms. In this course we will (1) teach AI and machine learning basics (70% of the course) and (2) apply these techniques to custom problems and custom data provided by the participants (30% of the course). In particular, we will introduce several state-of-the-art deep learning algorithms such as CNNs, LSTMs and autoencoders. The entire course will be interactive, and the participants will implement and use all presented techniques in pre-configured test environments. To get in touch, write an email to
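The data-preparation step mentioned above can be previewed with a minimal NumPy sketch (the array values here are invented for illustration; the course itself works with real datasets in pre-configured Jupyter environments):

```python
import numpy as np

# Toy dataset: 4 samples, 2 features on very different scales
# (values invented for illustration)
X = np.array([[1.0, 200.0],
              [2.0, 300.0],
              [3.0, 400.0],
              [4.0, 500.0]])

# Standardise each feature to zero mean and unit variance,
# a common preparation step before training an ML model
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

print(X_std.mean(axis=0))  # close to [0, 0]
print(X_std.std(axis=0))   # close to [1, 1]
```

Standardisation like this prevents features with large numeric ranges from dominating the training of many ML algorithms.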

  •  Goal


    •  Teach AI and machine learning basics (70%)
    •  Work on own data and project (30%)
  •  Target audience:

    Members of the CRC1450 (inSight), CRC1459 (intelligent matter) and REACH incub.AI.tor participants of all career levels (BSc and MSc students, doctoral students, postdocs, group leaders, etc.)

    • Participants should be interested in learning the basic theory and practical usage of AI (in particular deep learning) algorithms
    • All disciplines are welcome (i.e. the sciences and the humanities with an interest in quantitative methods)
    • Own datasets / problem sets are welcome; whether these can be used will be evaluated at the beginning of the course
  • Requirements

    •  Programming skills are not required; however, some basic experience is desirable (preferably in Python; an introduction to the programming language, including the basics, will be given in the course)
    • Basic knowledge in statistics and linear algebra
  • Syllabus

    1. Python basics
      • Programming language and environment (Jupyter Notebooks)
      • Importing, exporting and visualising data
      • Fundamental libraries (Numpy, Scikit-learn, Matplotlib)
    2. Machine Learning basics
      • Brief conceptual overview of different techniques (supervised, unsupervised etc.)
      • Introduction to classical machine learning techniques (SVM, Decision Trees, Clustering, …)
      • Perceptron and neural networks (loss, activation function, etc.)
      • Brief intro to Gradient Descent (Backpropagation)
      • Practical Example 1: Neural networks for regression (1D and 2D functions, under- and overfitting, etc.)
      • Practical Example 2: Neural networks for character recognition (MNIST)
    3. State-of-the-art (deep) Machine Learning concepts
      • Convolutional neural networks (CNNs)
        • Practical Example 3: CNNs for character recognition (MNIST)
        • Practical Example 4: CNNs for object classification (CIFAR-10)
      • Recurrent Neural Networks (especially LSTMs)
        • Practical Example 5: LSTMs for sequence learning
      • Autoencoders
        • Practical Example 6: Autoencoders for dimensionality reduction
    4. Advanced training concepts
      • Regularisation techniques (dropout, batch normalisation, data augmentation)
      • Transfer learning
    5. AI @ own dataset
      • Final Project: ML on own dataset
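The perceptron and gradient-descent topics in syllabus item 2 can be previewed with a minimal NumPy sketch: a single sigmoid neuron trained by gradient descent on the cross-entropy loss (the toy task, learning rate and iteration count are invented for illustration; the course itself uses TensorFlow 2):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification task: learn the logical OR function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)

# Single neuron: two weights, one bias, sigmoid activation
w = rng.normal(size=2)
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(2000):
    p = sigmoid(X @ w + b)           # forward pass
    grad_z = p - y                   # gradient of cross-entropy loss w.r.t. pre-activation
    w -= lr * X.T @ grad_z / len(X)  # gradient-descent update for the weights
    b -= lr * grad_z.mean()          # gradient-descent update for the bias

pred = (sigmoid(X @ w + b) > 0.5).astype(int)
print(pred)  # prints [0 1 1 1]
```

The same loop structure (forward pass, loss gradient, parameter update) underlies the backpropagation training of the deeper networks covered later in the course.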


  • Organisation


    • 10 sessions of 3 hours each (in English)
    • Presentations will be given via Zoom and the programming exercises will be made available online via JupyterHub. In addition, virtual conference rooms will be used to enable discussions and to provide assistance.
    • Preliminary timetable (each session starts at 4pm)
      1. 25.08.: Introduction and Python basics
      2. 01.09.: Machine Learning Theory on Neural Networks, Gradient Descent, ...
      3. 08.09.: Other Machine Learning Algorithms (SVM, Decision Trees, Random Forests, ...)
      4. 15.09.: CNNs, Dropout and Data-Augmentation
      5. 22.09.: Recap and an End-to-End Project (this session will be held in person)
      6. 29.09.: RNNs
      7. 06.10.: Autoencoders
      8. 20.10.: Advanced Concepts (Adversarial Attacks, Transfer Learning, ...)
      9. 27.10.: Final Remarks and own data
      10. 10.11.: Own data (this session will be held in person)
    • Each session consists of a 40-60 minute presentation and roughly 2 hours of practical coding
    • Programming will be done in Python with TensorFlow 2
    • The programming environment will be provided as pre-installed Jupyter Notebooks
      • Jupyter Notebooks allow interactive worksheets (only some code passages need to be filled in; the result is then plotted interactively)
    • Limited to 50 participants max.
    • The two in-person sessions (22.09. and 10.11.) will take place in the REACH (XLab Room 118). More information about the REACH (location etc.) can be found here.
    • For registration see "IMPORTANT NOTES"
    • Contact:


  •  Literature


    • Practical Literature: “Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow” (2nd edition) by Aurélien Géron (the primary resource for this course!)

    • Theoretical Literature (not required for this course):

      • “Deep Learning” (1st edition) by Ian Goodfellow, Yoshua Bengio and Aaron Courville
      • "Introduction to Linear Algebra" (5th edition) by Gilbert Strang
      • "Linear Algebra and Learning from Data" (1st edition) by Gilbert Strang
      • “Pattern Recognition and Machine Learning” (1st edition (corrected)) by Christopher Bishop
      • “Artificial Intelligence: A Modern Approach” (3rd edition) by Stuart Russell and Peter Norvig


© Uni MS