Seminar: Tensor Decompositions for Probabilistic Modeling & Inference

Overview

Many AI applications work with data in the form of tensors or with models that include tensor structures. Tensor decompositions are highly effective for handling large tensors efficiently, enabling both more compact representations and more efficient computations. Since discrete joint probability distributions can be naturally interpreted as tensors, this seminar investigates how tensor decompositions and related concepts can be used for (more efficient) probabilistic modeling and inference. The seminar focuses in particular on probabilistic graphical models.
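As an illustrative sketch (not part of the seminar materials), the following NumPy snippet shows how a discrete joint distribution over three binary variables is a tensor, and how a low-rank CP decomposition corresponds to a latent-variable (naive-Bayes-style) model; the variable names (w, A, B, C) are chosen for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hidden variable H with r states; observed variables X1, X2, X3 with 2 states each.
    r = 2
    w = rng.dirichlet(np.ones(r))            # P(H)
    A = rng.dirichlet(np.ones(2), size=r)    # A[h, x1] = P(X1 = x1 | H = h)
    B = rng.dirichlet(np.ones(2), size=r)    # P(X2 | H)
    C = rng.dirichlet(np.ones(2), size=r)    # P(X3 | H)

    # Full joint P(X1, X2, X3) as a 2x2x2 tensor: a rank-r CP decomposition.
    P = np.einsum('h,hi,hj,hk->ijk', w, A, B, C)
    assert np.isclose(P.sum(), 1.0)

    # Marginal P(X1) computed two ways: from the full tensor, and directly
    # from the factors without ever materialising the joint tensor.
    marg_full = P.sum(axis=(1, 2))
    marg_cp = w @ A
    print(np.allclose(marg_full, marg_cp))   # True

Working with the factors instead of the full tensor is what makes such decompositions attractive for inference: the factored form scales with the number of variables rather than exponentially in it.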

This seminar will be held in English.

For more information, please register in the corresponding Learnweb course.

Requirements

  1. Topic Selection
    • Depending on the number of participants, individually or in teams
    • Literature search
  2. Present two talks, each around 30 minutes (plus discussion)
    • First talk on basics
    • Second talk on a specific focus of the topic, based on one or two papers
  3. Compile an individual written report
    • Around 7 ± 1 pages in the IJCAI format (double column), excluding references
    • Description of the concepts from the talks
    • Including results of the literature search
  4. Attendance at all presentations and participation in the discussions

Topics

  • Duality between Probabilistic Graphical Models and Tensor Networks
  • Efficient / Optimised Tensor Network Contraction
  • CP Decomposition for Probabilistic Inference
  • Tensor Decomposition for Parameter Learning in Probabilistic Graphical Models