
Module description - Advanced Topics in Machine Learning
(Fortgeschrittene Themen in Machine Learning)

Number fml
ECTS 2.0
Level Advanced
Content Machine Learning and especially Deep Learning have gained enormous attention in recent years. Significant improvements have been achieved in application areas that were traditionally considered easy for humans but difficult for machines - especially in computer vision tasks (e.g. object recognition) and natural language processing tasks (e.g. language translation or understanding).

The main goal of this module is to delve into selected advanced and recent topics in Machine Learning and Deep Learning and to understand the underlying concepts and ideas.

The learning outcomes formulated below are to be understood as examples; they will be selected, adapted, or extended depending on current developments in the field.
Learning outcomes
Generative models
The students know the basic concepts underlying generative models - the different flavors, how they are built and trained, and what they can be used for. As examples of generative models, they study Generative Adversarial Networks (GAN), Variational Autoencoders (VAE), or Autoregressive Generative Models (PixelRNN, PixelCNN), know typical applications, can implement them in a suitable framework, and understand the difficulties that can arise during training.
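
As a brief, hedged illustration (not part of the official module text): a minimal Variational Autoencoder sketch in PyTorch, assuming a flattened 784-dimensional input such as MNIST; the layer sizes and names are illustrative choices, not prescribed by the module.

# Minimal VAE sketch (illustrative; layer sizes and dataset assumption are not from the module text).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, input_dim=784, hidden_dim=400, latent_dim=20):
        super().__init__()
        self.enc = nn.Linear(input_dim, hidden_dim)
        self.mu = nn.Linear(hidden_dim, latent_dim)      # mean of q(z|x)
        self.logvar = nn.Linear(hidden_dim, latent_dim)  # log-variance of q(z|x)
        self.dec1 = nn.Linear(latent_dim, hidden_dim)
        self.dec2 = nn.Linear(hidden_dim, input_dim)

    def forward(self, x):
        h = F.relu(self.enc(x))
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterization trick: z = mu + sigma * eps keeps sampling differentiable.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        x_hat = torch.sigmoid(self.dec2(F.relu(self.dec1(z))))
        return x_hat, mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    # Reconstruction term plus KL divergence between q(z|x) and the standard normal prior.
    recon = F.binary_cross_entropy(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

# Usage on a random batch of 16 "images":
model = VAE()
x = torch.rand(16, 784)
x_hat, mu, logvar = model(x)
loss = vae_loss(x, x_hat, mu, logvar)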

Object detection in images
Students know the components of an object detection pipeline (segmentation, localization, object recognition, ...) as well as the most important algorithms for implementing them (e.g. Faster R-CNN, YOLO, SSD). Furthermore, they can explain how these algorithms work and judge their performance characteristics. Finally, they can integrate these components into applications.
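
A hedged sketch of how such a component can be used in practice, assuming a recent torchvision version (older versions use pretrained=True instead of the weights argument); the score threshold is an illustrative choice.

# Illustrative sketch: object detection with a pretrained Faster R-CNN from torchvision.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

model = fasterrcnn_resnet50_fpn(weights="DEFAULT")  # weights pretrained on COCO
model.eval()

# A dummy 3-channel image tensor with values in [0, 1]; in practice this would be a real photo.
image = torch.rand(3, 480, 640)

with torch.no_grad():
    prediction = model([image])[0]  # dict with 'boxes', 'labels', 'scores'

# Keep only confident detections (threshold chosen for illustration).
keep = prediction["scores"] > 0.8
boxes, labels = prediction["boxes"][keep], prediction["labels"][keep]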

Ensemble Methods
In this learning outcome, students learn methods and algorithms for combining multiple models. Ideally, this makes it possible to exploit the strengths and suppress the weaknesses of the combined models. Examples of such combination techniques are bagging, boosting, and random forests; students get to know a selection of them.
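
A hedged illustration of these three techniques, assuming scikit-learn and a synthetic dataset; the dataset parameters and ensemble sizes are illustrative.

# Illustrative comparison of ensemble methods on a synthetic classification task.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

models = {
    "single tree": DecisionTreeClassifier(random_state=0),
    "bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=100, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "boosting (AdaBoost)": AdaBoostClassifier(n_estimators=100, random_state=0),
}

# Cross-validated accuracy typically improves when many trees are combined.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f} accuracy")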

Resampling
Here, students learn how to examine the influence of variables and how to estimate prediction errors. The most popular methods are permutation testing and bootstrapping.
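
A minimal sketch of both ideas, using only NumPy on two hypothetical samples; the sample sizes, effect size, and number of resamples are illustrative.

# Illustrative sketch: bootstrap confidence interval and a simple permutation test.
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=50)   # hypothetical sample A
b = rng.normal(0.3, 1.0, size=50)   # hypothetical sample B

# Bootstrapping: resample with replacement to estimate the uncertainty of the mean of A.
boot_means = np.array([rng.choice(a, size=a.size, replace=True).mean() for _ in range(10_000)])
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])

# Permutation test: shuffle group labels to see how often a difference in means
# at least as large as the observed one arises by chance.
observed = b.mean() - a.mean()
pooled = np.concatenate([a, b])
count = 0
for _ in range(10_000):
    perm = rng.permutation(pooled)
    count += (perm[a.size:].mean() - perm[:a.size].mean()) >= observed

p_value = count / 10_000
print(f"95% CI for mean(A): [{ci_low:.3f}, {ci_high:.3f}], permutation p-value: {p_value:.4f}")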

Attention mechanism
Students know the concept and workings of the attention mechanism and how it is implemented in the Transformer architecture (e.g. for language translation).
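
A hedged sketch of the core operation, scaled dot-product attention, in plain NumPy; the toy sequence length and vector dimensions are illustrative.

# Illustrative sketch: scaled dot-product attention, the building block of the Transformer.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K of shape (sequence_length, d_k); V of shape (sequence_length, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # pairwise similarity of queries and keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the keys
    return weights @ V                                 # weighted sum of the values

# Toy example: 4 tokens with 8-dimensional query/key/value vectors.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
output = scaled_dot_product_attention(Q, K, V)         # shape (4, 8)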

Evaluation Mark
Built on the following competences Deep Learning
Foundation in Machine Learning
Module type Portfolio Module