Fachhochschule Nordwestschweiz FHNW


      Deep Learning

      Number: del
      ECTS: 3.0
      Level: Advanced
      Content

      Deep learning has gained enormous attention in the last decade. Significant improvements have been achieved in application areas that were traditionally considered easy for humans but difficult for machines, especially in computer vision (e.g. object recognition) and natural language processing (e.g. language translation and understanding, chatbots).


      The main goal of this module is to learn the basics of deep learning and a few typical applications. This includes techniques for training deep architectures as well as several well-established deep learning architectures, mostly from the computer vision domain.

      Learning outcomes

      Training neural networks with Stochastic Gradient Descent

      Students understand how Stochastic Gradient Descent (SGD) can be used to train neural networks and how the backpropagation algorithm, which computes the gradients in neural networks, works. This understanding is developed using a network of the Multi-Layer Perceptron (MLP) type. Students also understand how networks with only one hidden layer can approximate arbitrary functions, and have an intuition for how networks with many layers help to extract complex patterns from the data and counter the "Curse of Dimensionality" by representing hierarchical concepts efficiently.
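As an illustrative sketch of this outcome, the following trains a one-hidden-layer MLP with hand-written backpropagation and (full-batch) gradient descent. It assumes NumPy; the XOR data set, layer width, and learning rate are arbitrary choices for illustration, not part of the module description.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data set: XOR, a function no single linear layer can represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# MLP with one hidden layer: 2 -> 8 -> 1
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses, lr = [], 1.0
for step in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)            # hidden activations
    p = sigmoid(h @ W2 + b2)            # predicted probabilities
    losses.append(-np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9)))
    # Backward pass: backpropagation via the chain rule
    dlogits = (p - y) / len(X)          # gradient of the cross-entropy w.r.t. the logits
    dW2 = h.T @ dlogits; db2 = dlogits.sum(0)
    dh = (dlogits @ W2.T) * (1.0 - h ** 2)   # chain rule through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    # Gradient descent update (full-batch here; SGD would use mini-batches)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The loss typically drops close to zero, i.e. the hidden layer learns a representation in which XOR becomes linearly separable.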


      Training Deep Neural Networks

      Students know the difficulties that can arise specifically when training deep neural networks (e.g. vanishing or exploding gradients) and how these can be mitigated (e.g. suitable activation functions, batch normalization, parameter initialization). They know how various regularization methods (e.g. dropout, data augmentation) work, what their benefits are, and how they can be applied. They know the most important techniques for improving the convergence behavior of plain gradient descent (e.g. Adam). Finally, students can use learning curves and performance metrics to evaluate the success of training.
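The vanishing-gradients problem can be made tangible with a small experiment: push an input through a deep stack of random layers, propagate a gradient back, and compare saturating sigmoid activations (Xavier-style init) with ReLU (He init). A minimal sketch assuming NumPy; depth, width, and seed are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 30, 256
x = rng.normal(0.0, 1.0, width)   # arbitrary input vector

def grad_norm_through_stack(activation, deriv, init_std):
    """Push `x` through `depth` random layers, then propagate a gradient of
    ones backwards and return the gradient norm that reaches the input."""
    h, caches = x, []
    for _ in range(depth):
        W = rng.normal(0.0, init_std, (width, width))
        z = W @ h
        caches.append((W, z))
        h = activation(z)
    g = np.ones(width)                 # upstream gradient at the output
    for W, z in reversed(caches):
        g = W.T @ (g * deriv(z))       # backprop through activation, then W
    return np.linalg.norm(g)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
sigmoid_d = lambda z: sigmoid(z) * (1.0 - sigmoid(z))
relu = lambda z: np.maximum(z, 0.0)
relu_d = lambda z: (z > 0.0).astype(float)

n_sig = grad_norm_through_stack(sigmoid, sigmoid_d, 1.0 / np.sqrt(width))  # Xavier-style
n_relu = grad_norm_through_stack(relu, relu_d, np.sqrt(2.0 / width))       # He init
print(f"gradient norm at the input: sigmoid {n_sig:.2e}, ReLU {n_relu:.2e}")
```

Because the sigmoid derivative is at most 0.25, the sigmoid stack shrinks the gradient by roughly that factor per layer, while ReLU with He initialization roughly preserves the gradient's scale.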


      CNN

      Students know how the elementary building blocks of Convolutional Neural Networks (CNNs) work and what their benefits are, and understand how they can be configured and combined in a meaningful way. They know the most important deep computer vision architectures and the innovations each of them introduced. They know how the information encoded in trained networks can be made visible, e.g. with the help of Activation Maximization.
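The elementary building block can be sketched in a few lines, assuming NumPy: a hand-written "valid" 2-D convolution (technically cross-correlation, as in the common frameworks), applied with a Sobel-style kernel that responds to a vertical edge. The image and kernel are illustrative choices.

```python
import numpy as np

def conv2d(image, kernel, stride=1):
    """'Valid' 2-D cross-correlation over a single-channel image."""
    H, W = image.shape
    kH, kW = kernel.shape
    oH = (H - kH) // stride + 1
    oW = (W - kW) // stride + 1
    out = np.zeros((oH, oW))
    for i in range(oH):
        for j in range(oW):
            patch = image[i * stride:i * stride + kH, j * stride:j * stride + kW]
            out[i, j] = np.sum(patch * kernel)  # weight sharing: the same kernel everywhere
    return out

# A step edge in the middle of a 6x6 image ...
img = np.zeros((6, 6))
img[:, 3:] = 1.0
# ... and a Sobel-style kernel that detects vertical edges.
sobel_x = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
response = conv2d(img, sobel_x)
print(response)
```

The 4x4 output is nonzero exactly where the kernel window straddles the edge, illustrating locality and weight sharing, the two properties that make CNNs parameter-efficient for images.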


      Transfer Learning

      Students know how pre-trained networks can be reused with transfer learning and what to pay attention to in a specific problem setting. They know a few frequently used pre-trained networks from the field of computer vision. They also know what foundation models are, and how and for what purposes they are trained.
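The core idea of freezing pre-trained weights and training only a new head can be sketched as follows, assuming NumPy. The "backbone" here is just a frozen random projection standing in for real pre-trained features, and the two-blob data set is hypothetical; in practice one would load a pre-trained computer vision model and disable gradient updates for its parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained backbone: a FROZEN random projection + ReLU.
# (In practice: a pre-trained network whose weights are not updated.)
W_frozen = rng.normal(0.0, 1.0, (2, 64)) / np.sqrt(2.0)

def backbone(X):
    return np.maximum(X @ W_frozen, 0.0)   # frozen: never updated below

# Hypothetical downstream task: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)), rng.normal(1.0, 0.5, (50, 2))])
y = np.array([0.0] * 50 + [1.0] * 50)

# Transfer learning: train ONLY the new classification head (logistic regression).
feats = backbone(X)
w, b = np.zeros(64), 0.0
for _ in range(1000):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
    g = (p - y) / len(X)       # gradient of the binary cross-entropy w.r.t. the logits
    w -= 0.1 * feats.T @ g     # only the head parameters move;
    b -= 0.1 * g.sum()         # W_frozen stays fixed throughout

p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))
acc = np.mean((p > 0.5) == (y == 1.0))
print(f"accuracy with a frozen backbone: {acc:.2f}")
```

Training only the small head is cheap and needs little data, which is exactly why transfer learning is attractive when the downstream data set is small.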


      Tools

      Students are familiar with the most important deep learning frameworks (currently TensorFlow and PyTorch). They know how gradients can be calculated automatically in these frameworks on the basis of computational graphs (autograd). They can independently implement typical application examples for a selected framework and know how models can be executed efficiently (in particular also on the GPU). Furthermore, students know the common options (including tools) for validating the implementation of models, monitoring their training or analyzing their functionality.
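How autograd operates on a computational graph can be illustrated with a tiny scalar reverse-mode implementation in pure Python. This is a deliberately minimal sketch in the spirit of, but far simpler than, the frameworks' autograd engines; the class and method names are made up for illustration.

```python
import math

class Value:
    """A scalar node in a computational graph; backward() runs
    reverse-mode automatic differentiation over the recorded graph."""
    def __init__(self, data, parents=(), local_grads=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._local_grads = local_grads   # d(self)/d(parent) for each parent

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data + other.data, (self, other), (1.0, 1.0))

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        return Value(self.data * other.data, (self, other), (other.data, self.data))

    def tanh(self):
        t = math.tanh(self.data)
        return Value(t, (self,), (1.0 - t * t,))

    def backward(self):
        # Topologically order the graph, then apply the chain rule
        # from the output back to the inputs, accumulating gradients.
        order, seen = [], set()
        def visit(v):
            if id(v) not in seen:
                seen.add(id(v))
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            for p, g in zip(v._parents, v._local_grads):
                p.grad += v.grad * g

# d/dx of tanh(x*y + x) at x = 0.5, y = 2.0
x, y = Value(0.5), Value(2.0)
out = (x * y + x).tanh()
out.backward()
print(x.grad, y.grad)
```

Note that `x` appears twice in the expression; reverse-mode differentiation handles this by summing the contributions along both paths, which is exactly what the `+=` in `backward()` does.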

      Evaluation
      Mark
      Built on the following competences
      • Foundation in Machine Learning
      • Foundation in Linear Algebra
      • Foundation in Calculus
      • Advanced Calculus
      Module type
      Portfolio Module
