Module description - Natural Language Processing

Number npr
ECTS 4.0
Level Advanced
Content Natural Language Processing (NLP) deals with the interaction between computers and human language and has become an attractive and relevant branch of artificial intelligence in recent years. NLP can efficiently solve routine tasks such as classifying documents, detecting sentiment in texts or responding to customer inquiries.
Many NLP applications are based on machine learning models that are trained to detect patterns in large text datasets.
Learning outcomes
Processing and representation of text data
The aim of this learning outcome is to familiarise you with specific concepts of data processing (e.g. text normalization and tokenization), especially data cleaning and data augmentation for NLP applications. You will know the options for representing words and sentences (e.g. TF-IDF, Word2Vec, GloVe, BERT, Sentence Transformers) as sparse, dense, and contextual embeddings, understand their differences, and be able to use them effectively.
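
As an illustration of a sparse representation, the following sketch builds TF-IDF vectors for a toy corpus with scikit-learn; the corpus and the vectorizer settings are illustrative assumptions, not part of the module material.

```python
# Minimal sketch: sparse (TF-IDF) representation of a toy corpus with scikit-learn.
# The corpus and vectorizer settings are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = [
    "The product arrived late and damaged.",
    "Great product, fast delivery!",
    "Delivery was fast but the packaging was damaged.",
]

# Lowercasing and simple word tokenization are handled by the vectorizer.
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english")
X = vectorizer.fit_transform(corpus)       # sparse matrix: documents x vocabulary

print(vectorizer.get_feature_names_out())  # learned vocabulary
print(X.toarray().round(2))                # TF-IDF weights per document
```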

Statistical and neural language models
In this learning outcome you will work towards understanding the principles of different types of language models (e.g. word n-gram models, RNNs, transformer-decoder LLMs). You will learn how to train and evaluate language models (e.g. BLEU, WER, perplexity) and how to generate text using language models and decoding techniques (e.g. greedy decoding, beam search).
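
The sketch below illustrates the idea of a word bigram model combined with greedy decoding on a toy corpus; the corpus, the counting scheme, and the generation length are illustrative assumptions.

```python
# Minimal sketch: a word bigram language model with greedy decoding.
# The toy corpus and generation length are illustrative assumptions.
from collections import defaultdict, Counter

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count bigram transitions: how often word w2 follows word w1.
bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def greedy_generate(start, max_len=8):
    """Always pick the most frequent next word (greedy decoding)."""
    words = [start]
    for _ in range(max_len):
        nxt = bigrams[words[-1]]
        if not nxt:
            break
        words.append(nxt.most_common(1)[0][0])
    return " ".join(words)

print(greedy_generate("the"))  # e.g. "the cat sat on the cat sat on the"
```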

Transformer-based model algorithms
In this learning outcome you will gain an understanding of how transformer-based models work, how they can be configured for specific NLP applications such as text classification, translation, or generation using encoders and decoders, and how they can be evaluated with suitable metrics. You will also understand the fundamental improvement that transformer-based models achieve over classic machine learning and earlier deep learning approaches such as Seq2Seq models (e.g. LSTM, GRU).
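
As a hint of how such models are applied, the following sketch runs transformer-based text classification through the Hugging Face pipeline API; the chosen checkpoint is a public example model, not one prescribed by the module.

```python
# Minimal sketch: transformer-based text classification with the Hugging Face pipeline API.
# The checkpoint is an illustrative public model, not prescribed by the module.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("The lecture on encoders and decoders was surprisingly clear."))
# e.g. [{'label': 'POSITIVE', 'score': ...}]
```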

Learning Methods for NLP
NLP has made remarkable progress in understanding and generating natural language, using different learning methods such as pre-training, fine-tuning or in-context learning to tackle tasks such as text classification, generation or question answering.
This learning outcome aims to enable you to describe, differentiate, use and combine NLP learning methods.
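
To make the contrast concrete, the sketch below shows in-context (few-shot) learning, where task examples are placed in the prompt instead of updating model weights through fine-tuning; the model name, prompt, and decoding settings are illustrative assumptions.

```python
# Minimal sketch: in-context (few-shot) learning with a generative model,
# in contrast to fine-tuning the weights. Model, prompt, and decoding settings
# are illustrative assumptions.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Classify the sentiment of each review as positive or negative.\n"
    "Review: The delivery was quick and the packaging perfect. Sentiment: positive\n"
    "Review: The product broke after one day. Sentiment: negative\n"
    "Review: I would order again anytime. Sentiment:"
)

out = generator(prompt, max_new_tokens=3, do_sample=False)  # greedy decoding
print(out[0]["generated_text"])
```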

NLP Tools & Frameworks
When working on the mini-challenges, you will use existing NLP and deep learning frameworks, which greatly simplify the implementation of the models. In this learning outcome, you will therefore get to know and use common Python frameworks (e.g. spaCy, HuggingFace, PyTorch).
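
As a small taste of one of these frameworks, the sketch below tokenizes and tags a sentence with spaCy; it assumes the small English pipeline en_core_web_sm has been installed.

```python
# Minimal sketch: tokenization and part-of-speech tagging with spaCy.
# Assumes the small English pipeline has been installed via:
#   python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Natural Language Processing deals with the interaction between computers and human language.")

for token in doc:
    print(token.text, token.pos_, token.lemma_)
```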
Evaluation Mark
Built on the following competences

  • Foundations of Linear Algebra (gla)
  • Foundations of Calculus (gan)
  • Basic Programming Competence (gpr)
  • Basic Machine Learning Competence (gml)
  • Deep Learning (del)

Module type Portfolio Module