
Module description - Bayesian Data Analysis
(Bayes’sche Datenanalyse)

Number
bda
ECTS 2.0
Specification Bayesian Thinking and Modelling
Level Advanced
Content Thomas Bayes was a clergyman in 18th-century England who became famous for a mathematical formula that founded an entire branch of statistics: Bayes' theorem, the basis of Bayesian probability and Bayesian inference. The approach offers particular advantages when dealing with small data sets and heterogeneous sources of information, makes it possible to incorporate expert knowledge, and allows for a more intuitive approach to hypothesis testing. Students acquire the competence to model data using the Bayesian approach, to update their models with new data, and to derive decisions from the models.
Learning outcomes LE 1: Bayesian probability theory with Bayes' theorem
Students understand the Bayesian interpretation of probability and Bayes' theorem. They are familiar with the terms prior, likelihood, posterior, and evidence and can use them to calculate discrete probabilities explicitly. They understand the differences between Bayesian and frequentist statistics and are aware of the advantages and disadvantages of each.
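For illustration, a minimal sketch of such an explicit discrete calculation with Bayes' theorem; the diagnostic-test numbers (prevalence, sensitivity, false-positive rate) are hypothetical:

```python
# Minimal sketch of Bayes' theorem for a discrete case (hypothetical numbers):
# posterior = prior * likelihood / evidence

prior = 0.01              # P(disease): assumed prevalence
sensitivity = 0.95        # P(positive test | disease)
false_positive = 0.05     # P(positive test | no disease)

# Evidence: total probability of observing a positive test
evidence = sensitivity * prior + false_positive * (1 - prior)

# Posterior: probability of disease given a positive test
posterior = sensitivity * prior / evidence
print(f"P(disease | positive) = {posterior:.3f}")  # ~0.161
```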

LE 2: Probability models with conjugate prior
If the prior and the likelihood are conjugate, the posterior distribution can be calculated explicitly (in closed form). Students are able to evaluate probability models with a conjugate prior explicitly using available lookup tables and to apply these models to practical problems. In particular, they are very familiar with the beta-binomial, gamma-Poisson, and normal-normal models and understand why the beta and gamma distributions are often used as prior distributions for the binomial and Poisson likelihoods, respectively.
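As a minimal sketch of such a closed-form update, a beta-binomial example with assumed prior pseudo-counts and assumed data (using scipy):

```python
# Conjugate beta-binomial update (hypothetical data):
# with a Beta(a, b) prior and k successes in n Bernoulli trials,
# the posterior is Beta(a + k, b + n - k) in closed form.
from scipy.stats import beta

a, b = 2, 2        # prior pseudo-counts (assumed)
k, n = 7, 10       # observed successes / trials (assumed)

posterior = beta(a + k, b + n - k)
print("posterior mean:", posterior.mean())           # (a+k)/(a+b+n) = 9/14
print("95% credible interval:", posterior.interval(0.95))
```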

LE 3: Probability models with non-conjugate prior
If the prior and likelihood distributions are non-conjugate, there is no explicit closed-form solution; the resulting posterior distribution must be approximated using sampling-based approaches such as Markov chain Monte Carlo (MCMC), for example with the Metropolis-Hastings algorithm. Students are able to estimate posterior distributions with the appropriate tools in order to answer practical questions.
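For illustration, a minimal random-walk Metropolis-Hastings sketch; the target (a standard Cauchy prior combined with a single normal observation) and all tuning values are assumed for the example:

```python
# Minimal Metropolis-Hastings sketch for a non-conjugate posterior (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def log_unnormalised_posterior(theta):
    # Hypothetical example: standard Cauchy prior with one normal observation y = 2.3, sigma = 0.5
    log_prior = -np.log(1 + theta**2)
    log_lik = -0.5 * ((2.3 - theta) / 0.5) ** 2
    return log_prior + log_lik

samples, theta = [], 0.0
for _ in range(10_000):
    proposal = theta + rng.normal(scale=0.5)       # symmetric random-walk proposal
    log_accept = log_unnormalised_posterior(proposal) - log_unnormalised_posterior(theta)
    if np.log(rng.uniform()) < log_accept:         # accept with probability min(1, ratio)
        theta = proposal
    samples.append(theta)

print("posterior mean ≈", np.mean(samples[2000:]))  # discard burn-in
```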

LE 4: Bayesian inference and prediction
Students are familiar with how a computed posterior distribution can be used to estimate posterior credible intervals, how hypothesis tests can be performed with the concepts of Bayesian statistics (Bayes factor, prior and posterior odds), and how predictions can be made for new data (sampling and posterior variability).
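A minimal sketch of a credible interval and posterior predictive sampling, continuing the hypothetical Beta(9, 5) posterior from the beta-binomial example above:

```python
# Inference and prediction from a posterior (hypothetical Beta(9, 5) posterior).
import numpy as np
from scipy.stats import beta, binom

rng = np.random.default_rng(0)
posterior = beta(9, 5)

# Posterior credible interval for the success probability
print("95% credible interval:", posterior.interval(0.95))

# Posterior predictive for the number of successes in 10 new trials:
# sample theta from the posterior, then data given theta
theta_draws = posterior.rvs(size=5000, random_state=rng)
y_new = binom.rvs(n=10, p=theta_draws, random_state=rng)
print("predictive mean:", y_new.mean(), "predictive sd:", y_new.std())
```
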
Evaluation Mark
Built on the following competences Probability Modelling (WER), Exploratory Data Analysis (EDA), Linear and Logistic Regression (LLR)
Module type Portfolio Module