
  • Lecture (042529): Monday, 16.15 - 18.00, Otto-Hahn-Straße 12, room 2.063, North Campus
  • Exercise (042530): Tuesday, 12.15 - 14.00, Otto-Hahn-Straße 14, room E02, North Campus

Content:

Graphical models are part of modern computer science's effort to enable reasoning under uncertainty. Prominent application areas are robotics, bioinformatics, artificial intelligence and machine learning; for example, graphical models are used to evaluate medical data, to analyze gene expression data and to track movements. The lecture covers fundamental questions and techniques of graphical models, such as: representation of probability distributions by graphical models; the difference between directed and undirected graphical models; representation theorems; naive Bayes; logistic regression; explaining away; the local Markov assumption; factorizations; independence of random variables; d-separation; P-maps and I-maps. Algorithms for inference under uncertainty: enumeration, variable elimination, perfect elimination orders, moralization, triangulation, junction trees, (loopy) belief propagation, linear programming, sampling, variational inference. Complexity analysis of inference and its characterization via treewidth. Methods for learning the parameters of a graphical model from data: counting, gradient descent, expectation maximization, Bayesian learning. Structure learning via structural EM and hill climbing. Models for distributions that change over time (hidden Markov models, dynamic Bayesian networks) and for relational domains.
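To make the inference topics above concrete, here is a minimal sketch of variable elimination on a toy chain A -> B -> C: instead of enumerating all joint assignments, we sum out A first and then B. All probability tables are invented for illustration; they are not from the lecture.

```python
# Toy chain A -> B -> C with made-up binary probability tables.
p_a = {0: 0.6, 1: 0.4}                               # P(A)
p_b_a = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # P(B|A), indexed [a][b]
p_c_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}   # P(C|B), indexed [b][c]

# Eliminate A: tau1(b) = sum_a P(a) * P(b|a)
tau1 = {b: sum(p_a[a] * p_b_a[a][b] for a in (0, 1)) for b in (0, 1)}

# Eliminate B: P(c) = sum_b tau1(b) * P(c|b)
p_c = {c: sum(tau1[b] * p_c_b[b][c] for b in (0, 1)) for c in (0, 1)}
```

On a chain of n binary variables this elimination order touches only pairwise tables, so the cost grows linearly in n, whereas naive enumeration of the joint distribution grows exponentially; this is the kind of gap that the treewidth analysis in the lecture formalizes.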

Competencies:

The aim of the module is to equip students with the competence to actively design solutions, using graphical models, to the probabilistic modeling problems that arise in practice. In detail: an understanding of what graphical models are; knowledge of basic and advanced methods for reasoning under uncertainty; knowledge of basic and advanced machine learning methods for learning both the parameters and the structure of graphical models from data; and an understanding of how graphical models, knowledge representation and knowledge discovery interlink. In particular, a fundamental understanding of the principles of graphical models should enable students to assess their possibilities and limits in specific fields of application.

Literature:

Daphne Koller and Nir Friedman: Probabilistic Graphical Models: Principles and Techniques. Adaptive Computation and Machine Learning series, 2009.

Slides & exercises:

Slides of the lecture:
Exercises for the lecture:
  • Bayes' Rule warmup
  • Conditional Probability, Conditional Independence, Bayesian Networks
  • Conditional Probability, Conditional Independence, Bayesian Networks 2
  • Conditional Independence, Bayesian Networks, D-Separation
  • Factors, MPE, Variable Elimination
  • Marginalization, Independence, Clique Tree
  • Variable elimination in Clique Trees, MLE, EM
  • Structure learning on Bayesian Networks
  • Applications of Bayesian Networks
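In the spirit of the Bayes' rule warm-up sheet above, a two-variable example can be computed directly from the definition. The network Rain -> WetGrass and all numbers below are invented for illustration, not taken from the exercises.

```python
# Made-up tables for the toy network Rain -> WetGrass.
p_rain = {True: 0.2, False: 0.8}                      # P(Rain)
p_wet_given_rain = {True: {True: 0.9, False: 0.1},    # P(Wet | Rain)
                    False: {True: 0.2, False: 0.8}}

# Marginalize out Rain: P(Wet=w) = sum_r P(Rain=r) * P(Wet=w | Rain=r)
def p_wet(w):
    return sum(p_rain[r] * p_wet_given_rain[r][w] for r in (True, False))

# Bayes' rule: P(Rain=r | Wet=True) = P(Rain=r) * P(Wet=True | r) / P(Wet=True)
def p_rain_given_wet(r):
    return p_rain[r] * p_wet_given_rain[r][True] / p_wet(True)
```

With these numbers, observing wet grass raises the probability of rain from the prior 0.2 to 0.18/0.34 ≈ 0.53, a small instance of the evidence-updating pattern the exercises practice.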
Resources for the project
There is no meeting on 01/20/2015, come to OH14-336 or send an email if you have any questions regarding the project.