TSKS12 Modern Channel Coding, Inference and Learning
The focus of this course is on modern coding techniques used in current communication standards. Iterative decoding procedures based on Bayesian inference are presented, and basic machine learning concepts are described briefly.
Course topics
- Information transmission, probability, entropy
- Mutual information
- The noisy channel coding theorem
- Computing channel capacity: The Gaussian channel (see the sketch after this list)
- Practical channel coding
- Repeat-Accumulate codes
- Digital fountain codes
- “Turbo” codes and LDPC codes
- Clustering
- Exact marginalization
  - In trellises: the Viterbi and BCJR algorithms
  - In graphs: the sum-product algorithm
- Monte Carlo methods
- Basics of supervised learning
- Capacity of a neuron
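
As a small taste of the channel-capacity topic above, here is a minimal Python sketch (illustrative only, not part of the official course material) that evaluates two classical formulas treated in the course: the capacity C = 1 − H(p) of a binary symmetric channel with crossover probability p, and the capacity C = ½·log₂(1 + P/N) of a real-valued Gaussian channel at signal-to-noise ratio P/N. The function names are our own choices.

```python
import math

def binary_entropy(p: float) -> float:
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel, in bits per channel use."""
    return 1.0 - binary_entropy(p)

def gaussian_capacity(snr: float) -> float:
    """Capacity of a real-valued Gaussian (AWGN) channel, C = 1/2 * log2(1 + P/N)."""
    return 0.5 * math.log2(1.0 + snr)

print(bsc_capacity(0.1))        # ~0.531 bits per channel use
print(gaussian_capacity(10.0))  # ~1.730 bits per channel use
```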
Instructors
- Course director and lecturer: Danyo Danev
- Teaching assistant: Unnikrishnan Kunnath Ganesan
Course material
- The course is based on the book
  - David J.C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press 2003.
- Complementary literature
  - Shu Lin and Daniel J. Costello, Jr., Error Control Coding, Pearson / Prentice Hall 2004.
  - Stephen Wicker, Error Control Systems for Digital Communication and Storage, Prentice Hall 1995.
  - Peter Sweeney, Error Control Coding: From Theory to Practice, Wiley 2002.
  - Salvatore Gravano, Introduction to Error Control Codes, Oxford 2001.
  - David Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press 2012.
- Supplementary material: Lecture notes, answers to the tutorial problems, and material for the labs.
Prerequisites
- Probability theory and general mathematical maturity.
- Programming skills.
Information for enrolled students
For detailed lecture, tutorial, and lab plans, see the course room.