Coding and information theory

Model of communication systems. The concept of information. Definition of the amount of information. Discrete memoryless sources. Properties of entropy. Discrete sources with memory. Entropy of Markov sources. Continuous information sources. Statistical coding. Kraft's inequality. Instantaneous codes. Types of prefix codes. Compact codes. Shannon's first theorem.
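Two of the topics above lend themselves to a short numerical sketch: the entropy of a discrete memoryless source and Kraft's inequality for prefix-code word lengths. The probability vectors and codeword lengths below are invented for illustration only.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete memoryless source."""
    return -sum(p * log2(p) for p in probs if p > 0)

def kraft_sum(lengths, r=2):
    """Kraft sum over codeword lengths; an r-ary prefix code with these
    lengths exists if and only if the sum is <= 1."""
    return sum(r ** -l for l in lengths)

# A uniform source over 4 symbols has entropy exactly 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Binary codeword lengths {1, 2, 3, 3} meet Kraft's inequality with equality,
# so a prefix code such as {0, 10, 110, 111} exists.
print(kraft_sum([1, 2, 3, 3]))  # 1.0

# Lengths {1, 1, 2} violate the inequality: no binary prefix code fits them.
print(kraft_sum([1, 1, 2]) <= 1)  # False
```

Equality in the Kraft sum corresponds to a complete code: every leaf of the binary code tree is used.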

Methods of constructing compact codes. The Shannon–Fano procedure. Huffman's procedure. Statistical channel models. Discrete memoryless channels. Capacity of discrete channels. Discrete channels with memory. Continuous channels. Capacity of continuous channels. Error-protection coding. Shannon's second theorem. The probability of error. Hamming distance. Block codes. Convolutional codes. The Viterbi algorithm. Basics of trellis-coded modulation.
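As a sketch of two items from this list, the following shows Huffman's procedure for building a compact prefix code and the Hamming distance used in error-protection coding. The symbol frequencies and bit strings are made up for the example.

```python
import heapq

def huffman_code(freqs):
    """Build a binary Huffman code (a compact prefix code) from a
    dict mapping symbols to frequencies, by repeatedly merging the
    two least-frequent subtrees."""
    heap = [[w, [sym, ""]] for sym, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)   # lightest subtree gets prefix bit 0
        hi = heapq.heappop(heap)   # second lightest gets prefix bit 1
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

def hamming_distance(a, b):
    """Number of positions in which two equal-length words differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

# Frequencies 5:2:1:1 yield codeword lengths 1, 2, 3, 3.
code = huffman_code({"a": 5, "b": 2, "c": 1, "d": 1})
print(sorted(len(c) for c in code.values()))  # [1, 2, 3, 3]

# Two 5-bit words differing in two positions.
print(hamming_distance("10110", "10011"))  # 2
```

The resulting codeword lengths always satisfy Kraft's inequality with equality, which is one way to check the construction.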

Рачунарски факултет 011-33-48-079