- Entropy and mutual information (definitions recalled after this list)
- Data compression: codes, Kraft’s inequality, Shannon’s source coding theorem, Shannon code, Fano code, Huffman code, arithmetic code, dictionary codes, run-length encoding, lossy encoding (a Huffman coding sketch follows the list)
- Communication source and channel: memoryless source, discrete channel, channel capacity, Shannon’s channel coding theorem (the capacity definition is recalled after the list)
- Noisy-channel coding: Hamming distance, Hamming condition, linear block codes, Hamming codes, cyclic codes, CRC, other codes (a Hamming(7,4) sketch follows the list)
- Signals: sampling and quantisation, time-domain and frequency-domain description, Fourier series, Fourier transform, power spectrum, sampling theorem, reconstruction, frequency aliasing (the sampling theorem is restated after the list)
- Distribution of hours per semester
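
For orientation on the entropy and mutual information topic, the standard definitions for discrete random variables are (the notation is one common convention, not necessarily the one used in the course materials):

```latex
\begin{align*}
  H(X)   &= -\sum_{x} p(x)\,\log_2 p(x) \\
  H(X,Y) &= -\sum_{x,y} p(x,y)\,\log_2 p(x,y) \\
  I(X;Y) &= \sum_{x,y} p(x,y)\,\log_2\frac{p(x,y)}{p(x)\,p(y)}
          \;=\; H(X) + H(Y) - H(X,Y)
\end{align*}
```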
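For the data compression topic: Kraft’s inequality says a binary prefix code with codeword lengths l_1, …, l_n exists exactly when the sum of 2^(-l_i) is at most 1, and Huffman’s algorithm builds an optimal prefix code by repeatedly merging the two least probable subtrees. A minimal sketch, assuming plain Python with symbol frequencies taken from a sample string (the function name and the heap-of-partial-code-tables formulation are illustrative choices, not the course’s reference implementation):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a binary Huffman code for the symbols of `text`.

    Returns a dict mapping each symbol to its bit string.
    """
    freq = Counter(text)
    # Each heap entry: (total weight, tie-breaker, {symbol: code-so-far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate single-symbol alphabet
        return {s: "0" for s in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)     # two least frequent subtrees
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_code("abracadabra")
print(codes)                                     # e.g. {'a': '0', 'r': '111', ...}
print("".join(codes[s] for s in "abracadabra"))  # 23 bits for this example
```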
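For the channel topic, the capacity of a discrete memoryless channel is the maximum mutual information over input distributions; for the binary symmetric channel with crossover probability p it has a closed form (standard results, stated here only as a reminder):

```latex
C = \max_{p(x)} I(X;Y),
\qquad
C_{\mathrm{BSC}} = 1 - H_b(p),
\quad
H_b(p) = -p\log_2 p - (1-p)\log_2(1-p)
```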
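For the noisy-channel coding topic, the Hamming(7,4) code maps 4 data bits to a 7-bit codeword and corrects any single bit error by syndrome decoding. A small sketch under one common systematic generator/parity-check layout (this particular bit ordering is an assumption; textbooks differ):

```python
import numpy as np

# Systematic Hamming(7,4): codeword = [d1 d2 d3 d4 p1 p2 p3].
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[0, 1, 1, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [1, 1, 0, 1, 0, 0, 1]])

def encode(data4):
    """Encode 4 data bits into a 7-bit codeword."""
    return (np.array(data4) @ G) % 2

def correct(received7):
    """Correct at most one flipped bit using the syndrome H @ r."""
    r = np.array(received7)
    syndrome = (H @ r) % 2
    if syndrome.any():
        # A nonzero syndrome equals the column of H at the error position.
        for i in range(7):
            if np.array_equal(H[:, i], syndrome):
                r[i] ^= 1
                break
    return r

cw = encode([1, 0, 1, 1])
cw[2] ^= 1                       # flip one bit to simulate channel noise
print(correct(cw))               # recovers the original codeword [1 0 1 1 0 1 0]
```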
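For the signals topic, the sampling theorem states that a signal band-limited to B hertz is determined by samples taken at a rate f_s > 2B and is reconstructed by sinc interpolation (normalised sinc, sinc(u) = sin(πu)/(πu)); sampling below 2B folds frequencies above f_s/2 back into the baseband (aliasing):

```latex
x(t) \;=\; \sum_{n=-\infty}^{\infty} x(nT)\,
           \operatorname{sinc}\!\left(\frac{t - nT}{T}\right),
\qquad T = \frac{1}{f_s},\quad f_s > 2B
```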