361-2-5981 Random Codes in Communication
Spring Semester 2011
- Solutions to all HW are posted. Some additional questions on differential entropy are posted (do not turn in).
- The MIMO material covered in class can be found at http://arxiv.org/abs/1001.3404, page 240, lecture 10.
- A cheat sheet (summary of equations) is posted in the lecture notes. Each student may bring one sheet (2 pages) of equations, written in LaTeX, to the final. If you do not use the standard one, please email it to me the day before the final.
- Two additional lectures have been added: in Lecture 3 we added a general proof of the Kraft inequality, and in Lecture 6 we added the BEC.
- final practice part A: 2005 final, 2005 final solutions. No need to solve questions 1 and 5, which are on gambling and the entropy of a Markov process.
- final practice part B: final, final solutions
- finals from 2009 (version a, partial solutions for a; version b, partial solutions for b). No need to solve the rate-distortion problem.
- final 2010, partial solution for final 2010
- Here is a movie on Shannon and his original paper from 1948, "A Mathematical Theory of Communication."
- The course will be similar to Mathematical Communication, adjusted to complement the Information Theory course taught by Prof. Lyandress.
- The course is open to undergraduate and graduate students.
- Non-EE students, such as Math and CS students, are also encouraged to take the course.
- Lectures: Tuesday, 17:00-20:00, 3 units,
Building -, room - .
- Instructor: Haim Permuter
Building 33 office 312
- Tel: 6461558
- Email: haimp (at) bgu (dot) ac (dot) il
- Office hours: After the lectures.
Course Outline
In this class we will study fundamental mathematical methods that are widely used in designing modern communication systems (WiMAX, WiFi, ADSL, OFDM), modern codes (LDPC and turbo codes), and compression schemes (ZIP). We will use optimization, probability, and information-theoretic tools to solve engineering problems in communication. These mathematical/information tools are also widely used in signal processing (estimation), statistics (large deviations), computer science (complexity), physics (the second law of thermodynamics and statistical physics), and economics (investment theory).
Here is a movie on Shannon, who invented the mathematical tools taught in this class.
- Information measures: entropy, mutual information, Kullback-Leibler divergence, data processing inequality.
- Concepts in convex optimization: convex sets, convex functions, Jensen's inequality.
- Typical sequences: weak and strong typicality, joint typicality.
- Method of types: Sanov's theorem, a combinatorial approach for bounding error probabilities.
- Source coding (data compression): block coding, data compression using typical sets.
- Random codes for transmission via a noisy medium: channel capacity, capacity computation, achieving capacity through random coding, converse to the channel coding theorem, channels with feedback.
- Coordination: the rate-coordination function and its properties, quantization, coding theorem, converse to the coding theorem.
- Source-channel coding: data processing, separation theorem.
- Introduction to random coding: LDPC and turbo codes.
- Introduction to multi-user communication: networks, broadcast channel, multiple access channel (MAC), channels with states, relay channel.
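As a small taste of the first and sixth topics above, here is a minimal Python sketch (not course material; the function names are my own) computing entropy, Kullback-Leibler divergence, and the capacity of a binary symmetric channel, C = 1 - H(eps):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) in bits.

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity).
    """
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bsc_capacity(eps):
    """Capacity of a binary symmetric channel with crossover probability eps."""
    return 1.0 - entropy([eps, 1.0 - eps])

print(entropy([0.5, 0.5]))                        # fair coin: 1 bit
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))      # positive: the distributions differ
print(bsc_capacity(0.11))                         # roughly 0.5 bit per channel use
```

Note that D(p||q) = 0 exactly when p = q, and the BSC capacity drops to 0 at eps = 0.5, where the output is independent of the input.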
The course will follow only one textbook, which is fun and easy to read.
- Textbook: "Elements of Information Theory," by Cover and Thomas, 1st or 2nd Edition, New York: Wiley.
You should have seen some probability at the level of an introduction to stochastic processes or equivalent. For instance, you should be familiar with terms such as i.i.d. random variables, expectation, and Gaussian random variables.