Information measures: entropy, mutual information, Kullback-Leibler divergence, data processing inequality.
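These measures are easy to experiment with numerically. A minimal Python sketch (illustrative only, not part of the course materials) computes entropy, KL divergence, and mutual information directly from their definitions:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits; 0*log(0) is taken as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / q[nz]))

def mutual_information(pxy):
    """I(X;Y) = D(p(x,y) || p(x)p(y)) for a joint pmf matrix pxy."""
    pxy = np.asarray(pxy, float)
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    return kl_divergence(pxy.ravel(), (px * py).ravel())

# A fair coin has one bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```

Note that mutual information is itself a KL divergence (between the joint and the product of the marginals), which the sketch exploits.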
Typical sequences: weak and strong typicality, joint typicality.
Method of types: Sanov's theorem, combinatorial approach to bounding error probabilities.
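The flavor of these bounds can be seen in a small numerical sketch (illustrative, not course material): for 100 fair coin tosses, compare the exact probability of seeing at least 70 heads with a Sanov-style bound, which multiplies the number of binary types (at most n+1) by the largest single-type probability 2^(-n D(p̂ ‖ 1/2)):

```python
from math import comb, log2

def kl(p, q):
    """Binary KL divergence D((p, 1-p) || (q, 1-q)) in bits."""
    return sum(a * log2(a / b) for a, b in ((p, q), (1 - p, 1 - q)) if a > 0)

n, thresh = 100, 0.7
# Exact probability that a fair coin shows at least 70 heads in 100 tosses.
exact = sum(comb(n, k) for k in range(round(n * thresh), n + 1)) / 2**n
# Sanov-style upper bound: at most (n+1) binary types, each with probability
# at most 2^{-n D(p_hat || 1/2)}, minimized at the boundary p_hat = 0.7.
bound = (n + 1) * 2 ** (-n * kl(thresh, 0.5))
print(f"exact tail = {exact:.2e}, Sanov bound = {bound:.2e}")
```

The bound is loose by the polynomial factor (n+1) but captures the correct exponential decay rate, which is the point of the method of types.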
Lossless source coding (data compression): block coding, data compression using typical sets.
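The idea behind compression via typical sets can be checked numerically. The sketch below (illustrative, not course material) counts, for a Bernoulli(0.2) source of blocklength 100, the weakly typical sequences, i.e. those whose per-symbol log-probability is within ε of the entropy H. The typical set is exponentially smaller than the 2^100 possible sequences yet carries most of the probability:

```python
from math import comb, log2

p, n, eps = 0.2, 100, 0.1
H = -p * log2(p) - (1 - p) * log2(1 - p)   # entropy, ~0.722 bits/symbol

# A length-n binary sequence with k ones is weakly typical when its
# per-symbol log-probability is within eps of H.
typical = 0
prob_mass = 0.0
for k in range(n + 1):
    per_symbol = -(k * log2(p) + (n - k) * log2(1 - p)) / n
    if abs(per_symbol - H) <= eps:
        typical += comb(n, k)
        prob_mass += comb(n, k) * p**k * (1 - p) ** (n - k)

print(f"typical set size ~= 2^{log2(typical):.1f} (out of 2^{n} sequences)")
print(f"probability mass ~= {prob_mass:.3f}")
```

Indexing only the typical sequences therefore needs roughly nH bits rather than n, which is the essence of the source coding theorem.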
Random codes for transmission over a noisy medium: channel capacity, capacity computation, achieving capacity through random coding, converse to the channel coding theorem, channels with feedback.
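Capacity computation is typically done with the Blahut-Arimoto algorithm. A compact sketch (illustrative, assuming a channel matrix with strictly positive entries to sidestep zero-probability handling) applied to the binary symmetric channel, whose capacity 1 − H(0.1) ≈ 0.531 bits is known in closed form:

```python
import numpy as np

def blahut_arimoto(W, iters=200):
    """Capacity (in bits) of a discrete memoryless channel.
    W[x, y] = p(y | x); all entries assumed positive for simplicity."""
    W = np.asarray(W, float)
    r = np.full(W.shape[0], 1.0 / W.shape[0])  # input pmf, start uniform
    for _ in range(iters):
        py = r @ W                             # output pmf p(y)
        # Multiplicative update: r(x) <- r(x) * exp(D(W(.|x) || p(y))),
        # then renormalize.
        r = r * np.exp(np.sum(W * np.log(W / py), axis=1))
        r /= r.sum()
    py = r @ W
    return float(np.sum(r[:, None] * W * np.log2(W / py)))

# Binary symmetric channel with crossover 0.1: C = 1 - H(0.1) ~= 0.531 bits.
print(blahut_arimoto([[0.9, 0.1], [0.1, 0.9]]))
```

For symmetric channels the uniform input is optimal, so the iteration converges immediately; for asymmetric channels the update provably climbs to the capacity-achieving input distribution.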
Coordination: rate-coordination function and its properties, quantization, lossy source coding theorem, converse to the coding theorem.
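A standard worked example in lossy source coding (shown here as an illustrative sketch, not course material) is the rate-distortion function of a Bernoulli(p) source under Hamming distortion, R(D) = H(p) − H(D) for 0 ≤ D ≤ min(p, 1−p), and zero beyond:

```python
from math import log2

def h2(x):
    """Binary entropy in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

def rate_distortion_bernoulli(p, D):
    """R(D) = H(p) - H(D) for a Bernoulli(p) source under Hamming
    distortion, valid for 0 <= D <= min(p, 1-p); zero rate beyond that."""
    return max(h2(p) - h2(D), 0.0)

# Allowing average distortion 0.25 on a fair-coin source cuts the
# required rate from 1 bit/symbol to 1 - H(0.25) ~= 0.189 bits/symbol.
print(rate_distortion_bernoulli(0.5, 0.25))
```

The curve makes the tradeoff concrete: D = 0 recovers the lossless rate H(p), and the rate drops to zero once guessing the more likely symbol already meets the distortion target.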
Joint source-channel coding: data processing, separation theorem.
Introduction to multi-user communication: networks, broadcast channel, multiple access channel (MAC), channel with states, relay channel, multi-terminal compression.
You should have seen some probability at the level of an introductory course on stochastic processes or equivalent. For instance, you should be familiar with terms such as i.i.d. random variables, expectation, and Gaussian random variables.