
Introduction to Information Theory
(Year: 3, Period: 2, Category: Elective)


[Course graph legend. Cognitive levels: Remember, Understand, Apply, Analyze, Evaluate, Create. Node types: Course, Knowledge Unit, Knowledge Point / Sub Knowledge Point.]

Course Objectives:
  • 1. (understand) Compute the probability of an event using the most common discrete probability distributions (Bernoulli, binomial, and geometric).
  • 2. (understand) Compute inverse probabilities using Bayes' rule (see the Bayes'-rule sketch after this list).
  • 3. (understand) Compute the means and variances of commonly used probability distributions.
  • 4. (understand) Compute the means and variances of sums or products of random variables with known distributions.
  • 5. (understand) Bound the probability of an extreme event using inequalities such as the Markov bound, Chebyshev's inequality, or Hoeffding's inequality.
  • 6. (understand) Compute the entropy of a random variable (see the entropy sketch after this list).
  • 7. (understand) Compute the mutual information between two random variables.
  • 8. (understand) Use entropy diagrams to reason about the relative sizes of the entropies, conditional entropies, and mutual information of two or three random variables.
  • 9. (understand) Use Jensen's inequality to bound the mean of a random variable defined in terms of a convex or concave function of another random variable.
  • 10. (understand) Construct a d-ary Huffman code for a random variable (see the Huffman sketch after this list).
  • 11. (understand) Use Kraft's inequality to check whether a prefix-free code can be constructed to fit certain codeword lengths.
  • 12. (understand) Bound the possible rate of lossless compression of output from a given source using Shannon's source coding theorem.
  • 13. (understand) Define a typical set and reason about its size, probability, and elements.
  • 14. (understand) Compute the Shannon–Fano–Elias codeword for a sample from a stochastic process.
  • 15. (understand) Compute the entropy rate of a Markov process (see the entropy-rate sketch after this list).
  • 16. (understand) Construct a probability model of a communication channel given a verbal description.
  • 17. (understand) Compute the channel capacity of a channel (see the capacity sketch after this list).
  • 18. (understand) Use Shannon's channel-coding theorem to bound the achievable rate of reliable communication over a channel.
  • 19. (understand) Use Bayes' rule to decode corrupted messages sent using an error-correcting code.
  • 20. (understand) Evaluate the rate and reliability of such error-correcting codes.
  • 21. (understand) Define the jointly typical sets of a source and channel, and use such sets to decode outputs from the channel.
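
Objectives 2 and 19 both turn on Bayes' rule. Below is a minimal Python sketch, not part of the course materials; the function name `bayes` and the screening-test numbers are made up for illustration:

```python
from fractions import Fraction

def bayes(prior, likelihood):
    """Posterior P(H | E) from a prior P(H) and likelihoods P(E | H)."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    evidence = sum(joint.values())            # P(E), by total probability
    return {h: joint[h] / evidence for h in joint}

# Hypothetical screening test: 1% prevalence, 99% sensitivity, 95% specificity.
posterior = bayes(
    prior={"sick": Fraction(1, 100), "healthy": Fraction(99, 100)},
    likelihood={"sick": Fraction(99, 100), "healthy": Fraction(5, 100)},
)
print(posterior["sick"])   # 1/6: most positive results are false positives here
```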
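For objectives 6 and 7, entropy and mutual information reduce to a few lines once the distributions are tabulated. A sketch, assuming the joint distribution is given as a row-major list of lists (function names are illustrative):

```python
import math

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for a joint distribution (list of rows)."""
    px = [sum(row) for row in joint]            # marginal of X
    py = [sum(col) for col in zip(*joint)]      # marginal of Y
    pxy = [p for row in joint for p in row]     # flattened joint
    return entropy(px) + entropy(py) - entropy(pxy)

print(entropy([0.5, 0.5]))                               # 1.0 bit (fair coin)
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0 (independent)
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0 (Y is a copy of X)
```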
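Objectives 10 and 11 are constructive. The sketch below builds a binary (d = 2) Huffman code with a heap and checks Kraft's inequality; the d-ary case generalizes by merging d nodes at a time (after padding with zero-probability symbols if needed). Names are illustrative:

```python
import heapq

def huffman_code(probs):
    """Binary Huffman code as {symbol: codeword} for probs = {symbol: p}."""
    # Heap entries: (probability, unique tiebreaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # the two least likely subtrees
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def satisfies_kraft(lengths, d=2):
    """Kraft: a d-ary prefix-free code with these lengths exists iff sum d^-l <= 1."""
    return sum(d ** -l for l in lengths) <= 1

code = huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125})
print(code)                          # {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(satisfies_kraft([len(w) for w in code.values()]))   # True (with equality)
```

This dyadic example meets the entropy bound exactly: the expected codeword length of 1.75 bits equals H(X).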
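For objective 15, the entropy rate of a stationary ergodic Markov chain is the stationary-weighted average of the row entropies of the transition matrix. A sketch, assuming a row-stochastic matrix and finding the stationary distribution by plain power iteration (the iteration count is an arbitrary illustrative choice):

```python
import math

def entropy_rate(P, iters=1000):
    """Entropy rate in bits of an ergodic Markov chain with transition matrix P."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):                    # power iteration -> stationary pi
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

    def row_entropy(row):
        return -sum(p * math.log2(p) for p in row if p > 0)

    return sum(pi[i] * row_entropy(P[i]) for i in range(n))

# Two-state chain that flips with probability 0.1: pi = (0.5, 0.5), so the
# rate equals the entropy of a Bernoulli(0.1) variable, about 0.469 bits.
print(entropy_rate([[0.9, 0.1], [0.1, 0.9]]))
```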
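Objective 17 has a closed form for the standard textbook case: a binary symmetric channel with crossover probability p has capacity C = 1 - H(p), where H is the binary entropy function. (General discrete memoryless channels need an optimization such as the Blahut–Arimoto algorithm, omitted here.)

```python
import math

def binary_entropy(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel, crossover prob. p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))    # 1.0: noiseless channel
print(bsc_capacity(0.11))   # ~0.50 bits per channel use
print(bsc_capacity(0.5))    # 0.0: output is independent of input
```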
