
Information Theory
EE 432
@ IIT Dharwad

B. N. Bharath
Assistant professor,
Department of Electrical, Electronics and Communication Engineering (EECE),
Academic Block A, 3rd floor, Office #316,
Chikkamalligewad, IIT Dharwad


This is an undergraduate-level information theory course.


Commenting

  • Please write your name and roll number followed by your comments. This helps me know who has commented. Also, please do not hesitate to comment: all comments will be taken seriously, and no comment is a stupid comment.

Logistics

The following student is the TA for this course (the list will be updated soon!):

  • Sumit Sah (PhD student), A1 building, 3rd floor, #316.

  • Special class on 28th January 2025 at 11 AM in CLT 1 103.

  • Evaluations: Assignments 10%, Class participation 10%, Quiz 15%, Midsem 25%, and a final exam 40%.

  • Quiz-1 will be conducted on 29th January, 2025 at 5 PM. The following topics will be covered in the quiz:

    • Basic probability theory (applying bounds)
    • More topics will be added by 20th of January 2025.
  • Repeat exams or evaluations will not be encouraged. If someone misses the quiz for a genuine reason, then I may use the final exam marks to grade, with a possible penalty. The penalty will be decided based on the reason for absence.


References


Announcements

  • All the announcements will be made here.

Assignment 1

Deadline to submit the assignment is 20th January, 2025, 5 PM in my office or the lab.


  1. Show that $H(X_1, X_2, \ldots, X_n) = \sum_{i=1}^{n} H(X_i)$ when the $X_i$'s are i.i.d. (10 points)

  2. Assume that you are computing an ML estimate of the bias of a coin by tossing the coin $n$ times. Assume that the bias is $p$. How many times should we toss so that the estimate satisfies the following:
     $$\Pr\{|\hat{p}_n - p| \le \epsilon\} \ge 1 - \delta,$$
     where $\hat{p}_n$ is the ML estimate, and $\delta > 0$. (20 points) A small numerical sanity-check sketch for this and the next two problems appears after this list.

  3. Find the entropy of (i) $X \sim \mathrm{Bern}(p)$, (ii) $X \sim \mathrm{Unif}\{0, 1, \ldots, N-1\}$. (10 points)

  4. Find the KL divergence between two Bernoulli distributions with biases $p_1$ and $p_2$. (15 points)

  5. Problems 1, 2, and 3 from Chapter 2 of the book. (10 points each)

  6. Reading assignment: Read and understand the solution to Problem 4 in Chapter 2 of the book. The solution can be found in Shannon's seminal paper (there are many, but you know which one I am referring to).

  7. Prove Kraft's inequality when $D$-ary symbols are considered. (15 points)
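
The numerical sanity-check sketch mentioned in problem 2 follows; it touches problems 2, 3, and 4 but does not replace the required proofs. It assumes Python with numpy; the helper names `bern_entropy` and `bern_kl`, and the parameter values, are illustrative choices of mine, not from the textbook.

```python
import numpy as np

def bern_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bern_kl(p1, p2):
    """KL divergence D(Bern(p1) || Bern(p2)) in bits, for p1, p2 in (0, 1)."""
    return p1 * np.log2(p1 / p2) + (1 - p1) * np.log2((1 - p1) / (1 - p2))

# Empirical check that the ML estimate (the sample mean) concentrates around p.
rng = np.random.default_rng(0)
p, n, eps, trials = 0.3, 2000, 0.02, 1000
hits = 0
for _ in range(trials):
    p_hat = rng.binomial(1, p, size=n).mean()   # ML estimate of the bias
    hits += abs(p_hat - p) <= eps

print("H(Bern(0.3))            =", round(bern_entropy(0.3), 4), "bits")
print("D(Bern(0.3)||Bern(0.5)) =", round(bern_kl(0.3, 0.5), 4), "bits")
print("empirical Pr{|p_hat - p| <= eps} =", hits / trials)
```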


Assignment 2

Deadline to submit the assignment is 29th of January 2025 (during the quiz).

  1. Solve problems 5, 6, 7, 9, 15, and 16 of Chapter 2 of the textbook. (10 points each)
  2. Solve problems 2 and 6 of Chapter 3 of the textbook. (15 points each)
  3. Solve problem 14. Construct a Shannon-Fano code for the distribution of the source in problem 14. (10 points each)
  4. Reading assignment: Read more about the Rényi entropy and understand the relationship between the Rényi entropy and the Shannon entropy. (A small numerical illustration appears after this list.)
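
For the reading assignment above, the following sketch numerically illustrates the order-$\alpha$ Rényi entropy $H_\alpha(X) = \frac{1}{1-\alpha}\log_2 \sum_i p_i^\alpha$ approaching the Shannon entropy as $\alpha \to 1$. It assumes Python with numpy; the pmf and the orders $\alpha$ are arbitrary illustrative choices.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in bits) of a pmf p; requires alpha != 1."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

def shannon_entropy(p):
    """Shannon entropy (in bits) of a pmf p with strictly positive entries."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log2(p))

pmf = [0.5, 0.25, 0.125, 0.125]
for alpha in [0.5, 0.9, 0.99, 0.999, 2.0]:
    print(f"H_{alpha}(X) = {renyi_entropy(pmf, alpha):.4f} bits")
print(f"Shannon H(X) = {shannon_entropy(pmf):.4f} bits (the alpha -> 1 limit)")
```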

Assignment 3

Deadline to submit the assignment is 4th of February 2025.

  1. Write a code to generate symbols from $X := \{1, 2, 3, 4, 5, 6\}$ according to a non-uniform distribution, and build (i) a Huffman code and (ii) a Shannon code. Think about the right metric to measure the performance and experimentally show that Huffman is optimal. (A possible starting-point sketch appears after this list.)

  2. Solve the following problems from Chapter 2:

    • (Shuffling increases entropy) Prob. 31,
    • (entropy of initial condition) Prob. 35,
    • (mixing increases entropy) Prob. 28,
    • (pure randomness and bent coin) Prob. 7, and
    • (metric) Prob. 15.
  3. Solve the following problems from Chapter 8:

    • ($Z$-channel capacity) Prob. 9,
    • (processing does not improve capacity) Prob. 1, and
    • (noisy typewriter) Prob. 7.
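
A possible starting point for problem 1 above (a minimal sketch, not a full solution): it assumes Python with numpy, the pmf on $\{1, \ldots, 6\}$ is an arbitrary non-uniform choice, and `huffman_lengths` is a small heap-based helper of mine. Expected codeword length (bits/symbol) is a natural metric here, since it is lower-bounded by the entropy $H(X)$ and Huffman coding minimizes it among prefix codes.

```python
import heapq, math
import numpy as np

symbols = [1, 2, 3, 4, 5, 6]
pmf = [0.4, 0.25, 0.15, 0.1, 0.06, 0.04]      # illustrative non-uniform distribution

def huffman_lengths(probs):
    """Return binary Huffman codeword lengths for a pmf (lengths suffice to compare rates)."""
    # Heap entries: (subtree probability, unique id, indices of symbols in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, g1 = heapq.heappop(heap)
        p2, _, g2 = heapq.heappop(heap)
        for i in g1 + g2:
            lengths[i] += 1                   # each merge adds one bit to every member
        heapq.heappush(heap, (p1 + p2, uid, g1 + g2))
        uid += 1
    return lengths

huff_len = huffman_lengths(pmf)
shannon_len = [math.ceil(-math.log2(p)) for p in pmf]   # Shannon code lengths ceil(log2(1/p))
H = -sum(p * math.log2(p) for p in pmf)

print("entropy H(X)      =", round(H, 4), "bits/symbol")
print("Huffman E[length] =", round(sum(p * l for p, l in zip(pmf, huff_len)), 4))
print("Shannon E[length] =", round(sum(p * l for p, l in zip(pmf, shannon_len)), 4))

# Empirical check: generate a long i.i.d. sample and measure the achieved bits per symbol.
rng = np.random.default_rng(0)
sample = rng.choice(len(symbols), size=100_000, p=pmf)
print("Huffman empirical bits/symbol =", np.mean([huff_len[s] for s in sample]))
```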

Assignment 4

  1. Find the differential entropy of (a) an exponential r.v., and (b) a Laplace r.v. (10 points) (A numerical-check sketch appears after this list.)
  2. Let $(X, Y)$ be jointly Gaussian with zero mean, $E(XY) = \rho\sigma^2$, and variance $\sigma^2$. Find the mutual information $I(X;Y)$. (30 points)
  3. Problems from Chapter 10:
     a. Problems 1, 2, and 3. (10 points each)
  4. Chapter 13:
     a. Problems 2, 3, 4, 5, and 8. (10 points each)
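
For problem 1 above, the numerical-check sketch below evaluates $h(f) = -\int f(x) \log_2 f(x)\, dx$ by quadrature, so a closed-form answer can be compared against it. It assumes Python with scipy/numpy; the parameter values ($\lambda = 2$ for the exponential, scale $b = 1.5$ for the Laplace) are illustrative choices of mine.

```python
import numpy as np
from scipy.integrate import quad

def differential_entropy(pdf, lo, hi, points=None):
    """Numerically evaluate h(f) = -int f(x) log2 f(x) dx over [lo, hi] (pdf > 0 there)."""
    integrand = lambda x: -pdf(x) * np.log2(pdf(x))
    value, _ = quad(integrand, lo, hi, points=points)
    return value

lam = 2.0                                          # illustrative rate of the exponential
exp_pdf = lambda x: lam * np.exp(-lam * x)
print("h(Exp(lam=2))     ~", round(differential_entropy(exp_pdf, 0.0, 50.0), 4), "bits")

b = 1.5                                            # illustrative scale of a zero-mean Laplace
lap_pdf = lambda x: np.exp(-abs(x) / b) / (2 * b)
print("h(Laplace(b=1.5)) ~", round(differential_entropy(lap_pdf, -60.0, 60.0, points=[0.0]), 4), "bits")
```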