# Information Theory $EE~432$ @ IIT Dharwad

B. N. Bharath
Assistant Professor, Department of Electrical, Electronics and Communication Engineering (EECE)
Academic Block A, $3$rd floor, Office # $316$
Chikkamalligewad, IIT Dharwad

---

This is an undergraduate-level information theory course.

---

## Commenting

- Please write your name and roll number followed by your comments. This helps me know who has commented. Also, please do not hesitate to comment. All comments will be taken seriously, and no comment is a stupid comment :grinning:

---

## Logistics

The following student is the TA for this course (the list will be updated soon!):

- Sumit Sah (PhD student), A$1$ building, $3$rd floor, #$316$.
- Special class on $28$th January $2025$ at $11$ AM in CLT 1 103.

**Evaluations:** Assignments $10\%$, class participation $10\%$, quiz $15\%$, midsem $25\%$, and a final exam $40\%$.

- <span style="color:red">Quiz-$1$ will be conducted on</span> $29$th January $2025$ at $5$ PM. The following topics will be covered in the quiz:
    - Basic probability theory (applying bounds)
    - More topics will be added by $20$th January $2025$
- Repeat exams or evaluations will not be encouraged. If someone misses the quiz for a genuine reason, then I may use the final exam marks for grading, with a possible penalty. The penalty will be decided based on the reason for absence.

---

## References

- Elements of Information Theory by T. M. Cover & J. A. Thomas, Wiley, 2006, ISBN 978-0471241959. [Click here](https://onlinelibrary.wiley.com/doi/book/10.1002/047174882X)
- Shannon's original paper (good read): [A Mathematical Theory of Communication](https://www.commsp.ee.ic.ac.uk/~cling/IT/A%20Mathematical%20Theory%20of%20Communication.pdf)
- Many online resources are available on this topic!

---

## Announcements

- All the announcements will be made here.

:::danger
### Assignment $1$
> Deadline to submit the assignment is $20$th January $2025$, $5$ PM, in my office or the lab.
:::

---

1. Show that $H(X_1, X_2, \ldots, X_n) = \sum_{i=1}^n H(X_i)$ when the $X_i$'s are i.i.d. **($10$ points)**
2. Assume that you are computing an ML estimate of the bias of a coin by tossing the coin $n$ times. Assume that the bias is $p$. How many times should we toss so that the estimate satisfies the following:
    \begin{equation}
    \Pr\{|\hat{p}_n - p| \leq \epsilon\} \geq 1 - \delta,
    \end{equation}
    where $\hat{p}_n$ is the ML estimate, and $\delta > 0$. **($20$ points)**
3. Find the entropy of (i) $X \sim \texttt{Bern}(p)$, (ii) $X \sim \texttt{Unif}\{0,1,\ldots,N-1\}$. **($10$ points)**
4. Find the KL divergence between two Bernoulli distributions with biases $p_1$ and $p_2$. **($15$ points)**
5. Problems $1$, $2$ and $3$ from Chapter $2$ of the book. **($10$ points each)**
6. **Reading assignment:** Read and understand the solution to Problem $4$ in Chapter $2$ of the book. The solution can be found in Shannon's seminal paper (there are many, but you know which one I am referring to :grinning:).
7. Prove Kraft's inequality when $D$-ary symbols are considered. **($15$ points)**

---

:::danger
**Assignment $2$**
Deadline to submit the assignment is $29$th of January $2025$ (during the quiz).
:::

1. Solve problems $5, 6, 7, 9, 15, 16$ of Chapter $2$ of the textbook. **($10$ points each)**
2. Solve problems $2$ and $6$ of Chapter $3$ of the textbook. **($15$ points each)**
3. Solve problem $14$. Construct a Shannon-Fano code for the distribution of the source in problem $14$. **($10$ points each)**
4. **Reading assignment:** Read more about the Rényi entropy and understand the relationship between the Rényi entropy and the Shannon entropy (a small numerical illustration is sketched just below).
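As a quick, ungraded companion to the reading assignment above, here is a minimal numerical sketch (in Python; the pmf `p` is an arbitrary example, not part of any assignment) of the relationship you should discover: the Rényi entropy $H_\alpha(X) = \frac{1}{1-\alpha}\log_2\sum_i p_i^\alpha$ approaches the Shannon entropy $H(X)$ as $\alpha \to 1$.

```python
import numpy as np

# An arbitrary example pmf (any valid pmf will do).
p = np.array([0.5, 0.25, 0.125, 0.125])

def shannon_entropy(p):
    """Shannon entropy in bits: H(X) = -sum_i p_i log2 p_i."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def renyi_entropy(p, alpha):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), in bits."""
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

print(f"Shannon entropy H(X)         : {shannon_entropy(p):.6f} bits")
for alpha in [0.5, 0.9, 0.99, 0.999, 1.001, 1.01, 1.1, 2.0]:
    print(f"Renyi entropy, alpha = {alpha:<5}: {renyi_entropy(p, alpha):.6f} bits")
```

As $\alpha$ gets closer to $1$ (from either side), the printed Rényi entropies converge to the Shannon entropy ($1.75$ bits for this example pmf).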
---

:::danger
**Assignment $3$**
Deadline to submit the assignment is $4$th of February $2025$.
:::

1. Write a code to generate symbols from $\mathcal X := \{1,2,3,4,5,6\}$ according to a non-uniform distribution, and build (i) a Huffman code and (ii) a Shannon code. Think about the right metric to measure the performance and experimentally show that Huffman is optimal (a minimal simulation starter is sketched at the end of this page).
2. Solve the following problems from Chapter $2$:
    - *(Shuffling increases entropy)* Prob. $31$,
    - *(Entropy of initial condition)* Prob. $35$,
    - *(Mixing increases entropy)* Prob. $28$,
    - *(Pure randomness and bent coin)* Prob. $7$, and
    - *(Metric)* Prob. $15$.
3. Solve the following problems from Chapter $8$:
    - *($Z$ channel capacity)* Prob. $9$,
    - *(Processing does not improve capacity)* Prob. $1$, and
    - *(Noisy typewriter)* Prob. $7$.

:::danger
**Assignment $4$**
:::

1. Find the differential entropy of (a) an exponential r.v., and (b) a Laplace r.v. **($10$ points)**
2. Let $(X,Y)$ be jointly Gaussian with zero mean, $\mathbb E(XY) = \rho \sigma^2$, and variance $\sigma^2$. Find the mutual information $I(X;Y)$. **($30$ points)**
3. Problems from Chapter $10$:
    a. Problems $1$, $2$ and $3$. **($10$ points each)**
4. Problems from Chapter $13$:
    a. Problems $2$, $3$, $4$, $5$ and $8$. **($10$ points each)**
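**Starter sketch for Assignment $3$, problem $1$ (referenced above).** This is only a minimal, non-authoritative skeleton: it draws i.i.d. symbols from a placeholder non-uniform pmf on $\mathcal X = \{1,\ldots,6\}$ and measures the empirical average codeword length of a given prefix code. Building the Huffman and Shannon codes, and comparing them against $H(X)$, is the actual exercise; the pmf and the fixed-length code below are placeholders only.

```python
import numpy as np

# Placeholder non-uniform pmf on the alphabet {1, ..., 6}; pick your own.
symbols = [1, 2, 3, 4, 5, 6]
p = [0.4, 0.2, 0.15, 0.1, 0.1, 0.05]

rng = np.random.default_rng(0)
n = 100_000
data = rng.choice(symbols, size=n, p=p)   # n i.i.d. draws from the pmf p

def average_length(code, data):
    """Empirical average codeword length (bits/symbol) of a code given as a
    dict mapping each symbol to a binary codeword string."""
    return sum(len(code[int(s)]) for s in data) / len(data)

# Placeholder: a 3-bit fixed-length code; replace with your Huffman and Shannon codes.
fixed_code = {s: format(s - 1, "03b") for s in symbols}

H = -sum(q * np.log2(q) for q in p)       # source entropy in bits/symbol
print(f"Source entropy H(X)           : {H:.4f} bits/symbol")
print(f"Avg length, fixed-length code : {average_length(fixed_code, data):.4f} bits/symbol")
```

A natural metric is the empirical average codeword length compared against $H(X)$: among all prefix codes, Huffman should achieve the smallest average length, within one bit of the entropy.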