Sub-topic 3: Shannon Capacity Theorem
The 3 most important aspects:
Entropy theory, the model of a digital communication system, and Shannon capacity.
Shannon defined the fundamental problem of communication as that of "reproducing at one point either exactly or approximately a message selected at another point."
Shannon’s Information Theory:
Information is measured by entropy. Entropy measures the amount of uncertainty in a random variable, or in the outcome of a random process, described by a probability distribution.
H(X) = -Σ_i p_i log2(p_i)

Fig. 1 Shannon's entropy (H) in units of bits.
Here p_i is the probability of occurrence of the i-th possible value of the source symbol, and the logarithm is taken in base 2 so that the entropy is expressed in bits. Examples of sources described by such a probability mass function are tossing a die or flipping a coin.
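As a sanity check, the entropy formula can be evaluated directly for a fair die. This is a minimal sketch; the `entropy` helper is illustrative, not part of the original text:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair six-sided die: each face has probability 1/6.
die = [1/6] * 6
print(entropy(die))  # log2(6) ≈ 2.585 bits
```

For a uniform source with m equally likely outcomes, the entropy reduces to log2(m), which is why the fair die yields log2(6) bits.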
The special case of information entropy for a random variable with two outcomes is the binary entropy function:
H(p) = -p log2(p) - (1 - p) log2(1 - p)

Fig. 2 The binary entropy function.
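The binary entropy function can be evaluated at a few points; this small sketch (an illustrative helper, not from the original) shows that a fair coin carries exactly one bit of uncertainty:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p); taken as 0 at p = 0 or 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))   # 1.0 bit: a fair coin is maximally uncertain
print(binary_entropy(0.11))  # ≈ 0.5 bits: a biased coin is less uncertain
```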
Joint entropy is another type of entropy: when there are two discrete random variables X and Y, the joint entropy is simply the entropy of their pairing (X, Y). If X and Y are independent, their joint entropy is the sum of their individual entropies.
H(X, Y) = -Σ_x Σ_y p(x, y) log2 p(x, y)

Fig. 3 Joint entropy of two discrete random variables X and Y.
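The additivity of entropy for independent variables can be verified numerically. A minimal sketch, using a fair coin and a fair die as the two independent sources (illustrative choices, not from the original):

```python
import math
from itertools import product

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two independent random variables: a fair coin X and a fair die Y.
px = [0.5, 0.5]
py = [1/6] * 6

# Joint distribution of the pair (X, Y): p(x, y) = p(x) * p(y).
joint = [a * b for a, b in product(px, py)]

h_joint = entropy(joint)
print(h_joint)                    # H(X, Y)
print(entropy(px) + entropy(py))  # H(X) + H(Y): equal when X, Y independent
```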
Entropy plays a central role in the applications of information theory, which can be subdivided into two parts: source coding (data compression) and channel coding. Information theory uses a statistical description of data to measure the number of bits needed to describe the data.
Shannon's source coding theorem establishes the limits of data compression using the probability theory behind Shannon's entropy. Shannon showed that to reliably store the information generated by a random source X, one needs, on average, no more and no less than H(X) bits per outcome.
Huffman coding is a technique that provides a systematic method for approaching the optimal compression ratio guaranteed by Shannon. Huffman codes are applied in the compression and transmission of data in fields such as fax, modems, HDTV, computer networks, etc.
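A minimal sketch of Huffman coding, built with Python's `heapq`; the repeated merging of the two least frequent subtrees is the core of the algorithm (the example text and helper names are illustrative):

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman prefix code from a {symbol: frequency} mapping."""
    # Each heap entry: (weight, tie-breaker, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        # Prepend a distinguishing bit to every code in each merged subtree.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(Counter(text))
encoded = "".join(code[ch] for ch in text)
print(code)
print(len(encoded), "bits vs", 8 * len(text), "bits uncompressed")
```

Frequent symbols end up near the root of the tree and get short codes, which is how the average code length approaches the source entropy H(X).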
In computer networks, the major channel impairment is packet loss. A lost packet can be modeled as an erasure; the resulting model is called the binary erasure channel (BEC), meaning that data are erased or lost during transmission.
Fig. 4 The binary erasure channel (BEC).
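The erasure model can be simulated in a few lines. This is a sketch under the assumption that an erased bit is represented by `None`; the seeds and sample size are arbitrary illustrative choices:

```python
import random

def binary_erasure_channel(bits, p, rng=None):
    """Simulate a BEC: each bit is erased (None) with probability p,
    otherwise delivered unchanged."""
    rng = rng or random.Random(0)
    return [None if rng.random() < p else b for b in bits]

rng = random.Random(42)
bits = [rng.randint(0, 1) for _ in range(10000)]
p = 0.2
received = binary_erasure_channel(bits, p)
erased = sum(1 for b in received if b is None)
print(erased / len(bits))  # empirical erasure rate, close to p = 0.2
```

The capacity of the BEC is C = 1 - p bits per channel use: the fraction of bits that, on average, survive the channel.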
Shannon's answer was to associate with every channel a computable quantity called its capacity, C. Reliable communication is possible only if the data rate stays below C.
Model of a digital communication system
Fig. 5 Model of a digital communication system.
Noiseless channel
With n bits one can distinguish between 2^n possible messages, and the smallest average number of bits per message is given by the entropy function.
Noisy channel
Even when noise contaminates a communication channel, it is possible to communicate nearly error-free up to a computable maximum rate through the channel. Shannon's capacity, or Shannon limit, is the maximum rate of error-free data transfer when the link is subject to random transmission errors at a particular noise level. The types of noisy channel to consider are the binary symmetric channel (BSC), the binary erasure channel (BEC), and the noisy typewriter channel.
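The capacities of the two channels named above have closed forms: C = 1 - H(p) for the BSC with crossover probability p, and C = 1 - p for the BEC with erasure probability p. A short sketch evaluating both (helper names are illustrative):

```python
import math

def binary_entropy(p):
    """H(p) in bits; taken as 0 at p = 0 or 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel, crossover probability p."""
    return 1 - binary_entropy(p)

def bec_capacity(p):
    """Capacity of the binary erasure channel, erasure probability p."""
    return 1 - p

for p in (0.0, 0.1, 0.5):
    print(f"p={p}: BSC C={bsc_capacity(p):.3f}, BEC C={bec_capacity(p):.3f}")
```

Note the extremes: a noiseless BSC (p = 0) has capacity 1 bit per use, while at p = 0.5 the output is independent of the input and the capacity drops to zero.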