
Topic 3: Fundamental Knowledge for Wireless Communications

Sub-topic 3: Shannon Capacity Theorem

The 3 most important aspects:
Entropy theory, the model of a digital communication system, and Shannon capacity.

Shannon’s definition of communication: “The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.”

Shannon’s Information Theory:

  • Measurement of information (entropy)
  • Coding Theory:
    • Source coding theory
    • Channel coding theory

A. Measurement of Information

Information is measured by entropy. Entropy measures the amount of uncertainty in a random variable or in the outcome of a random process. This random variable or process is described by a probability distribution.

$$H = -\sum_{i} p_i \log_2 p_i$$

Fig. 1 Shannon’s entropy (H) in units of bits.

Here $p_i$ is the probability of occurrence of the $i$-th possible value of the source symbol, and the logarithm is base 2 because the entropy is in units of bits. Examples for this equation, which is based on a probability mass function, are tossing a die or a coin.
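To make the formula concrete, here is a minimal Python sketch (the helper name `entropy` is illustrative, not from the original notes):

```python
import math

def entropy(pmf):
    """Shannon entropy H = -sum(p_i * log2(p_i)) in bits.

    Terms with p_i = 0 contribute nothing (0 * log2(0) -> 0 by convention).
    """
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# A fair coin: two outcomes with probability 1/2 each -> 1 bit.
print(entropy([0.5, 0.5]))   # 1.0

# A fair six-sided die: six outcomes with probability 1/6 each.
print(entropy([1/6] * 6))    # log2(6) ~ 2.585 bits
```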

The special case of information entropy for a random variable with two outcomes is the binary entropy function:

$$H_b(p) = -p \log_2 p - (1 - p) \log_2 (1 - p)$$

Fig. 2 The binary entropy function.
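A quick sketch evaluating this special case at a few probabilities (the helper name `binary_entropy` is again illustrative):

```python
import math

def binary_entropy(p):
    """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5, 0.9):
    print(f"H_b({p}) = {binary_entropy(p):.3f} bits")
# Symmetric: H_b(p) = H_b(1 - p); the maximum of 1 bit is at p = 0.5,
# where a (fair) coin toss is maximally uncertain.
```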

Joint entropy is another type of entropy: when there are two discrete random variables X and Y, it is simply the entropy of their pairing (X, Y). If X and Y are independent, their joint entropy is the sum of their individual entropies.

$$H(X, Y) = -\sum_{x}\sum_{y} p(x, y) \log_2 p(x, y)$$

Fig. 3 Joint entropy of X and Y.
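The additivity for independent variables can be checked numerically; a minimal sketch, assuming X is a fair coin and Y a fair six-sided die:

```python
import math
from itertools import product

def entropy(pmf):
    return -sum(p * math.log2(p) for p in pmf if p > 0)

px = [0.5, 0.5]    # fair coin
py = [1/6] * 6     # fair die

# For independent X and Y, the joint PMF is the product of the marginals.
pxy = [a * b for a, b in product(px, py)]

print(entropy(px) + entropy(py))  # H(X) + H(Y) ~ 3.585 bits
print(entropy(pxy))               # H(X, Y)    ~ 3.585 bits (equal, by independence)
```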

B. Coding Theory

Coding theory plays a central role in the applications of information theory, and it can be subdivided into two parts, listed below. Information theory uses a statistical description of data to measure the number of bits needed to describe the data.

  1. Data compression (source coding)
    • Lossless data compression (the original data can be reconstructed exactly)
    • Lossy data compression (the data is reconstructed to a specified accuracy, with a distortion function from rate-distortion theory measuring the loss)
  2. Error correcting codes (channel coding)
    • Error-correcting codes add redundancy so that data can be transmitted reliably across a noisy channel, whereas data compression removes redundancy for efficiency.

B.1. Shannon’s Source Coding

Shannon’s source coding theorem establishes the limits of data compression, using the probability theory behind Shannon’s entropy. Shannon showed that to reliably store the information generated by some random source X, one needs, on average, no fewer than H(X) bits per outcome.

The Huffman code is a coding technique that provides a systematic method for achieving the optimal compression ratio guaranteed by Shannon. Huffman codes are applied in the compression and transmission of data in fields such as fax, modems, HDTV, computer networks, etc.
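Below is a minimal sketch of the greedy Huffman construction (the function name `huffman_codes` and the tie-breaking counter are illustrative choices, not a reference implementation):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table for the symbols of `text`.

    Repeatedly merges the two least-frequent subtrees; the path from the
    root to each leaf (0 = left, 1 = right) becomes that symbol's codeword.
    """
    freq = Counter(text)
    # Heap entries are (weight, tie-breaker, tree); a tree is either a
    # plain symbol (leaf) or a (left, right) pair (internal node).
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, tie, (left, right)))
        tie += 1
    codes = {}
    def walk(tree, prefix=""):
        if isinstance(tree, tuple):       # internal node: recurse
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                             # leaf: record the codeword
            codes[tree] = prefix or "0"   # degenerate single-symbol case
    walk(heap[0][2])
    return codes

print(huffman_codes("abracadabra"))
# {'a': '0', 'c': '100', 'd': '101', 'b': '110', 'r': '111'}
```

Note that the frequent symbol 'a' gets a 1-bit codeword while rare symbols get 3-bit ones, which is how the average length approaches the entropy H(X).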

B.2. Shannon’s Channel Coding

In the computer network case, the major channel impairment is packet loss. Packet loss can be modeled as an erasure: this model is called the binary erasure channel, meaning that data is erased or lost during transmission.

Fig. 4 The binary erasure channel.

Rather than simple repetition, Shannon’s answer was that for any given channel one can compute a quantity called the capacity, C, of that channel. Reliable communication is possible only if the data rate stays below C.

Model of a digital communication system

Fig. 5 Model of a digital communication system.

Noiseless channel

Fig. 6 The noiseless channel.

With $n$ bits one can distinguish between $2^n$ possible messages, and the smallest average number of bits per message is given by the entropy function.
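For instance, a source that selects uniformly among $M$ equally likely messages needs

$$n = \lceil \log_2 M \rceil \ \text{bits, e.g. } M = 8 \text{ messages need } \log_2 8 = 3 \text{ bits.}$$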

Noisy channel

Even when noise contaminates a communication channel, it is possible to communicate nearly error-free up to a computable maximum rate through the channel. Shannon’s capacity (or Shannon limit) is the maximum rate of error-free data transfer when the link is subject to random data transmission errors (for a particular noise level). The types of noisy channels to consider are the binary symmetric channel (BSC), the binary erasure channel (BEC), and the noisy typewriter channel.
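As a sketch of how these limits are computed, the standard capacity formulas for the BSC (crossover probability p) and the BEC (erasure probability e) can be evaluated directly; the function names here are illustrative:

```python
import math

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# BSC: each bit is flipped with probability p, so C = 1 - H_b(p)
# bits per channel use.
def bsc_capacity(p):
    return 1 - binary_entropy(p)

# BEC: each bit is erased with probability e, so C = 1 - e
# bits per channel use.
def bec_capacity(e):
    return 1 - e

print(bsc_capacity(0.1))  # ~0.531: reliable rates must stay below this
print(bec_capacity(0.1))  # 0.9
```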