# Fundamental Knowledge for Wireless Communications
###### tags: `5G O-RAN` `IVC` `5G`
## Model of a Digital Communication System

### Terminology
1. Data Compression: Zip (Source Encoding)
2. Data Decompression: Unzip (Source Decoding)
3. Error Protection: Add CRC (Channel Encoding)
4. Error Correction: Verify CRC (Channel Decoding)
## Entropy Function
Using probability theory, Shannon showed that information can be measured in bits, via the entropy of the source:

$H(X)=-\displaystyle\sum_{i} p_i\log_2{p_i}$

:::info
Example:
To store each outcome of a fair die, we need only 2.585 bits:
$-\displaystyle\sum_{i=1}^{6}\dfrac{1}{6}\log_{2}{\dfrac{1}{6}}=\log_{2}{(6)}=2.5849...$
:::
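A one-line numeric check of the fair-die value above (nothing here beyond the formula itself, written as a sketch in Python):

```python
import math

outcomes = 6
# Entropy of a uniform distribution over 6 outcomes: -sum p*log2(p)
entropy = -sum((1 / outcomes) * math.log2(1 / outcomes) for _ in range(outcomes))
print(entropy)  # 2.5849... bits, i.e. log2(6)
```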
### Binary Entropy Function
$H_b(p)=-p\log_2{p}-(1-p)\log_2{(1-p)}$

The entropy is maximized at 1 bit per trial when the two possible outcomes are equally probable, as in an unbiased coin toss.
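A short numeric sketch of $H_b(p)$ (the `binary_entropy` name is my own, not from the text), showing the maximum at $p=0.5$:

```python
import math

def binary_entropy(p):
    """H_b(p) = -p*log2(p) - (1-p)*log2(1-p), with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"H_b({p}) = {binary_entropy(p):.4f} bits")
# The maximum of 1 bit per trial occurs at p = 0.5, the unbiased coin toss.
```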
## Huffman code
Huffman codes are used in nearly every application that involves the compression and transmission of digital data. A Huffman code is a prefix code that is optimal among symbol-by-symbol codes, coming within one bit per symbol of the entropy limit guaranteed by Shannon.
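As a minimal sketch of the greedy tree-merging procedure (the `huffman_code` helper and the sample string are my own, not from any particular library), using Python's `heapq`:

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a prefix code from a {symbol: frequency} mapping."""
    # Heap entries: (frequency, tie-breaker, {symbol: codeword-so-far})
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least-frequent subtrees.
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

text = "this is an example of huffman coding"
code = huffman_code(Counter(text))
encoded = "".join(code[ch] for ch in text)
print(code)
print(f"{len(text) * 8} bits raw -> {len(encoded)} bits encoded")
```

The repeated merge of the two least-frequent subtrees is exactly the greedy step that makes the resulting prefix code optimal among symbol-by-symbol codes.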
## Erasure
Impairments such as "packet loss" can be viewed as erasures: the receiver knows which symbols are missing, but not their values.

## Shannon's Channel Coding Theorem
>Reliable communication is possible only if the data rate stays below the channel capacity $C$; for the erasure channel with erasure probability $p$ (erasures but no bit flips), $C = 1-p$.

## Noisy Channel

With any communications system, the signal that is received may differ from the signal that is transmitted due to various transmission impairments. For digital signals, bit errors may be introduced, such that a binary 1 is transformed into a binary 0 or vice versa.
### Solutions
1. Repetition code:
Replace each bit with 3 copies of itself and decode each block of 3 by majority vote.
For an $m$-bit message we then have $n = 3m$ and probability of block error
$1-((1-p)^3+3p(1-p)^2)^m=1-(1-3p^2+2p^3)^m$
Note that $1-p<1-3p^2+2p^3$ if $0<p<\dfrac{1}{2}$, so each bit is more reliable than it would be uncoded (see the simulation sketch after this list).
2. Hamming code (a Hamming(7,4) sketch follows this list):
- Able to correct any single bit error
- Able to detect (though not simultaneously correct) any 2 bit errors, since the minimum distance is 3
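A Monte-Carlo sketch of the 3-repetition code over a binary symmetric channel with crossover probability $p$ (the simulation setup and names are my own assumptions, not from the text); it checks the block-error formula above:

```python
import random

def transmit(bit, p):
    """Flip the bit with probability p (binary symmetric channel)."""
    return bit ^ (random.random() < p)

def send_repetition(bits, p):
    """Encode each bit as 3 copies, decode by majority vote."""
    decoded = []
    for b in bits:
        received = [transmit(b, p) for _ in range(3)]
        decoded.append(1 if sum(received) >= 2 else 0)
    return decoded

p, m, trials = 0.1, 8, 100_000
errors = 0
for _ in range(trials):
    msg = [random.randint(0, 1) for _ in range(m)]
    if send_repetition(msg, p) != msg:
        errors += 1

# Compare against 1 - (1 - 3p^2 + 2p^3)^m from the notes above.
predicted = 1 - (1 - 3 * p**2 + 2 * p**3) ** m
print(f"simulated {errors / trials:.4f} vs predicted {predicted:.4f}")
```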
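And a minimal Hamming(7,4) sketch, assuming the common convention that positions 1, 2, and 4 hold the parity bits (the function names and layout are my own, not necessarily what the course uses):

```python
def hamming74_encode(d):
    """d = [d1, d2, d3, d4] -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Return the corrected 4 data bits; the syndrome points at the error."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3   # 1-based position of the flipped bit
    if syndrome:
        c = c.copy()
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
code = hamming74_encode(word)
code[3] ^= 1                          # inject a single bit error
assert hamming74_decode(code) == word # the syndrome locates and fixes it
```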