# Ch2. Performance Analysis.
- For Code, please visit- https://colab.research.google.com/drive/1HUOtDmb9OWRRBEawgQXp5yjHl6XPEpZ-?usp=sharing
## Motivation
**The problem is that estimating the runtime of a program precisely is hard.** If you really, really want to know how long a particular program will take to run on a particular computer, it's a huge mess: it depends on all kinds of fine details about how the program works, and all kinds of fine details about the computer, how fast it is, what kind of system architecture it has.
**And we don't want to go through this huge mess every single time we analyze an algorithm.** So we need something that's a little less precise but much easier to work with. The basic idea is this: lots of factors affect the final runtime, but **most of them only change the runtime by a constant multiple.** If you're running on a computer that's a hundred times faster, it takes one one-hundredth of the time, a constant multiple. If your system architecture has multiplications that take three times as long as additions, then a multiplication-heavy program might take three times as long, but it's only a factor of three. So the key idea is to **come up with a measure of runtime complexity that ignores all of these constant multiples**, where running in **time $n$** and running in **time $100n$** are considered the same thing. Then we don't have to worry about all of these little details that affect runtime.
## Before the analysis, here are some math formulas and practice sets that serve as a foundation
## 1. Laws of Logarithms
* **Power Rule:** $\log_{a}(n^k) = k \log_{a}n$
* **Product Rule:** $\log_{a}(nm) = \log_{a}n + \log_{a}m$
* **Exponent and Power Rule:** $n^{\log_{a}b} = b^{\log_{a}n}$
* **Change of Base (Chain Rule):** $\log_{a}n \cdot \log_{b}a = \log_{b}n$
## 2. Logarithm Questions Practice sets
Is it true that $(\log_{5}n)^2 = 2 \log_{5}n$?
Is it true that $\log_{2}n \cdot \log_{3}2 = \log_{3}n$?
Is it true that $n^{\log_{2}n} = n$?
Is it true that $\log_{3}(2n) = \log_{3}2 \cdot \log_{3}n$?
Is it true that $\log_{10}(n^2) = 2 \log_{10}n$?
Is it true that $n^{\log_{7}3} = 7^{\log_{3}n}$?
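These questions can be spot-checked numerically against the laws above. The sketch below picks an arbitrary sample value $n = 16$ (my choice; any $n > 1$ gives the same verdicts for these particular questions) and tests each claimed identity with a floating-point tolerance:

```python
import math

n = 16  # arbitrary sample value for spot checks

def close(a, b):
    return math.isclose(a, b, rel_tol=1e-9)

q1 = close(math.log(n, 5) ** 2, 2 * math.log(n, 5))              # False: (log n)^2 != 2 log n in general
q2 = close(math.log2(n) * math.log(2, 3), math.log(n, 3))        # True: change of base
q3 = close(n ** math.log2(n), n)                                 # False: n^(log2 n) grows much faster than n
q4 = close(math.log(2 * n, 3), math.log(2, 3) * math.log(n, 3))  # False: the product rule gives a SUM, not a product
q5 = close(math.log10(n ** 2), 2 * math.log10(n))                # True: power rule
q6 = close(n ** math.log(3, 7), 7 ** math.log(n, 3))             # False: base and exponent were swapped incorrectly
q6_fixed = close(n ** math.log(3, 7), 3 ** math.log(n, 7))       # True: the correct identity n^(log_a b) = b^(log_a n)
print(q1, q2, q3, q4, q5, q6, q6_fixed)
```

Note that a numeric check at one sample point can refute an identity but not prove one; the `True` cases follow from the rules in section 1.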
## 3. Hierarchy of growth rate
Big O notation ($O(g(n))$) describes the asymptotic upper bound of a function's growth rate. A function $f(n)$ is $O(g(n))$ if there exist positive constants $c$ and $n_0$ such that for all $n \ge n_0$, $f(n) \le c \cdot g(n)$.
The general hierarchy of growth rates (from slowest to fastest) is:
$$\text{constant} < \log(n) < n^k < a^n < n! \quad (\text{for constants } k > 0,\ a > 1)$$
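To get a feel for this ordering, the sketch below evaluates one representative of each class at a single sample point $n = 20$ (an arbitrary choice, using $n^2$ for $n^k$ and $2^n$ for $a^n$):

```python
import math

n = 20  # arbitrary sample size
values = {
    "constant": 1,
    "log2(n)": math.log2(n),
    "n": n,
    "n^2": n ** 2,
    "2^n": 2 ** n,
    "n!": math.factorial(n),
}
# Printed top to bottom, the values increase dramatically,
# matching the hierarchy of growth rates.
for name, v in values.items():
    print(f"{name:>8}: {v:,.2f}")
```

A single sample point only illustrates the hierarchy; the ordering is a statement about behavior as $n \to \infty$.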
## 4. Big O notation practice sets
Is it true that $\log_{2}n = O(n^2)$?
Is it true that $n \log_{2}n = O(n)$?
Is it true that $n^2 = O(n^3)$?
Is it true that $n = O(\sqrt{n})$?
Is it true that $n^5 = O(2^{3 \log_{2}n})$?
Is it true that $2^n = O(2^{n+1})$?
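One informal way to probe these questions (a sketch, not a proof: a bounded ratio over a few sample points only *suggests* $f = O(g)$) is to watch whether the ratio $f(n)/g(n)$ stays bounded as $n$ grows:

```python
import math

def ratios(f, g, ns=(10, 100, 1000, 10000)):
    """Sample f(n)/g(n) at increasing n; a bounded ratio is consistent with f = O(g)."""
    return [f(n) / g(n) for n in ns]

# log2(n) = O(n^2): the ratio shrinks toward 0, so it is bounded.
print(ratios(math.log2, lambda n: n ** 2))

# n*log2(n) = O(n)? The ratio equals log2(n), which grows without bound, so no.
print(ratios(lambda n: n * math.log2(n), lambda n: n))
```

The same trick resolves the trickier questions above: for example $2^{3 \log_{2}n} = n^3$, so asking whether $n^5 = O(2^{3 \log_{2}n})$ is asking whether $n^5 = O(n^3)$, and the ratio $n^2$ diverges.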
## 5. Space Complexity and Time Complexity
$S(P) = c + S_p$
where $c$ is the fixed part and $S_p$ is the variable part.
$T(P) = c + T_p$
where $c$ is the compile time and $T_p$ is the run time.
We only focus on the **variable part** of space complexity and the **run time** of time complexity, since these are the parts that depend on the input.
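To make the variable part $S_p$ concrete, here is a small sketch (the function names are my own, purely illustrative). Both functions compute the same sum of $1$ through $n$, but their variable space differs:

```python
def sum_constant_space(n):
    """S_p = O(1): only a fixed number of scalar variables, regardless of n."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_linear_space(n):
    """S_p = O(n): materializes an n-element list before summing it."""
    values = list(range(1, n + 1))
    return sum(values)

print(sum_constant_space(100), sum_linear_space(100))  # both print 5050
```

Both have the same answer and the same $O(n)$ run time, yet their space complexities are $O(1)$ and $O(n)$ respectively, which is exactly the distinction $S_p$ captures.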
## Precise Runtime Estimation
Sources a detailed analysis would have to account for:
1. Computer speed
2. System architecture (CPU operations)
3. Compiler optimization
4. Memory hierarchy (cache, RAM, disk lookup speed)
# 📈 Asymptotic Notations for Complexity Analysis
Asymptotic notations are mathematical tools used to describe the growth rate of a function's complexity (time or space) as the input size ($n$) approaches infinity.
---
| Notation | Mathematical Definition | Meaning |
| :---: | :--- | :--- |
| **Big-O** ($O$) | $$f(n) = O(g(n)) \text{ if } \exists \text{ positive constants } c \text{ and } n_0 \text{ s.t. } f(n) \le c \cdot g(n) \text{ for all } n > n_0.$$ | **Upper Bound.** Describes the growth rate of the **Worst Case**. Put simply, we take an upper bound, in the spirit of taking a limit so that constants and non-dominant terms are ignored. |
| **Big-Omega** ($\Omega$) | $$f(n) = \Omega(g(n)) \text{ if } \exists \text{ positive constants } c \text{ and } n_0 \text{ s.t. } f(n) \ge c \cdot g(n) \text{ for all } n > n_0.$$ | **Lower Bound.** Describes the growth rate of the **Best Case**. We take a lower bound. |
| **Big-Theta** ($\Theta$) | $$f(n) = \Theta(g(n)) \text{ if } \exists \text{ positive constants } c_1, c_2 \text{ and } n_0 \text{ s.t. } c_1 \cdot g(n) \le f(n) \le c_2 \cdot g(n) \text{ for all } n > n_0.$$ | **Tight Bound.** Describes the growth rate of the **Average Case**; the upper and lower bounds are taken together. |
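The Big-O definition can be exercised directly. For example, to argue that $3n^2 + 10n = O(n^2)$, we can exhibit witness constants $c = 4$ and $n_0 = 10$ (my choice; many other pairs work) and check the inequality over a finite range:

```python
def f(n):
    return 3 * n * n + 10 * n

def g(n):
    return n * n

# Witnesses: 3n^2 + 10n <= 4n^2 holds exactly when 10n <= n^2, i.e. n >= 10.
c, n0 = 4, 10
print(all(f(n) <= c * g(n) for n in range(n0, 10_001)))  # True over this range
```

Checking a finite range is only evidence; the algebraic step (for $n \ge 10$, $10n \le n^2$, hence $3n^2 + 10n \le 4n^2$) is what actually closes the proof for all $n > n_0$.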