# Master Theorem

Given a divide and conquer algorithm with a time complexity described as

$$
\begin{align*}
T(n)&=aT\Big(\frac{n}{b}\Big) + \Theta(n^c)\\
T(1)&=O(1)
\end{align*}
$$

- $\Theta(n^c)$ is the total time complexity of dividing, combining, etc.
- $a$ is the number of subproblems created from division
- $b$ is the reduction factor for each subproblem

| If | Then |
| :-----: | :---------------------------------: |
| $a<b^c$ | $T(n)=\Theta(n^c)$ |
| $a=b^c$ | $T(n)=\Theta(n^{\log_b a}\log_b n)$ |
| $a>b^c$ | $T(n)=\Theta(n^{\log_b a})$ |

## Proof for the Master Theorem

We start the proof by solving the recurrence relation described above:

$$
\begin{align*}
T(n)&=aT\Big(\frac{n}{b}\Big) + n^c\\
&=a^kT\Big(\frac{n}{b^k}\Big)+n^c\Bigg[1+\Big(\frac{a}{b^c}\Big)^1+\Big(\frac{a}{b^c}\Big)^2+\Big(\frac{a}{b^c}\Big)^3+\cdots+\Big(\frac{a}{b^c}\Big)^{k-1}\Bigg]\\
&\text{(the recursion stops when $\frac{n}{b^k}=1$, i.e. $k=\log_b n$)}\\
&=a^{\log_b n}+n^c\sum_{i=0}^{k-1}\Big(\frac{a}{b^c}\Big)^i\\
&=n^{\log_b a}+n^c\sum_{i=0}^{k-1}\Big(\frac{a}{b^c}\Big)^i
\end{align*}
$$

This leaves us with two terms: the total work done on conquering ($C(n)=n^{\log_b a}$) and the total work done on dividing, combining, etc. ($D(n)=n^c\sum_{i=0}^{k-1}(\frac{a}{b^c})^i$). The overall time complexity depends on which of the two terms dominates, which is ultimately determined by the relationship between the constants $a$, $b$, and $c$: specifically, whether the ratio $\frac{a}{b^c}$ is greater than, less than, or equal to 1.

### Case 1

If $a<b^c$ then the ratio $\frac{a}{b^c}<1$, which means that the geometric sum is decreasing. The biggest term in a decreasing geometric sum is the first term, which is 1. Since we are working with asymptotic complexities, this is the only term that matters. This means that the time complexity of dividing, combining, etc. is simply $n^c$.
$$
D(n)=\Theta(n^c)
$$

This means that the total time complexity of the whole divide and conquer algorithm is:

$$
T(n)=\Theta(n^{\log_b a}+n^c)
$$

Since $a<b^c$ in this case, we also know that $\log_b a<c$, which means that the time complexity in the first case of the master theorem is

$$
T(n)=\Theta(n^c)
$$

### Case 2

If $a=b^c$ then the ratio $\frac{a}{b^c}=1$, which means that all terms in the geometric sum are equal to one. There are $k$ terms in the geometric sum, and we know that $k=\log_b n$, which means that the time complexity of dividing and combining is

$$
D(n)=\Theta(n^c\log_b n)
$$

This means that the time complexity of the divide and conquer algorithm is

$$
T(n)=\Theta(n^{\log_b a}+n^c\log_b n)
$$

Since $a=b^c$, we have $\log_b a=c$, therefore the time complexity is

$$
T(n)=\Theta(n^{\log_b a}\log_b n)
$$

### Case 3

If $a>b^c$ then the ratio $\frac{a}{b^c}>1$, which means that the geometric sum is increasing. This means that the biggest term in the series is the last term, which is $(\frac{a}{b^c})^{k-1}$. We can then solve for the time complexity of $D(n)$ by substituting $\log_b n$ for $k$.
$$
\begin{align*}
D(n)&=\Theta\Big(n^c\Big(\frac{a}{b^c}\Big)^{k-1}\Big)\\
&=\Theta\Big(n^c\Big(\frac{a}{b^c}\Big)^{\log_b n}\Big(\frac{b^c}{a}\Big)\Big)\\
&=\Theta(n^c n^{\log_b \frac{a}{b^c}}) \tag{$\frac{b^c}{a}$ is dropped since it is a constant}\\
&=\Theta(n^{c+\log_b \frac{a}{b^c}})\\
&=\Theta(n^{\log_b b^c+\log_b \frac{a}{b^c}})\\
&=\Theta\big(n^{\log_b \big(b^c\cdot\frac{a}{b^c}\big)}\big)\\
D(n)&=\Theta(n^{\log_b a})
\end{align*}
$$

Which means that the total time complexity is

$$
T(n)=\Theta(n^{\log_b a}+n^{\log_b a})
$$

Or simply:

$$
T(n)=\Theta(n^{\log_b a})
$$

## Another look at the proof

The master theorem can also be proven by directly solving the recurrence relation

$$
\begin{align*}
T(n)&=aT\Big(\frac{n}{b}\Big) + n^c\\
&=a^kT\Big(\frac{n}{b^k}\Big)+n^c\Bigg[1+\Big(\frac{a}{b^c}\Big)^1+\Big(\frac{a}{b^c}\Big)^2+\Big(\frac{a}{b^c}\Big)^3+\cdots+\Big(\frac{a}{b^c}\Big)^{k-1}\Bigg]\\
&\text{(the recursion stops when $\frac{n}{b^k}=1$, i.e. $k=\log_b n$)}\\
&=a^{\log_b n}+n^c\sum_{i=0}^{k-1}\Big(\frac{a}{b^c}\Big)^i\\
&=a^{\log_b n}+n^c\Bigg(\frac{\Big(\frac{a}{b^c}\Big)^{\log_b n}-1}{\frac{a}{b^c}-1}\Bigg)\\
&=n^{\log_b a}+n^c\Bigg(\frac{n^{\log_b \frac{a}{b^c}}-1}{\frac{a}{b^c}-1}\Bigg)\\
&=n^{\log_b a}+\Bigg(\frac{n^{c+\log_b \frac{a}{b^c}}-n^c}{\frac{a}{b^c}-1}\Bigg)\\
&=n^{\log_b a}+\Bigg(\frac{n^{\log_b b^c+\log_b \frac{a}{b^c}}-n^c}{\frac{a}{b^c}-1}\Bigg)\\
T(n)&=n^{\log_b a}+\Bigg(\frac{n^{\log_b a}-n^c}{\frac{a}{b^c}-1}\Bigg)
\end{align*}
$$

Taking the asymptotic complexity of this closed form gives

$$
T(n)=\Theta\Bigg(n^{\log_b a}+\frac{n^{\log_b a}-n^c}{\frac{a}{b^c}-1}\Bigg)
$$

The time complexity will depend on the relationship between the exponents $\log_b a$ and $c$. This matches the three cases of the master theorem.

- If $\log_b a < c$ then it matches case 1. The term $n^c$ dominates the term $n^{\log_b a}$, so the time complexity is indeed $\Theta(n^c)$.
- If $\log_b a = c$ then it matches case 2.
  We cannot use the formula above since the denominator $\frac{a}{b^c}-1$ is zero. We instead use the fact that every term of the geometric sum equals one, so the sum equals $\log_b n$. Therefore, the time complexity is $\Theta(n^{\log_b a}\log_b n)$.
- If $\log_b a > c$ then it matches case 3. The term $n^{\log_b a}$ dominates the term $n^c$, so the time complexity is indeed $\Theta(n^{\log_b a})$.
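As a quick sanity check, the three cases can be verified numerically by unrolling the recurrence $T(n)=aT(\frac{n}{b})+n^c$ for powers of $b$ and dividing by the growth rate the theorem predicts: the ratio should flatten out to a constant as $n$ grows. This is an illustrative sketch; the helper `T` and the specific $(a, b, c)$ choices below are examples picked for this check, not taken from the text.

```python
import math

def T(n, a, b, c):
    # Unroll T(n) = a*T(n/b) + n^c with T(1) = 1, for n a power of b.
    if n <= 1:
        return 1.0
    return a * T(n // b, a, b, c) + float(n ** c)

# One illustrative (a, b, c) per case, with the theorem's predicted growth:
cases = [
    (1, 2, 1, lambda n: n,                "case 1: a < b^c -> Theta(n^c)"),
    (2, 2, 1, lambda n: n * math.log2(n), "case 2: a = b^c -> Theta(n^c log n)"),
    (4, 2, 1, lambda n: n ** 2,           "case 3: a > b^c -> Theta(n^(log_b a))"),
]

for a, b, c, predicted, label in cases:
    # If the prediction is right, T(n) / predicted(n) approaches a constant.
    ratios = [T(b ** k, a, b, c) / predicted(b ** k) for k in (10, 15, 20)]
    print(label, [round(r, 4) for r in ratios])
```

For these parameters the closed forms can be checked by hand: case 1 gives $T(n)=2n-1$, case 2 gives $T(n)=n\log_2 n+n$, and case 3 gives $T(n)=2n^2-n$, so each ratio indeed tends to a constant (2, 1, and 2 respectively).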