Reference: Introduction to Algorithms by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein.
Muḥammad ibn Mūsā al-Khwārizmī or al-Khwarizmi was a Persian polymath from Khwarazm – Wikipedia
According to Knuth, the word “algorithm” is derived from the name “al-Khowârizmî,” a ninth-century Persian mathematician. – CLRS
INSERTION-SORT(A, n)            // output from small to large
for i = 2 to n                  // subarray A[1:i-1] is sorted
    pickup = A[i]
    j = i-1
    // move elements in the subarray to the right,
    // making space for inserting the pickup
    while j > 0 and A[j] > pickup :
        A[j+1] = A[j]
        j = j-1
    A[j+1] = pickup             // insert the pickup
Loop invariant: at the start of each iteration of the for loop, the subarray A[1:i-1] is sorted.
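The pseudocode above can be sketched in Python (0-indexed, so the loop starts at the second element; `pickup` keeps its name from the pseudocode):

```python
def insertion_sort(A):
    """Sort list A in place, from small to large, and return it."""
    for i in range(1, len(A)):        # A[0:i] is already sorted
        pickup = A[i]
        j = i - 1
        # Shift larger elements right to make space for the pickup.
        while j >= 0 and A[j] > pickup:
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = pickup
    return A
```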
Is computing x^n a constant-time instruction? Not in general; the RAM model does not assume exponentiation takes constant time. When x = 2, however, x^n can be computed with a single left shift, since x^n == x<<(n-1) for x = 2, and shifting is a constant-time instruction, \(O(1)\).

Assuming each line of INSERTION-SORT costs a constant \(c_i\) every time it executes:

INSERTION-SORT(A, n)               # cost  times
for i = 2 to n                     # c1    n
    pickup = A[i]                  # c2    n-1
    j = i-1                        # c3    n-1
    while j>0 and A[j] > pickup :  # c4    sum_{i=2}^{n} m_i
        A[j+1] = A[j]              # c5    sum_{i=2}^{n} (m_i - 1)
        j = j-1                    # c6    sum_{i=2}^{n} (m_i - 1)
    A[j+1] = pickup                # c7    n-1

where \(m_i\) is the number of times the while-loop test executes for a given \(i\).
\[ \begin{aligned} T(n) &= c_1 n + [c_2+c_3+c_7] (n-1) \\ &+ c_4 \sum_{i=2}^n m_i + [c_5+c_6] \sum_{i=2}^n (m_i - 1) \end{aligned} \]
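Plugging the extreme cases of \(m_i\) into this expression makes the growth explicit. If the input is already sorted, the while test fails immediately every time (\(m_i = 1\)); if the input is reverse-sorted, the test runs \(i\) times (\(m_i = i\)):

\[ \begin{aligned} T(n) &= (c_1+c_2+c_3+c_4+c_7)\,n - (c_2+c_3+c_4+c_7) = \Theta(n) && \text{(best case, } m_i = 1\text{)} \\ T(n) &= c_1 n + [c_2+c_3+c_7](n-1) + c_4\left(\tfrac{n(n+1)}{2}-1\right) + [c_5+c_6]\,\tfrac{n(n-1)}{2} = \Theta(n^2) && \text{(worst case, } m_i = i\text{)} \end{aligned} \]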
Due to constant factors and lower-order terms, an algorithm whose running time has a higher order of growth might take less time for small inputs than an algorithm whose running time has a lower order of growth.
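As a concrete instance, suppose insertion sort takes \(8n^2\) steps and merge sort takes \(64 n \lg n\) steps (these constants are made up purely for illustration). The asymptotically better algorithm only wins once n is large enough:

```python
import math

def insertion_cost(n):
    return 8 * n * n                  # illustrative constant factor

def merge_cost(n):
    return 64 * n * math.log2(n)      # illustrative constant factor

# Find the smallest n at which merge sort's count drops below insertion sort's.
n = 2
while merge_cost(n) >= insertion_cost(n):
    n += 1
print(n)  # below this crossover point, insertion sort is cheaper
```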
def MergeSort(A, left, right):
    if left >= right:  # zero or one element
        return
    mid = (left + right) // 2  # floor((left+right)/2)
    MergeSort(A, left, mid)
    MergeSort(A, mid+1, right)
    Merge(A, left, mid, right)  # merge the two sorted halves
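`Merge` is not defined above; a minimal self-contained sketch in Python (using temporary lists rather than the sentinel-based version in CLRS) might look like this:

```python
def merge(A, left, mid, right):
    # Merge the sorted runs A[left:mid+1] and A[mid+1:right+1] in place.
    L = A[left:mid + 1]
    R = A[mid + 1:right + 1]
    i = j = 0
    k = left
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1
        k += 1
    # Copy whatever remains of either run.
    A[k:right + 1] = L[i:] + R[j:]

def merge_sort(A, left, right):
    if left >= right:          # zero or one element
        return
    mid = (left + right) // 2
    merge_sort(A, left, mid)
    merge_sort(A, mid + 1, right)
    merge(A, left, mid, right)
```

Usage: `merge_sort(data, 0, len(data) - 1)` sorts `data` in place.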
INSERTION-SORT(A, n)
for i = 2 to n
    pickup = A[i]
    j = i-1
    while j>0 and A[j] > pickup :
        A[j+1] = A[j]  # Line 6 - Move element by one position
        j = j-1
    A[j+1] = pickup
The for loop runs n-1 times (for i = 2, 3, ..., n, i.e., while i <= n). The while loop might run 0 times (best case) up to i-1 times (worst case), depending on the input array A.
Divide the input array A into three pieces: A[1:n/3], A[n/3+1:2n/3], and A[2n/3+1:n]. Suppose every value in A[1:n/3] is greater than the rest of the input array A. Then each of these n/3 values must eventually end up in A[2n/3+1:n], and to get there it has to be moved, one position at a time, past every position of the middle piece A[n/3+1:2n/3] by Line 6 in the above code. That is at least n/3 executions of Line 6 for each of the n/3 values, so insertion sort performs at least (n/3)(n/3) = n^2/9 = \(\Omega(n^2)\) moves on such inputs.
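A quick check of this bound: instrument the inner loop to count moves, and feed it an array whose first third holds the largest values (the array layout below is one adversarial input of this shape):

```python
def insertion_sort_count_shifts(A):
    """Insertion sort that returns how many one-position moves it made."""
    shifts = 0
    for i in range(1, len(A)):
        pickup = A[i]
        j = i - 1
        while j >= 0 and A[j] > pickup:
            A[j + 1] = A[j]   # the "Line 6" move
            shifts += 1
            j -= 1
        A[j + 1] = pickup
    return shifts

n = 30
third = n // 3
# Largest n/3 values first, then the remaining values in sorted order.
A = list(range(n - third, n)) + list(range(n - third))
assert insertion_sort_count_shifts(A) >= third * third
```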
Floor/Ceiling: floor(1.7) \(= \lfloor 1.7 \rfloor = 1\), ceil(1.7) \(= \lceil 1.7 \rceil = 2\)
Modular: a%n \(= (a \bmod n) = a - n \lfloor \frac{a}{n}\rfloor\)
Logarithm: \(\lg n = \log_2 n\)
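A quick sanity check of these identities in Python; note that Python's % already follows the floored-division convention above, negatives included:

```python
import math

assert math.floor(1.7) == 1
assert math.ceil(1.7) == 2

# a mod n == a - n * floor(a/n), and Python's % agrees
for a in (7, -7, 13, -13):
    for n in (3, 5):
        assert a % n == a - n * math.floor(a / n)

assert math.log2(8) == 3  # lg n means log base 2
```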
Base Case: Solve it directly
Recursive Case: Divide it recursively until reaching the base case (bottoms out). This involves three steps:
Divide the problem into one or more smaller subproblems
Conquer the subproblems by solving them recursively
Combine the subproblem solutions to form a solution to the original problem
A recurrence \(T(n)\) is algorithmic if, for every sufficiently large threshold constant \(n_0>0\), we have:
1. For all \(n < n_0\), \(T(n) = \Theta(1)\).
2. For all \(n \ge n_0\), every path of recursion terminates in a defined base case within a finite number of recursive invocations.
An example giving rise to a recurrence \(T(n)\) is square-matrix multiplication: first the straightforward triple loop, then a divide-and-conquer version.
def Matrix_Multiply(A, B, C, n):
    # Computes C = C + A*B for n-by-n matrices: n^3 scalar multiplications
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i,j] += A[i,k]*B[k,j]
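The `C[i,j]` indexing above works as-is with a NumPy array or with a dict keyed by `(i, j)` pairs; a tiny usage sketch with dicts:

```python
def matrix_multiply(A, B, C, n):
    # C = C + A*B, all matrices stored as {(i, j): value} dicts
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i, j] += A[i, k] * B[k, j]

# 2x2 example: [[1, 2], [3, 4]] * [[5, 6], [7, 8]]
A = {(0, 0): 1, (0, 1): 2, (1, 0): 3, (1, 1): 4}
B = {(0, 0): 5, (0, 1): 6, (1, 0): 7, (1, 1): 8}
C = {(i, j): 0 for i in range(2) for j in range(2)}
matrix_multiply(A, B, C, 2)
```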
def Matrix_Multiply_Recursive(A, B, C, n):
if n == 1: # Base Case
C[1,1] += A[1,1]*B[1,1]
return
else:
# Divide
[A11, A12, A21, A22, B11, B12, B21, B22, C11, C12, C21, C22] = Part(A,B,C)
# Conquer
Matrix_Multiply_Recursive(A11, B11, C11, n/2)
Matrix_Multiply_Recursive(A11, B12, C12, n/2)
Matrix_Multiply_Recursive(A21, B11, C21, n/2)
Matrix_Multiply_Recursive(A21, B12, C22, n/2)
Matrix_Multiply_Recursive(A12, B21, C11, n/2)
Matrix_Multiply_Recursive(A12, B22, C12, n/2)
Matrix_Multiply_Recursive(A22, B21, C21, n/2)
Matrix_Multiply_Recursive(A22, B22, C22, n/2)
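The `Part` helper above is not shown; a runnable sketch (my own, passing quadrants as lists of row-slices and assuming n is a power of 2) follows. With \(\Theta(n^2)\) work per call for the quadrant copies and additions, the recurrence is \(T(n) = 8T(n/2) + \Theta(n^2) = \Theta(n^3)\), no better than the triple loop:

```python
def mm_recursive(A, B):
    """Return A*B for n-by-n lists of lists, n a power of 2."""
    n = len(A)
    if n == 1:  # base case
        return [[A[0][0] * B[0][0]]]
    h = n // 2
    # Divide: copy out the four quadrants of each matrix.
    def quad(M, r, c):
        return [row[c:c + h] for row in M[r:r + h]]
    def add(X, Y):
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
    A11, A12, A21, A22 = quad(A, 0, 0), quad(A, 0, h), quad(A, h, 0), quad(A, h, h)
    B11, B12, B21, B22 = quad(B, 0, 0), quad(B, 0, h), quad(B, h, 0), quad(B, h, h)
    # Conquer: eight half-size products, combined pairwise by quadrant.
    C11 = add(mm_recursive(A11, B11), mm_recursive(A12, B21))
    C12 = add(mm_recursive(A11, B12), mm_recursive(A12, B22))
    C21 = add(mm_recursive(A21, B11), mm_recursive(A22, B21))
    C22 = add(mm_recursive(A21, B12), mm_recursive(A22, B22))
    # Combine the quadrants back into one matrix.
    top = [r1 + r2 for r1, r2 in zip(C11, C12)]
    bot = [r1 + r2 for r1, r2 in zip(C21, C22)]
    return top + bot
```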
It is hard to imagine that any matrix multiplication algorithm could take less than \(\Theta(n^3)\) time, since the natural definition of matrix multiplication requires \(n^3\) scalar multiplications. Indeed, many mathematicians presumed that it was not possible to multiply matrices in \(o(n^3)\) time until 1969, when V. Strassen [424] published a remarkable recursive algorithm for multiplying \(n \times n\) matrices
The running time of Strassen’s algorithm is \(\Theta(n^{\lg 7})\approx\Theta(n^{2.81})\).
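Strassen's trick is to compute the four quadrants of \(C\) with only 7 recursive half-size products instead of 8, at the price of extra \(\Theta(n^2)\) matrix additions. Assuming the partitioning and additions cost \(\Theta(n^2)\) per call, this changes the recurrence and hence the exponent:

\[ \begin{aligned} T(n) &= 8\,T(n/2) + \Theta(n^2) = \Theta(n^{\lg 8}) = \Theta(n^3) && \text{(plain divide-and-conquer)} \\ T(n) &= 7\,T(n/2) + \Theta(n^2) = \Theta(n^{\lg 7}) \approx \Theta(n^{2.81}) && \text{(Strassen)} \end{aligned} \]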
[424] Volker Strassen. Gaussian elimination is not optimal, Numerische Mathematik, 14(3):354–356, 1969.