r07945001@ntu.edu.tw
Because the median is surely not one of the extremes, we can try:
Find-Extremes(P)
    Is = [1, 2, 3]
    for j = 4 to P.len+1
        i = PANCAKE-GOD-ORACLE(P, Is...)   // which candidate index holds the median
        remove(Is[i])                      // the median can never be an extreme
        if j == P.len+1
            break
        append(Is, j)                      // bring in the next untested index
    return Is
Therefore, the total number of iterations is n - 2, and each iteration issues one oracle query, so Find-Extremes uses O(n) queries.
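A minimal runnable Julia sketch of this procedure, assuming a stand-in `pancake_god_oracle(P, is)` that simply peeks at the tastes to report which of the three candidates holds the median (the real oracle is a black box, so the helper name and its internals are illustrative only):

```julia
# Stand-in for PANCAKE-GOD-ORACLE: returns the position (1, 2 or 3) inside `is`
# whose pancake has the median taste. Here we peek at the values, which the
# real oracle would not allow; only its answer is used below.
function pancake_god_oracle(P, is)
    order = sortperm([P[i] for i in is])
    return order[2]                       # position of the median within `is`
end

# Keep three candidate indices; the median of each triple can never be an extreme.
# Assumes at least three pancakes with distinct tastes.
function find_extremes(P)
    is = [1, 2, 3]
    for j in 4:length(P)+1
        m = pancake_god_oracle(P, is)     # which candidate is the median
        deleteat!(is, m)                  # discard it: it cannot be an extreme
        j == length(P) + 1 && break
        push!(is, j)                      # bring in the next untested index
    end
    return is                             # the two surviving indices are the extremes
end

P = [5, 1, 9, 3, 7]
println(find_extremes(P))                 # [2, 3]: indices of the minimum and maximum
```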
The main idea is to convert PANCAKE-GOD-ORACLE
into a comparison function, and then use this comparison function to achieve the optimal bound of comparison sorting.
Making a comparison function
Since we have the extremes, define, say,
compare(i, j) = PANCAKE-GOD-ORACLE(P[i], P[j], P_ext)
where P_ext is the maximum. If P[i] < P[j], the median of the three arguments is P[j], so compare(i, j) == 2; otherwise compare(i, j) == 1. In other words, compare acts as an argmax of P[i] and P[j] when P_ext is the maximum, and as an argmin when P_ext is a minimum, and vice versa. Either way the ordering it induces is consistent, so we can guarantee monotonicity by using compare in quicksort.
Apply to comparison sort
Since we have the comparison function, we first need to place the two extremes at P[1] and P[end]. Then we can apply a sorting algorithm that is optimal in time complexity to the sub-sequence P[2]...P[end-1] (note that at this point [2,...,end-1] contains no extremes) and use the compare function described above. For clarity, I list the pseudocode below, which is modified from ITA p. 171:
Quicksort(A, p, r)
    if p < r
        q = Partition(A, p, r)
        Quicksort(A, p, q-1)
        Quicksort(A, q+1, r)
end

Partition(A, p, r)
    x = A[r]
    i = p-1
    for j = p to r-1
        if compare(A[j], x) == 1
            i = i+1
            exchange A[j] with A[i]
    exchange A[r] with A[i+1]
    return i+1
end

Sort(P)
    Is = Find-Extremes(P)
    allocate Is to P[1] and P[end]
    Quicksort(P, 2, P.end-1)
end
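To make the idea concrete, here is a small Julia sketch (not the author's code): a stand-in oracle is wrapped into a less-than predicate and handed to Julia's built-in `sort` via its `lt` keyword, instead of the hand-written quicksort above. The names `pancake_god_oracle`, `make_less` and `p_max` are all illustrative assumptions.

```julia
# Stand-in oracle: returns 1, 2 or 3, the position of the median-tasting argument.
pancake_god_oracle(a, b, c) = sortperm([a, b, c])[2]

# With the maximum taste as the third argument, the median of (a, b, max)
# is the larger of a and b, so the oracle acts as an argmax of a and b.
function make_less(p_max)
    return (a, b) -> pancake_god_oracle(a, b, p_max) == 2   # true when a < b
end

tastes    = [4, 8, 1, 6, 3, 9, 2]        # 9 plays the role of the known maximum
p_max     = 9
less_than = make_less(p_max)
println(sort(tastes, lt = less_than))    # [1, 2, 3, 4, 6, 8, 9]
```

Each evaluation of `less_than` costs exactly one oracle query, which is what makes the query count of the sort equal to its comparison count.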
Query Complexity
Every comparison calls compare, which uses one PANCAKE-GOD-ORACLE query. Find-Extremes costs O(n) queries, and an optimal comparison sort makes O(n log n) comparisons. Therefore, the overall query complexity is O(n) + O(n log n) = O(n log n).
The insertion consists of two parts:
Let the new pancake be P_new with tastiness t_new, and assume the list L is sorted with the extremes located at L[1] and L[end]. We want to find a position i at which to insert P_new so that the list remains sorted (Fig. 1-1).
Comparison to extrema
Call Pancake-God-Oracle(L[1], L[end], P_new):
output = 1: insert P_new at the front
output = end: insert P_new at the end
Binary Search
As described in P1-2, we can use the extrema to build a comparison function from Pancake-God-Oracle. After checking that P_new lies between L[1] and L[end], we can apply this comparison function in a binary search, which locates the insertion position of P_new in O(log n) queries.
Insert(L, P_new)
    i = Pancake-God-Oracle(L[1], L[end], P_new)
    if i == 1
        insert P_new at 1                 # case 1
    elseif i == end
        insert P_new at end               # case 2
    else                                  # case 3
        I = Binary-Search(L[2:end], P_new, func=compare)
        insert P_new at I+1
end

compare(i, j)
    I = Pancake-God-Oracle(i, j, L[1])
    if I == i
        return 1    # i < j (L[1] is the minimum, so the median is the smaller one)
    else
        return 2
end
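A runnable Julia sketch of this insertion (my own translation, under the assumption that the oracle reports which of its three arguments holds the median taste, so an answer of `2` plays the role of `i == end` above). `searchsortedfirst` over `L[2:end]` stands in for Binary-Search, and tastes are assumed distinct:

```julia
# Stand-in oracle: which of its three arguments (1, 2 or 3) has the median taste.
pancake_god_oracle(a, b, c) = sortperm([a, b, c])[2]

# Comparator built from the minimum extreme L[1]:
# the median of (a, b, min) is the smaller of a and b.
less(a, b, lmin) = pancake_god_oracle(a, b, lmin) == 1    # true when a < b

function insert_pancake!(L, p_new)
    i = pancake_god_oracle(L[1], L[end], p_new)
    if i == 1                      # case 1: p_new is below the minimum extreme
        pushfirst!(L, p_new)
    elseif i == 2                  # case 2: p_new is above the maximum extreme
        push!(L, p_new)
    else                           # case 3: binary search inside L[2:end]
        pos = searchsortedfirst(view(L, 2:length(L)), p_new;
                                lt = (a, b) -> less(a, b, L[1]))
        insert!(L, pos + 1, p_new)
    end
    return L
end

L = [1, 3, 5, 7, 9]
println(insert_pancake!(L, 6))     # [1, 3, 5, 6, 7, 9]
```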
The reason to use binary search: since P_new lies between the extrema, binary search is the fastest way to find an item's position in a sorted array.
Fig. 1-1. Conditions of inserting a new value.
In problem 1-3, the insertion keeps the list sorted throughout; this can be shown with a loop invariant:
Initialization
Any list of two items is sorted.
Loop invariance
The insertion algorithm in Problem 1-3 keeps the list sorted after each insertion.
Termination
The last element is inserted with the same algorithm, so the list remains sorted.
On the other hand, the query complexity of one insertion is O(log n): one oracle query against the extremes plus O(log n) queries in the binary search.
Fig. 1-2. Any list of two items is sorted, so the insertion algorithm described in Fig. 1-1 can be applied to it.
In sorting with PANCAKE-GOD-ORACLE, we use the median among three arguments to gain order information about the input sequence: PANCAKE-GOD-ORACLE(P, i, j, k) reports which of P_i, P_j, P_k is the median. We cannot inspect the values of the elements or gain order information about the tastes in any other way.
To sort the list of n pancakes, the decision tree must have at least n! reachable leaves, one for each possible ordering. Consider a decision tree of height h: every query has three possible outputs, so the tree has at most 3^h leaves. Hence 3^h >= n!, which, by using logarithms, shows h >= log_3(n!), where n is the number of pancakes. By Stirling's approximation, log_3(n!) = Ω(n log n), so any sorting algorithm that relies only on PANCAKE-GOD-ORACLE needs Ω(n log n) queries in the worst case.
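The chain of inequalities, written out (my notation, with h the tree height and n the number of pancakes):

$$
3^{h} \ge n! \quad\Longrightarrow\quad h \ge \log_{3}(n!) = \Omega(n \log n)
$$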
Fig. 1-3. A decision tree model of sorting with PANCAKE-GOD-ORACLE(P,i,j,k). Each node is a query such as PANCAKE-GOD-ORACLE(P,1,2,3), the output is labelled on the links, and the sorted list is described in the brackets at the leaves.
When the list is short enough (the base case), ELF-SORT fixes any inversion directly with a swap
operation.
Suppose each ELF-SORT call sorts its sub-list; then the following operations keep the list sorted.
We assume the first ELF-SORT call sorts its region (Sorted-1 in Fig. 1-5). The second ELF-SORT call sorts the last region (Sorted-2), which pushes the largest elements of the list into the tail. The job of the last ELF-SORT call is then to merge the front part of Sorted-2 with Sorted-1, producing Sorted-3. Because no element in Sorted-3 is larger than the elements remaining in Sorted-2, and both Sorted-3 and Sorted-2 are sorted locally, the whole list is sorted by ELF-SORT.
Fig. 1-5. Sorting procedure via ELF-SORT. The three lists represent the sorted regions after the first, second and last ELF-SORT calls respectively. The unsorted region is labelled random; the sorted region Sorted-i is the region sorted by the i-th ELF-SORT call.
💡Idea
Suppose we have a worst-case running time T(n); estimate the region processed at the next recursive level.
🔬Description
When the length of the list is n, the ELF-SORT function processes the list with recursive calls on shorter sub-lists; the swap operation itself is independent of n and takes constant time. Let T(n) be the worst-case running time of ELF-SORT on a list of length n, and let SWAP be the constant cost of one swap operation.
Number of swap operations
The recursive function can be described with a tree structure (Fig. 1-4). The swap operation occurs at a leaf, i.e., when the recursion reaches its base case, so the number of swaps equals the number of leaves.
Number of ELF-SORT calls
In Fig. 1-4, each node represents a function call, so the total number of function calls equals the number of nodes of the tree.
Time complexity of ELF-SORT
Because every call does only constant work (at most one swap) besides its recursive calls, the total time complexity is proportional to the number of nodes in the recursion tree.
Fig. 1-4. Time complexity of a recursive function. Time complexity is labelled on nodes and leaves. The figure is modified from [4].
💡Idea
Turn left, and keep going right until reaching a NIL
🔧Implementation
function findPrev(t_k)
    if (t_k.left == NIL)
        return NIL
    end
    t_prev = t_k.left
    while (t_prev.right != NIL)
        t_prev = t_prev.right
    end
    return t_prev
end
Because every sub-tree of a binary search tree is itself a binary search tree, all keys in the left sub-tree of t_k (the set of left nodes, shown in bold font in Fig. 2-1) are smaller than t_k.key. We first observe that when there is no left sub-tree, findPrev returns NIL. For finding the previous node otherwise, we step into the left child and keep following right children until reaching a NIL, which yields the maximum of the left sub-tree.
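A runnable Julia sketch of findPrev on a small hand-built tree (the `TreeNode` layout and the example tree are my own assumptions, not from the original):

```julia
# A tiny hand-built binary search tree, just to exercise findPrev.
mutable struct TreeNode
    key::Int
    left::Union{TreeNode, Nothing}
    right::Union{TreeNode, Nothing}
end
TreeNode(k) = TreeNode(k, nothing, nothing)

# Predecessor within the sub-tree of t_k: step left once, then go right until NIL.
function find_prev(t_k::TreeNode)
    t_k.left === nothing && return nothing
    t_prev = t_k.left
    while t_prev.right !== nothing
        t_prev = t_prev.right
    end
    return t_prev
end

#        8
#       / \
#      3   10
#       \
#        6
root = TreeNode(8, TreeNode(3, nothing, TreeNode(6)), TreeNode(10))
println(find_prev(root).key)   # 6: the maximum of the left sub-tree
```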
Fig. 2-1. Reconstructed binary tree[5].
Definition of root element
The root is always the first element of the preorder traversal. Thus, the root is uniquely determined: it is preorder[1].
Definition of leaves
With a given root (or sub-root), the left and right sets of leaves are uniquely determined by splitting the inorder traversal at the root (Fig. 2-2). Recursively, all the sets of leaves are uniquely distributed given the complete inorder and preorder traversals.
Conclusion
No two distinct binary trees share the same (inorder, preorder) pair.
Fig. 2-2. Recursive algorithm of reconstructing a binary tree with preorder and inorder traversals [6].
💡Idea
Fig. 2-3. Data structure for storing the indexes of the inorder and preorder arrays.
🔧Implementation
Constructing a look-up array
struct node
    iIN   // index in inorder array
    iPRE  // index in preorder array
    left  // leaves
    right
end

function getPairedArr(inorder, preorder)
    node arr[length(inorder)]
    for i = 1 to length(inorder)
        arr[inorder[i]].iIN = i
    end
    for i = 1 to length(preorder)
        arr[preorder[i]].iPRE = i
    end
    return arr
end
Reconstructing a binary tree
buildBST(inorder, preorder)
    map = getPairedArr(inorder, preorder)
    iNA = 1                                 // next unused index of the preorder array
    root = _buildBST(map, inorder, preorder, 1, length(inorder), &iNA)
    return root
end

_buildBST(map, inArr, preArr, lo, hi, *iNA)
    // Reached an empty inorder range or the end of the preorder array
    if (lo > hi || *iNA > length(preArr))
        return NULL
    end
    // Assign new root: the next preorder element; look up its inorder position
    root = map[preArr[*iNA]]
    Iin = map[preArr[*iNA]].iIN
    *iNA = *iNA + 1
    root.left = _buildBST(map, inArr, preArr, lo, Iin-1, iNA)
    root.right = _buildBST(map, inArr, preArr, Iin+1, hi, iNA)
    return root
end
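For reference, a compact runnable Julia version of the same idea (my own simplification: a `Dict` plays the role of the look-up array, and a closure carries the preorder cursor):

```julia
# Rebuild a binary tree from its preorder and inorder traversals,
# using a value -> inorder-index map so each root lookup is O(1).
mutable struct BTNode
    key::Int
    left::Union{BTNode, Nothing}
    right::Union{BTNode, Nothing}
end

function build_tree(preorder::Vector{Int}, inorder::Vector{Int})
    pos  = Dict(v => i for (i, v) in enumerate(inorder))  # inorder index of each value
    next = Ref(1)                                         # next unused preorder index

    function build(lo, hi)           # inorder range of the current sub-tree
        (lo > hi || next[] > length(preorder)) && return nothing
        rootval = preorder[next[]]; next[] += 1
        iin  = pos[rootval]                               # split point in inorder
        node = BTNode(rootval, nothing, nothing)
        node.left  = build(lo, iin - 1)
        node.right = build(iin + 1, hi)
        return node
    end
    return build(1, length(inorder))
end

# Sanity check: the inorder traversal of the rebuilt tree reproduces the input.
inorder_of(n) = n === nothing ? Int[] : vcat(inorder_of(n.left), n.key, inorder_of(n.right))

t = build_tree([8, 3, 1, 6, 10], [1, 3, 6, 8, 10])
println(inorder_of(t))    # [1, 3, 6, 8, 10]
```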
Note: an empty inorder range terminates the recursion; in Julia, for instance, slicing past the end of an array yields an empty array (see the snippet at the end of this post).
💡Idea
If x.key == v, there is no need to move the element x. Otherwise we need to know whether the newly assigned value v is greater or smaller than x.key (assuming a min-heap, as in the implementation below):
Case 1: x.key is increased: x may become larger than its children, so it moves downward.
Case 2: x.key is decreased: x may become smaller than its parent, so it moves upward.
Both Case 1 and Case 2 can be implemented recursively (or with a loop). The only difference is the direction of movement.
Fig 3-1. Moving direction of node x.
🔧Implementation
h.modify(x, v){
    if (x.key == v)
        // Do nothing
    else if (x.key > v)
        h.decreaseKey(x, v)     // new value is smaller: move x upward
    else
        h.increaseKey(x, v)     // new value is larger: move x downward
}

h.decreaseKey(x, v){
    assert(x.key > v)
    x.key = v
    while (parent(x) != NULL and parent(x).key > v){
        swap(parent(x), x)      // x moves up one level
    }
}

h.increaseKey(x, v){
    assert(x.key < v)
    x.key = v
    while (not all of x.leaves[:] is NULL){
        // Get the child with the minimum key
        min_node = argmin(x.leaves[:])
        if (min_node.key < x.key){
            swap(min_node, x)   // x moves down one level
        } else {
            break               // heap property restored
        }
    }
}
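A self-contained Julia sketch of the same logic on an array-backed binary min-heap (my own layout with 1-based indexing; `parentof` and `children` are helper functions, not from the original):

```julia
# Decreasing a key sifts the node up; increasing it sifts down toward the smallest child.
parentof(i) = i ÷ 2
children(h, i) = [c for c in (2i, 2i + 1) if c <= length(h)]

function modify!(h::Vector{Int}, i::Int, v::Int)
    h[i] == v && return h                     # nothing to do
    goes_up = v < h[i]
    h[i] = v
    if goes_up                                # decreaseKey: swap with parent while smaller
        while i > 1 && h[parentof(i)] > h[i]
            h[parentof(i)], h[i] = h[i], h[parentof(i)]
            i = parentof(i)
        end
    else                                      # increaseKey: swap with the smallest child
        while !isempty(children(h, i))
            c = argmin(j -> h[j], children(h, i))   # argmin(f, itr) needs Julia >= 1.7
            h[c] >= h[i] && break             # heap property restored
            h[c], h[i] = h[i], h[c]
            i = c
        end
    end
    return h
end

h = [1, 3, 2, 7, 4, 6, 5]      # a valid min-heap
println(modify!(h, 2, 9))      # 3 -> 9 sifts down: [1, 4, 2, 7, 9, 6, 5]
```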
Description
h.decreaseKey: when going upward, there is only one neighbour to compare with, the parent.
h.increaseKey: going downward is more complicated. We need to choose the direction: among the children, the one with the minimum key is the one to swap with.
Notes
x.leaves[:] is written for generality; there can be more than two leaves, as in a d-ary heap.
The swap function switches two variables.
Definition
u: undefined value (empty entry)
(a)
N\M | 0 | 1 | 2 | 3 |
---|---|---|---|---|
0 | u | u | u | u |
1 | u | u | u | u |
2 | u | u | u | 1 |
3 | 4 | u | u | 2 |
D.add(2, 3, 1); D.add(3, 3, 2); D.add(3, 0, 4); ShowState();
(b)
N\M | 0 | 1 | 2 | 3 |
---|---|---|---|---|
0 | u | u | u | u |
1 | u | u | u | u |
2 | u | u | u | 1 |
3 | 4 | u | u | u |
D.extractMinRow(3) ; ShowState();
(c)
N\M | 0 | 1 | 2 | 3 |
---|---|---|---|---|
0 | u | u | u | u |
1 | u | u | u | 3 |
2 | u | u | u | 1 |
3 | 4 | u | u | u |
D.add(1, 3, 3); ShowState();
(d)
N\M | 0 | 1 | 2 | 3 |
---|---|---|---|---|
0 | u | u | u | u |
1 | u | u | u | 3 |
2 | u | u | u | u |
3 | 4 | u | u | u |
D.delete(2, 3); ShowState();
(e)
N\M | 0 | 1 | 2 | 3 |
---|---|---|---|---|
0 | u | u | u | u |
1 | u | u | u | u |
2 | u | u | u | u |
3 | 4 | u | u | u |
D.extractMinCol(3 ); ShowState();
💡Idea
To achieve O(log(nm)) time complexity, note that it is likely the result of two heap operations on a pair of heaps, which together cost O(log(m) + log(n)) = O(log(nm)).
How to know the availability of an item?
We can add a field to the node struct, or build another one-to-one map, to store the availability data. The status of an item can be either available or unavailable (deleted/extracted).
Each column/row requires a heap to keep track of its minimum. There will be N + M heaps in total: N row heaps and M column heaps.
Each node should keep track of its location i and j. To delete it from both heaps, it also stores its positions iHr and iHc inside the row heap and the column heap.
Fig 3-2. 2D array with two arrays of heaps.
🔧Implementation: Data Structure
struct node{
    key
    i
    j
    iHr   // location in the row heap
    iHc   // location in the column heap
    avail // availability
}

struct heap{
    node* array[N]  // or [M]: a column heap holds up to N nodes, a row heap up to M
    len             // number of stored nodes, in [0, N] / [0, M]
}

The structure D:
struct D{
    heap Hc[M]   // one heap per column
    heap Hr[N]   // one heap per row
    node A[N,M]
}
🔧Implementation: Operations
D.add(i, j, v)
function add(d::D, i, j, v)
    // Register A[i,j]
    d.A[i,j].avail = true
    d.A[i,j].i = i
    d.A[i,j].j = j
    d.A[i,j].key = v
    insertHeap(d.Hr[i], &d.A[i,j])
    insertHeap(d.Hc[j], &d.A[i,j])
end
We have to extract all unavailable values until we find the reachable minimum.
D.extractMinRow(i)
function extractMinRow(d::D, r)
    min = d.Hr[r].array[1]            // root of the row heap = the row minimum
    min.avail = false
    extractMinHeap(d.Hr[r])           // remove it from the row heap
    deleteHeap(d.Hc[min.j], min.iHc)  // remove the same node from its column heap
end
D.extractMinCol is similar:
D.extractMinCol(j)
function extractMinCol(d::D, c)
    min = d.Hc[c].array[1]            // root of the column heap = the column minimum
    min.avail = false
    extractMinHeap(d.Hc[c])           // remove it from the column heap
    deleteHeap(d.Hr[min.i], min.iHr)  // remove the same node from its row heap
end
D.delete(i, j)
function delete(d::D, i, j)
    d.A[i,j].avail = false
    deleteHeap(d.Hr[i], d.A[i,j].iHr)
    deleteHeap(d.Hc[j], d.A[i,j].iHc)
end
In Fig. 3-2, every item A[i,j] belongs to both the row heap Hr[i] and the column heap Hc[j]. For add, we need two heap insertions to track the minimum values of column j and row i. Therefore, the time complexity is O(log M + log N) = O(log MN).
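The identity behind this bound, spelled out:

$$
\log M + \log N = \log(MN)
\quad\Longrightarrow\quad
O(\log M + \log N) = O(\log(MN))
$$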
The extraction on either a column or a row requires a following heap deletion in the paired heap. Both operations require O(log h) time, where h is the size of the corresponding heap. Therefore, the total complexity is O(log N + log M).
We need to delete the same item from two heaps; the fields iHr and iHc keep track of its locations. The deletion can be done by simply setting the key at that location to -Inf, followed by heapify and extraction. All of these steps take logarithmic time on each side. In summary, the total time complexity is O(log N + log M).
Difference between Big-O and Little-O Notation. StackOverflow.
CLRS, Introduction to Algorithms, 3rd edition, p. 99.
Construction of unique binary tree. Book chapter.
julia> a = [1,2,3,4]
4-element Array{Int64,1}:
 1
 2
 3
 4

julia> a[5:end]
Int64[]