---
title: Data structure
tags: 111course
description: Use `{%hackmd theme-dark %}` syntax to include this theme.
---
# ***<div id="animation_title">Data structure 資料結構</div>***
<span class="NOselect"><font size="2px">Edited by $\mathfrak{Warren}$ </font></span>
:::spoiler <span class="NOselect">Table of Contents</span>
[toc]
:::
## ***ξ Ch5 Tree***
### **◭Tree Traversal**
Define the basic node type <span class="sblue">**LVR**</span> (left child, value, right child)
```cpp=
typedef struct node *tree_ptr;  // pointer to a binary tree node
struct node {
    tree_ptr left_child;   // L: left subtree
    int data;              // V: value stored in the node
    tree_ptr right_child;  // R: right subtree
};
```
Examples of the traversals, all run on the same example tree:

#### ***1. Inorder Traversal***
The output sequence would be: <span class="sblue">$$
A/B*C*D+E
$$</span>
*Recursive version*
```cpp=
void inorder(tree_ptr ptr){     // pass in the root pointer
    // Recursive inorder traversal: LVR
    if (ptr){
        inorder(ptr->left_child);   // L
        cout << ptr->data;          // V
        inorder(ptr->right_child);  // R
    }
}
```
*Iterative version*
*we use a <span class="cblue">stack</span> to simulate recursion*
$\Downarrow$ This one is a little harder to follow $\Downarrow$
```cpp=
void iter_inorder(tree_ptr ptr){
    // Iterative inorder traversal: an explicit stack replaces the recursion
    tree_ptr stack[MAX_STACK_SIZE];
    int top = -1;                        // empty stack
    for (;;){
        for (; ptr; ptr = ptr->left_child)
            stack[++top] = ptr;          // push: walk as far left as possible
        if (top == -1) break;            // stack empty: traversal finished
        ptr = stack[top--];              // pop the most recently pushed node
        cout << ptr->data;               // visit V
        ptr = ptr->right_child;          // then traverse R
    }
}
```
#### ***2. Preorder Traversal***
The output sequence would be: <span class="sblue">$$
+**/ABCDE
$$</span>
```cpp=
void preorder(tree_ptr ptr){
    // Recursive preorder traversal: VLR
    if (ptr){
        cout << ptr->data;           // V
        preorder(ptr->left_child);   // L
        preorder(ptr->right_child);  // R
    }
}
```
#### ***3. Postorder Traversal***
The output sequence would be: <span class="sblue">$$
AB/C*D*E+
$$</span>
```cpp=
void postorder(tree_ptr ptr){
    // Recursive postorder traversal: LRV
    if (ptr){
        postorder(ptr->left_child);   // L
        postorder(ptr->right_child);  // R
        cout << ptr->data;            // V
    }
}
```
#### ***4. Level-Order Traversal (using queue)***
The output sequence would be: <span class="sblue">$$
+*E*D/CAB
$$</span>
```cpp=
void level_order(tree_ptr ptr){
    // Level-order traversal: a queue replaces the recursion
    tree_ptr queue[MAX_QUEUE_SIZE];
    int front = 0, rear = 0;          // simple array-based queue
    if (!ptr) return;                 // empty tree
    queue[rear++] = ptr;              // enqueue the root
    while (front < rear){
        ptr = queue[front++];         // dequeue the next node
        cout << ptr->data;            // visit it
        if (ptr->left_child)
            queue[rear++] = ptr->left_child;   // enqueue left child
        if (ptr->right_child)
            queue[rear++] = ptr->right_child;  // enqueue right child
    }
}
```
### **◭Additional Binary Tree Operations**
#### ***1. Copying Binary Tree***
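A minimal sketch of the copy operation, assuming the `node` / `tree_ptr` type from the traversal section (the function name `copy` is an assumption):
```cpp=
// Return a pointer to a new tree that is an exact copy of original.
tree_ptr copy(tree_ptr original){
    if (!original) return NULL;                       // empty subtree
    tree_ptr temp = (tree_ptr)malloc(sizeof(struct node));
    temp->left_child  = copy(original->left_child);   // copy left subtree
    temp->data        = original->data;
    temp->right_child = copy(original->right_child);  // copy right subtree
    return temp;
}
```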
#### ***2. Testing Equality***
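A sketch of an equality test (same shape and same data in every node), again assuming the `tree_ptr` type from above:
```cpp=
// Return 1 (true) iff the two binary trees are equivalent:
// both empty, or same data at the root and equivalent subtrees.
int equal(tree_ptr first, tree_ptr second){
    return ((!first && !second) ||
            (first && second &&
             first->data == second->data &&
             equal(first->left_child,  second->left_child) &&
             equal(first->right_child, second->right_child)));
}
```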
#### ***3. The Satisfiability Problem***
### **◭Threaded Binary Trees**
<font size="4px">***Why?***</font>
$\rightarrow$ **Too many null pointers in the current representation of binary trees**
+ n: number of nodes
+ number of non-null links: n-1
+ total links: 2n
+ null links: 2n-(n-1) = n+1
+ n+1 > n-1 <span class="sblue">**more useless (null) links than useful ones**</span>
<font size="2px">**Solution**</font>
+ replace these null pointers with some useful <span class="cblue">**“threads”**</span>
#### Rules for constructing the threads
+ If ptr-><span class="cblue">**left_child**</span> is null, replace it with a pointer to the node that would be visited <span class="cblue">**before**</span> ptr in an inorder traversal
+ If ptr-><span class="cblue">**right_child**</span> is null, replace it with a pointer to the node that would be visited <span class="cblue">**after**</span> ptr in an inorder traversal

+ Two additional fields in the node structure, *left_thread* and *right_thread*
  + If *left_thread* is TRUE, then ptr->left_child contains a ***thread***;
  + If *left_thread* is FALSE, then ptr->left_child contains a pointer to the ***left child*** (and likewise for *right_thread* / right_child).

**But if we do this, we can observe that H's left pointer and G's right pointer are still dangling.**
**We want to fix this problem!!** $\quad$(YES GO AHEAD!!)

**We add a new header node (a new root) to fix this problem**
The header's right_child points to the header itself and its right_thread is FALSE,
so the successor search walks down to the leftmost node of the tree, which is exactly H, the first node of the inorder traversal.
This also resolves the dangling-pointer problem at H and G (their threads now point to the header).
>By using threads we can perform an inorder traversal
>without making use of a stack
#### Inorder traversal of a threaded binary tree
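A sketch of a possible threaded node type and the stack-free inorder traversal; the type and field names follow the discussion above but are assumptions, not the exact in-class definitions:
```cpp=
// Threaded binary tree node: each thread flag tells whether the corresponding
// child pointer is a real child (false) or a thread (true).
typedef struct threaded_node *threaded_ptr;
struct threaded_node{
    bool left_thread;          // true -> left_child is a thread to the inorder predecessor
    threaded_ptr left_child;
    char data;
    threaded_ptr right_child;
    bool right_thread;         // true -> right_child is a thread to the inorder successor
};

// Inorder successor of ptr in a threaded binary tree: no stack needed
threaded_ptr insucc(threaded_ptr ptr){
    threaded_ptr temp = ptr->right_child;
    if (!ptr->right_thread)             // real right child: successor is its leftmost node
        while (!temp->left_thread)
            temp = temp->left_child;
    return temp;                        // otherwise the thread already points to the successor
}

// Inorder traversal, starting from (and stopping at) the header node
void tinorder(threaded_ptr header){
    threaded_ptr temp = header;
    for (;;){
        temp = insucc(temp);
        if (temp == header) break;      // back at the header: done
        cout << temp->data;
    }
}
```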
#### Insertion of Threaded Binary Tree
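A sketch of inserting a node r as the right child of a node s, reusing the assumed type and `insucc()` above (the name `insert_right` is an assumption):
```cpp=
// Insert r as the right child of s in a threaded binary tree.
void insert_right(threaded_ptr s, threaded_ptr r){
    r->right_child  = s->right_child;   // r inherits s's old right pointer/thread
    r->right_thread = s->right_thread;
    r->left_child   = s;                // r's inorder predecessor is s,
    r->left_thread  = true;             // ...so its left pointer is a thread
    s->right_child  = r;                // r becomes the real right child of s
    s->right_thread = false;
    if (!r->right_thread){              // s had a real right subtree:
        threaded_ptr temp = insucc(r);  // its leftmost node used to thread back to s,
        temp->left_child = r;           // ...now it must thread back to r
    }
}
```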
<font size="30px">(to be continued)</font>
### **◭Heap**
#### **Definitions**
+ A max<font color="#999">(min)</font> tree is a tree in which the key value in each node is no smaller <font color="#999">(larger)</font> than the key values in its children.
+ A max <font color="#999">(min)</font> heap is a complete binary tree that is also a max <font color="#999">(min)</font> tree
#### **Example of Heaps**
+ The root of a max heap contains the largest element

+ The root of a min heap contains the smallest element

#### **Insertion Into A Max Heap**
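Both heap routines below treat heap[] as a complete binary tree stored from index 1 (the children of heap[i] sit at heap[2i] and heap[2i+1]) and rely on a few globals. A minimal sketch of plausible supporting declarations, where MAX_ELEMENTS and the exact element type are assumptions:
```cpp=
// Assumed supporting declarations for insert_max_heap / delete_max_heap.
#define MAX_ELEMENTS 200                       // capacity (assumption)
#define HEAP_FULL(n)  ((n) == MAX_ELEMENTS - 1)
#define HEAP_EMPTY(n) (!(n))
typedef struct { int key; /* other fields */ } element;
element heap[MAX_ELEMENTS];                    // heap[1..n] stores the complete binary tree
int n = 0;                                     // current number of elements in the heap
```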
[Step-by-step animation](https://www.youtube.com/watch?v=DNAWJiPVrPI)
```cpp=
void insert_max_heap(element item, int *n){
    // insert item into a max heap of current size *n
    int i;
    if (HEAP_FULL(*n)){
        fprintf(stderr, "The heap is full.\n");
        exit(1);
    }
    i = ++(*n);                     // start at the new last position
    while ((i != 1) && (item.key > heap[i/2].key)){
        heap[i] = heap[i/2];        // pull the smaller parent down
        i /= 2;                     // and keep bubbling up
    }
    heap[i] = item;
}
```
```
#### **Deletion From A Max Heap**
[Step-by-step animation](https://www.youtube.com/watch?v=DNAWJiPVrPI)
```cpp=
element delete_max_heap(int *n){
    // delete the element with the highest key from the heap
    int parent, child;
    element item, temp;
    if (HEAP_EMPTY(*n)){
        fprintf(stderr, "The heap is empty\n");
        exit(1);
    }
    item = heap[1];                 // save the element with the highest key
    temp = heap[(*n)--];            // use the last element to adjust the heap
    parent = 1;
    child = 2;
    while (child <= *n){
        // find the larger child of the current parent
        if ((child < *n) && (heap[child].key < heap[child+1].key))
            child++;
        if (temp.key >= heap[child].key) break;
        // move the larger child up and go down to the next level
        heap[parent] = heap[child];
        parent = child;
        child *= 2;
    }
    heap[parent] = temp;
    return item;
}
```
### **◭Binary Search Trees (BST)**
#### **Definitions**
+ Every element has a unique key <span class="sblue">鍵值</span>
+ The keys in a nonempty left subtree (right subtree) are smaller (larger) than the key in the root of the subtree
+ The left and right subtrees are also binary search trees

#### **Searching a binary search tree**
> *recursive search*
```cpp=
tree_pointer search(tree_pointer root, int key){
    // return a pointer to the node that contains the key;
    // if there is no such node, return NULL
    if (!root) return NULL;
    if (key == root->data) return root;
    if (key < root->data)
        return search(root->left_ch, key);   // key is smaller: search left subtree
    return search(root->right_ch, key);      // key is larger: search right subtree
}
// recursive search
```
> *iterative search*
```cpp=
tree_pointer search(tree_pointer tree, int key){
    // return a pointer to the node that contains the key;
    // if there is no such node, return NULL
    while (tree){
        if (key == tree->data) return tree;
        if (key < tree->data)
            tree = tree->left_ch;    // key is smaller: go left
        else
            tree = tree->right_ch;   // key is larger: go right
    }
    return NULL;
}
// iterative search
```
#### **Inserting into a binary search tree**
```cpp=
void insert_node(tree_pointer *node, int num){
    // if num is already in the tree pointed at by node, do nothing;
    // otherwise add a new node with data = num
    tree_pointer ptr, temp;
    temp = modified_search(*node, num);
    if (temp || !(*node)){
        // num is not in the tree
        ptr = (tree_pointer)malloc(sizeof(*ptr));
        if (IS_FULL(ptr)){              // IS_FULL: checks whether malloc failed
            cout << "The memory is full\n";
            exit(1);
        }
        ptr->data = num;
        ptr->left_ch = ptr->right_ch = NULL;
        if (*node)                      // non-empty tree: attach under temp
            if (num < temp->data) temp->left_ch = ptr;
            else temp->right_ch = ptr;
        else *node = ptr;               // empty tree: the new node becomes the root
    }
}
```
+ **$modified\_search$** is a slightly modified version of the iterative search above:
  + it searches the binary search tree *node for the key num;
  + if the tree is empty or num is already present it returns NULL; otherwise it returns a pointer to the last node encountered during the search (the future parent of the new node).
#### **Deletion from a binary search tree**
+ Three cases should be considered
+ case 1. leaf $\rightarrow$ delete
+ case 2. one child $\rightarrow$ delete the node and redirect its parent's pointer to that child
+ case 3. two children $\rightarrow$ replace the node with either <span class="sblue">the smallest element</span> in its right subtree or <span class="sblue">the largest element</span> in its left subtree, then delete that replacement from its subtree (see the sketch below)
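A minimal sketch covering the three cases, assuming the same node type as the search/insert code above (fields data, left_ch, right_ch); the function name delete_node is an assumption:
```cpp=
// Delete key from the BST rooted at root; return the (possibly new) root.
tree_pointer delete_node(tree_pointer root, int key){
    if (!root) return NULL;                               // key not found
    if (key < root->data)
        root->left_ch  = delete_node(root->left_ch,  key);
    else if (key > root->data)
        root->right_ch = delete_node(root->right_ch, key);
    else {                                                // found the node to delete
        if (!root->left_ch || !root->right_ch){           // case 1 (leaf) and case 2 (one child)
            tree_pointer child = root->left_ch ? root->left_ch : root->right_ch;
            free(root);
            return child;                                 // NULL for a leaf, otherwise the single child
        }
        // case 3 (two children): use the smallest element of the right subtree
        tree_pointer succ = root->right_ch;
        while (succ->left_ch) succ = succ->left_ch;
        root->data = succ->data;                          // copy it up...
        root->right_ch = delete_node(root->right_ch, succ->data);  // ...and delete the duplicate
    }
    return root;
}
```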

#### **Height of a binary search tree**
+ The height of a binary search tree with n elements can be n <span class="sblue">(a badly skewed tree)</span>
+ It can be shown that when insertions and deletions are made at random, the height of the binary search tree is O(log$_2$n) on average.
+ Search trees with a worst-case height of O(log$_2$n) are called <span class="sblue">balanced search trees</span>
#### **Time Complexity**
+ Searching, insertion, removal
>O(h), where h is the height of the tree
+ Worst case - skewed binary tree
>O(n), where n is the # of internal nodes
+ Prevent the worst case
>use a rebalancing scheme:
>AVL trees, 2-3 trees, and red-black trees
### **◭Selection Trees**
## ***ξ Ch6 Graphs***
### **◭Definitions**
#### **A graph G consists of two sets**
- a finite, nonempty set of vertices(頂點) V(G)
- a finite, possible empty set of edges(邊) E(G)
<span class="blue">G(V,E) represents a graph</span>
- An undirected graph <span class="sblue">(無序圖)</span> is one in which the pair of vertices in an edge is unordered, (v0, v1) = (v1,v0)
<span></span>
- A directed graph <span class="sblue">(有序圖)</span> is one in which each edge is a directed pair of vertices, <v0, v1> != <v1, v0>
<span></span>
#### **Examples for Graphs**
- complete undirected graph: n(n-1)/2 edges
- complete directed graph: n(n-1) edges
  *(figure: complete undirected & directed graphs vs. incomplete graphs <span class="sblue">(不完全圖)</span>)*
#### **Restrictions on graphs**
- A graph may not have an edge from a vertex, i, back to itself. Such edges are known as self loops<span class="sblue">(自環)[.](https://zh.wikipedia.org/zh-tw/%E8%87%AA%E7%8E%AF)</span>
- A graph may not have multiple occurrences of the same edge. If we remove this restriction, we obtain a data object referred to as a multigraph<span class="sblue">(重圖)[.](https://zh.m.wikipedia.org/zh-hk/%E9%87%8D%E5%9B%BE)</span>

*(figure: self loops & a multigraph)*
#### **Adjacent**
(v0, v1) is an edge
+ in an undirected graph,
v0 and v1 are <span class="sblue">adjacent</span> 
+ in a directed graph
v0 is <span class="sblue">*adjacent to*</span> v1, and v1 is <span class="sblue">*adjacent from*</span> v0 
#### **Subgraph**
A subgraph of undirected graph G is an undirected graph G’ such that
V(G’) $\subseteq$ V(G) and E(G’) $\subseteq$ E(G).
*(figure: subgraphs of G1 <span class="sblue">(子圖)</span>)*
#### **Path**
- A path from vertex $v_p$ to vertex $v_q$ in a graph G, is a sequence of vertices,
$v_p, v_{i1}, v_{i2}, ..., v_{in}, v_q,$ such that $(v_p, v_{i1}), (v_{i1}, v_{i2}), ..., (v_{in}, v_q)$ are edges in an undirected graph.
- A path such as (0, 2), (2, 1), (1, 3) is also written as 0, 2, 1, 3
- The *length* of a path is the number of edges on it
#### **Simple path and cycle**
- simple path (simple directed path): a path in which all vertices, except possibly the first and the last, are distinct.
- A cycle is a simple path in which the first and the last vertices are the same.
#### **Connected component**
- Connected <span class="sblue">(相連)</span>
In an <span class="marker">undirected</span> graph G, two vertices v0 and v1 are connected if there is a path in G from v0 to v1
An undirected graph is connected if, for every pair of distinct vertices vi, vj, there is a path from vi to vj
- Connected component <span class="sblue">(相連單元)</span>
A connected component of an undirected graph is a maximal connected subgraph.
A tree is a graph that is connected and acyclic (i.e., it has no cycle).
*(figure: H1 and H2 are connected components of the graph)*
#### **Strongly Connected Component**
- Strongly Connected <span class="sblue">(強相連)</span>
- Strongly Connected component <span class="sblue">(強相連單元)</span>
A <span class="marker">directed</span> graph is strongly connected if, for every pair of distinct vertices vi and vj, there is a directed path from vi to vj and also from vj to vi
A strongly connected component is a maximal subgraph that is strongly connected
*(figure: the strongly connected components of a graph)*
#### **Degree**:
+ The degree of a vertex is the number of edges incident to that vertex
+ For a directed graph:
  + in-degree (v): the number of edges that have v as the head
  + out-degree (v): the number of edges that have v as the tail
+ If $d_i$ is the degree of vertex i in a graph G with n vertices and e edges, the number of edges is
$$
e=\left(\sum^{n-1}_{i=0}d_i\right)/2
$$
*(figure: G1 and G2 are undirected graphs, and G3 is a directed graph)*

### **◭Graph Representations**
#### **Adjacency Matrix <span class="sblue">(鄰接矩陣)</span>**
+ The adjacency matrix of graph G is a two-dimensional array, say adj_matrix.
+ If the edge (vi, vj) is (not) in E(G), then adj_matrix[i][j] = 1 (0); see the small sketch below.
+ The adjacency matrix of an undirected graph is symmetric;
+ the adjacency matrix of a directed graph need not be symmetric.
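A small declaration sketch of this representation; MAX_VERTICES and adj_matrix follow the note, while the helper name add_edge is an assumption:
```cpp=
#define MAX_VERTICES 50
int adj_matrix[MAX_VERTICES][MAX_VERTICES];  // adj_matrix[i][j] == 1 iff (vi, vj) is in E(G)
int n = 0;                                   // number of vertices currently in use

// add the undirected edge (i, j): the matrix stays symmetric
void add_edge(int i, int j){
    adj_matrix[i][j] = 1;
    adj_matrix[j][i] = 1;
}
```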
#### **Merits of Adjacency Matrix**
+ For an undirected graph, the degree of any vertex i is its row sum:
$$
degree(v_i)=\sum^{n-1}_{j=0} adj\_matrix\;[i][j]
$$
+ For a directed graph, the row sum is the out-degree,
+ while the column sum is the in-degree:
$$
outd(v_i)=\sum^{n-1}_{j=0} adj\_matrix\;[i][j]\qquad ind(v_i)=\sum^{n-1}_{j=0} adj\_matrix\;[j][i]
$$
+ The complexity of counting the edges or examining whether G is connected:
  + G is undirected: O(n$^2$/2)
  + G is directed: O(n$^2$)
#### **Adjacency lists**<span class="sblue">(鄰接串列)</span>
+ There is one list for each vertex in G.
+ The nodes in list i represent the vertices that are adjacent from vertex i
> *Example declarations*
```cpp=
#define MAX_VERTICES 50          // maximum number of vertices
typedef struct node *node_pointer;
struct node{
    int vertex;                  // the adjacent vertex
    struct node *link;           // next node on this adjacency list
};
node_pointer graph[MAX_VERTICES];   // one list head per vertex
int n = 0;                          // vertices currently in use
```
#### **Sequential Representation of Graph**
+ Sequentially pack the nodes on the adjacency lists
+ node[i], 0≦i<n, gives the starting point of the adjacency list for vertex i.
+ node[n] stores "n+2e+1".
+ The vertices adjacent from vertex i are stored in node[node[i]], … , node[node[i+1]-1], 0≦i<n (see the worked example below).
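A tiny worked example, for a hypothetical undirected graph with only the edges (0,1) and (1,2), so n = 3 and e = 2:
```cpp=
// Packed adjacency lists for the hypothetical graph 0-1, 1-2 (n = 3, e = 2).
// node[0..n-1] hold the start positions, node[n] = n + 2e + 1 = 8, and
// node[n+1 .. n+2e] hold the adjacent vertices themselves.
int node[8] = {
    4, 5, 7,    // node[0..2]: where each vertex's list starts
    8,          // node[3] = n + 2e + 1
    1,          // node[4]:    vertex 0 is adjacent to 1
    0, 2,       // node[5..6]: vertex 1 is adjacent to 0 and 2
    1           // node[7]:    vertex 2 is adjacent to 1
};
// vertices adjacent from i: node[node[i]] .. node[node[i+1]-1]
```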
#### **Interesting Operations**
+ degree of a vertex in an undirected graph
  + \# of nodes in its adjacency list
+ \# of edges in a graph
  + determined in O(n+e)
+ out-degree of a vertex in a directed graph
  + \# of nodes in its adjacency list
+ in-degree of a vertex in a directed graph
  + requires traversing the whole data structure

#### **Adjacency Multilists**
+ All vertices are stored in the array $arr$, and every entry has a linked list of the nodes adjacent to that vertex.

+ An edge is shared by the adjacency lists of its two endpoints:
there is exactly one node for each edge, and
this node appears on the adjacency list of each of the two vertices it is incident to (see the sketch below).
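A sketch of the edge-node type usually used for adjacency multilists; the field names follow the common textbook layout and are assumptions here:
```cpp=
// One node per edge; the same node is linked into the lists of both endpoints.
typedef struct edge_node *edge_pointer;
struct edge_node{
    bool marked;            // has this edge been examined yet?
    int vertex1, vertex2;   // the two endpoints of the edge
    edge_pointer path1;     // next edge on vertex1's list
    edge_pointer path2;     // next edge on vertex2's list
};
edge_pointer graph[MAX_VERTICES];   // head of each vertex's multilist
```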

#### **Weighted edges**
+ The edges of a graph have weights assigned to them.
+ These weights may represent
  + the distance from one vertex to another
  + the cost of going from one vertex to an adjacent vertex.
+ adjacency matrix: adj_mat[i][j] would keep the weights.
+ adjacency lists: add a weight field to the node structure.
+ A graph with weighted edges is called a <span class="sblue">network</span>
### **◭Graph Operations**
#### **Traversal**
+ Given G=(V,E) and a vertex v, find all w$\in$V such that w is connected to v
+ <span class="sblue">Depth First Search (DFS)</span>: preorder traversal
+ <span class="sblue">Breadth First Search (BFS)</span>: level order traversal
#### **Depth first search (DFS):**
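A minimal sketch of DFS over the adjacency-list declarations (graph[], node_pointer, MAX_VERTICES) from the previous section; the visited[] array and the output format are assumptions:
```cpp=
bool visited[MAX_VERTICES];            // all false before the first call (assumption)

// Depth first search from vertex v: visit v, then recursively visit every
// unvisited vertex adjacent to it (preorder-like order).
void dfs(int v){
    visited[v] = true;
    cout << v << ' ';
    for (node_pointer w = graph[v]; w; w = w->link)
        if (!visited[w->vertex])
            dfs(w->vertex);
}
```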
#### **Breadth First Search (BFS):**
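A sketch of BFS, reusing graph[], node_pointer and visited[] from the DFS sketch above; the array-based queue mirrors the level-order traversal of a tree:
```cpp=
// Breadth first search from vertex v: visit vertices level by level.
void bfs(int v){
    int queue[MAX_VERTICES];
    int front = 0, rear = 0;          // simple array-based queue
    visited[v] = true;
    queue[rear++] = v;                // enqueue the start vertex
    while (front < rear){
        v = queue[front++];           // dequeue
        cout << v << ' ';             // visit
        for (node_pointer w = graph[v]; w; w = w->link)
            if (!visited[w->vertex]){
                visited[w->vertex] = true;
                queue[rear++] = w->vertex;
            }
    }
}
```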
#### **Connected components**
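Reusing dfs(), visited[] and the vertex count n from the sketches above, the connected components of an undirected graph can be listed like this (a sketch assuming those names):
```cpp=
// Print the vertices of each connected component on its own line.
// Assumes graph[], visited[] (all false initially), n and dfs() from above.
void connected_components(){
    for (int i = 0; i < n; i++)
        if (!visited[i]){        // vertex i starts a brand-new component
            dfs(i);              // dfs prints every vertex reachable from i
            cout << "\n";
        }
}
```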
### **◭Spanning tree**
### **◭Single Source Shortest Paths**
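No code is given in the note for this heading; below is a hedged sketch of the classic Dijkstra-style algorithm over a cost adjacency matrix (all names, cost, dist, found, NO_EDGE, are assumptions, and this is not necessarily the in-class version):
```cpp=
#define MAX_VERTICES 50
#define NO_EDGE 1000000           // stands in for "infinity" (assumption)

int  cost[MAX_VERTICES][MAX_VERTICES];  // cost adjacency matrix, cost[i][i] = 0
int  dist[MAX_VERTICES];                // best distance from the source found so far
bool found[MAX_VERTICES];               // true once a vertex's shortest distance is final

// Single-source shortest paths from vertex v in a graph with n vertices
// and non-negative edge costs (Dijkstra's algorithm).
void shortest_paths(int v, int n){
    for (int i = 0; i < n; i++){
        found[i] = false;
        dist[i]  = cost[v][i];          // direct edge (or NO_EDGE)
    }
    found[v] = true;
    dist[v]  = 0;
    for (int i = 0; i < n - 1; i++){
        // pick the unfinished vertex u that is currently closest to the source
        int u = -1;
        for (int w = 0; w < n; w++)
            if (!found[w] && (u < 0 || dist[w] < dist[u])) u = w;
        found[u] = true;
        // relax the edges leaving u
        for (int w = 0; w < n; w++)
            if (!found[w] && dist[u] + cost[u][w] < dist[w])
                dist[w] = dist[u] + cost[u][w];
    }
}
```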
## ***ξ Ch7 Sort***
## ***ξ Ch9 Leftist Trees***
+ Linked binary tree.
+ Can do everything a heap can do and in the same asymptotic complexity.
+ Can meld two leftist tree priority queues <span class="sblue">in O(log n) time</span>
#### **Extended Binary Trees**
+ Start with any binary tree and add an external node wherever there is an empty subtree.
+ Result is an extended binary tree.

#### **The Function s()**
+ s(x) is the length of the shortest path from a node x to an external node
+ **Properties of s()**
  + If x is an external node, then s(x) = 0.
  + Otherwise,
  s(x) = min {s(leftChild(x)), s(rightChild(x))} + 1
#### **Height Biased Leftist Trees**
+ A binary tree is a (height biased) leftist tree iff for every internal node x, s(leftChild(x)) >= s(rightChild(x))

#### **Properties**
1. In a leftist tree, the rightmost path is a shortest root to external node path and the length of this path is s(root).
2. The number of internal nodes is at least 2$^{s(root)}$ - 1,
so s(root) <= log$_2$(n+1)
3. The length of the rightmost path is O(log n), where n is the number of nodes in a leftist tree. <span class="sblue">(the rightmost path is the shortest one; its length matches the height of a complete tree)</span>
#### **Leftist Trees As Priority Queues**
+ Min leftist tree : leftist tree that is a min tree.
Used as a min priority queue.
+ Max leftist tree : leftist tree that is a max tree.
Used as a max priority queue.

#### **Some Min Leftist Tree Operations**
+ meld(), initialize() (put() and remove() are built on top of meld(); see the sketch below)
> Remove Min: delete the root, then meld its two subtrees
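A sketch of meld for min leftist trees; the node layout (key, s, left, right) follows the s() discussion above, but the names are assumptions:
```cpp=
// Min leftist tree node: s is the length of the shortest path to an external node.
struct LeftistNode{
    int key;
    int s;
    LeftistNode *left, *right;
};

// Meld the two min leftist trees rooted at a and b; return the new root.
LeftistNode* meld(LeftistNode *a, LeftistNode *b){
    if (!a) return b;
    if (!b) return a;
    if (a->key > b->key){                      // keep the smaller root on top (min tree)
        LeftistNode *t = a; a = b; b = t;
    }
    a->right = meld(a->right, b);              // always meld along the rightmost path
    if (!a->left || a->left->s < a->right->s){ // restore the leftist property
        LeftistNode *t = a->left; a->left = a->right; a->right = t;
    }
    a->s = (a->right ? a->right->s : 0) + 1;   // recompute s from the (shorter) right child
    return a;
}
// put(x):   meld the tree with a single-node tree containing x
// remove(): save the root, then meld its left and right subtrees
```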


#### **Weight Biased Leftist Trees**
+ A binary tree is a (weight biased) leftist tree iff for every internal node x,
W(leftChild(x)) >= W(rightChild(x))
where W(y) is the number of internal nodes in the subtree with root y.
+ Melding two weight biased leftist trees works the same way as for height biased leftist trees
<style>
@-webkit-keyframes A
{
0% { color:#fc4444;
padding-left:0px}
5% { color:;
padding-left:5px;}
10% { color:#fc7844;
padding-left:10px}
15% { color:;
padding-left:15px;}
20% { color:#fc7844;
padding-left:20px}
25% { color:;
padding-left:25px;}
30% { color:#e7fc44;
padding-left:30px}
35% { color:;
padding-left:35px;}
40% { color:#e7fc44;
padding-left:40px}
45% { color:;
padding-left:45px;}
50% { color:#7bfc44;
padding-left:50px}
55% { color:;
padding-left:45px;}
60% { color:#44fc9a;
padding-left:40px}
65% { color:;
padding-left:35px;}
70% { color:#44fc9a;
padding-left:30px}
75% { color:;
padding-left:25px;}
80% { color: #44fc9a;
padding-left:20px}
85% { color:;
padding-left:15px;}
90% { color: #9144fc;
padding-left:10px}
95% { color:;
padding-left:5px;}
100% { color: #fc44d7;
padding-left:0px}
}
#animation_title{
-webkit-user-select: none;
animation: A 3s ease 0s infinite alternate;
-webkit-animation: A 3s ease 0s infinite alternate;
}
.NOselect{
-webkit-user-select: none;
}
html, body, .ui-content {
background-color: #333;
color: #ccc;
}
.markdown-body h1,
.markdown-body h2,
.markdown-body h3,
.markdown-body h4,
.markdown-body h5,
.markdown-body h6 {
color: #ddd;
}
.markdown-body h1,
.markdown-body h2 {
border-bottom-color: #ffffff69;
}
.markdown-body h1 .octicon-link,
.markdown-body h2 .octicon-link,
.markdown-body h3 .octicon-link,
.markdown-body h4 .octicon-link,
.markdown-body h5 .octicon-link,
.markdown-body h6 .octicon-link {
color: #fff;
}
.markdown-body img {
background-color: transparent;
}
.ui-toc-dropdown .nav>.active:focus>a, .ui-toc-dropdown .nav>.active:hover>a, .ui-toc-dropdown .nav>.active>a {
color: white;
border-left: 2px solid white;
}
.expand-toggle:hover,
.expand-toggle:focus,
.back-to-top:hover,
.back-to-top:focus,
.go-to-bottom:hover,
.go-to-bottom:focus {
color: white;
}
.ui-toc-dropdown {
background-color: #333;
}
.ui-toc-label.btn {
background-color: #191919;
color: white;
}
.ui-toc-dropdown .nav>li>a:focus,
.ui-toc-dropdown .nav>li>a:hover {
color: white;
border-left: 1px solid white;
}
.markdown-body blockquote {
color: #bcbcbc;
}
.markdown-body table tr {
background-color: #5f5f5f;
}
.markdown-body table tr:nth-child(2n) {
background-color: #4f4f4f;
}
.markdown-body code,
.markdown-body tt {
color: #eee;
background-color: rgba(230, 230, 230, 0.36);
}
a,
.open-files-container li.selected a {
color: #5EB7E0;
}
.cblue {
color: #2890eb;
}
.sblue{
color: #91e069;
}
.kblue{
color: #c562fc;
}
.purple{
color: #8c03ab;
font-weight: 650;
}
.bgc{
background-color: #496968;
}
.marker{
background-color:#786060;
color: #faf26e;
}
.smtittle{
font-size: 20px;
}
</style>