---
tags: fall 2018 cs61b
---
# Sorting Notes
[TOC]
## Comparison Sorts
The sorts in this first group order elements using only pairwise comparisons between them.
## Selection Sort
We loop through the array to find the smallest element and swap it with the element at index 0. We then repeat the procedure on the subarray starting at index 1, then at index 2, and so on, until we have processed every index in the array.
![](https://i.imgur.com/lyNx1N3.gif)
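As a concrete reference, here is a minimal in-place sketch in Java (the method name and signature are our own, not from the course):
```java
public static void selectionSort(int[] arr) {
    for (int i = 0; i < arr.length; i++) {
        // Scan the unsorted suffix arr[i..] for the index of the smallest element.
        int minIndex = i;
        for (int j = i + 1; j < arr.length; j++) {
            if (arr[j] < arr[minIndex]) {
                minIndex = j;
            }
        }
        // Swap it into position i. This long-distance swap is what can break stability.
        int temp = arr[i];
        arr[i] = arr[minIndex];
        arr[minIndex] = temp;
    }
}
```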
### Runtime
**Best & Worst Case:** $\Theta(N^2)$
Finding the minimum of the unsorted part takes $O(N)$ time, and we repeat that scan **N times**, so this algorithm has a runtime of $\Theta(N^2)$. Note that **even if the array is already sorted**, we still need to iterate through it to find the minimum, and then iterate through it again, and again, $N$ times.
**Stable? Depends**
Consider an array [3A, 2, 3B, 1], where the 3's have been labeled to differentiate between them. The algorithm will find 1 to be the smallest, and will swap it with 3A, pushing 3A after 3B, which makes the sort not stable. However, Selection Sort can be made stable by implementing it differently: instead of swapping minimum elements into place, build up a new array by repeatedly removing the minimum, as in the sketch below.
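A minimal sketch of that stable variant, assuming we are allowed the extra memory (names are our own):
```java
import java.util.ArrayList;
import java.util.List;

public static List<Integer> stableSelectionSort(List<Integer> input) {
    List<Integer> remaining = new ArrayList<>(input);
    List<Integer> sorted = new ArrayList<>();
    while (!remaining.isEmpty()) {
        // Find the *earliest* minimum; the strict < keeps ties in their original order.
        int minIndex = 0;
        for (int j = 1; j < remaining.size(); j++) {
            if (remaining.get(j) < remaining.get(minIndex)) {
                minIndex = j;
            }
        }
        // Removing (rather than swapping) preserves the relative order of equal elements.
        sorted.add(remaining.remove(minIndex));
    }
    return sorted;
}
```
The `remove` call shifts elements over, but the sort is already $\Theta(N^2)$, so the asymptotic runtime is unchanged.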
## Insertion Sort
For every item, swap it leftwards until it is in the correct position relative to the items before it. This is the way an adult would normally sort a hand of cards: iterating through the array and swapping each element leftwards.
![](https://i.imgur.com/I6sbDum.gif)
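A minimal in-place sketch in Java (names are ours). Note the strict `<`: equal elements are never swapped past each other, which is why the sort is stable:
```java
public static void insertionSort(int[] arr) {
    for (int i = 1; i < arr.length; i++) {
        // Swap arr[j] leftwards until it is no smaller than its left neighbor.
        for (int j = i; j > 0 && arr[j] < arr[j - 1]; j--) {
            int temp = arr[j];
            arr[j] = arr[j - 1];
            arr[j - 1] = temp;
        }
    }
}
```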
### Runtime
**Best Case:** $\Theta(N)$
In the best case scenario, we're given an already **sorted** array such as [1, 2, 3, 4, 5]. Insertion sort then makes just one comparison per element (each element is already no smaller than its left neighbor) and no swaps at all.
**Worst Case:** $\Theta(N^2)$
In the worst case, we're given an array in **reverse sorted order**, such as [5, 4, 3, 2, 1]. Each element must then be **compared with and swapped past** every element before it, for roughly $N^2/2$ comparisons and swaps in total.
**Stable? Yes**
Consider an array [3A, 2, 3B, 1], where the 3's have been labeled to differentiate between them. We would get the following steps: [2, 3A, 3B, 1], then [1, 2, 3A, 3B]. In general, this algorithm is stable because, given a 3A and a 3B, we never swap them with each other.
## Merge Sort (Top-Down)
Given an array, divide it into two equal halves, and call mergesort recursively on each half. Take the recursive leap of faith and assume that each half is now sorted. Merge the two sorted halves. Merging takes a single iteration through both halves and costs $O(N)$ time. The base case is an input list just 1 element long, in which case we return the list itself.
![](https://i.imgur.com/D2c1Rh9.gif =600x360)
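A minimal sketch in Java that returns a new sorted array (names are ours; an in-place version with a single auxiliary array is also common):
```java
import java.util.Arrays;

public static int[] mergeSort(int[] arr) {
    // Base case: a list of 0 or 1 elements is already sorted.
    if (arr.length <= 1) {
        return arr;
    }
    int mid = arr.length / 2;
    int[] left = mergeSort(Arrays.copyOfRange(arr, 0, mid));
    int[] right = mergeSort(Arrays.copyOfRange(arr, mid, arr.length));
    return merge(left, right);
}

private static int[] merge(int[] left, int[] right) {
    int[] result = new int[left.length + right.length];
    int i = 0, j = 0;
    for (int k = 0; k < result.length; k++) {
        // Take from the left half when it wins a tie (<=); this tie-break
        // is what makes merge sort stable.
        if (j >= right.length || (i < left.length && left[i] <= right[j])) {
            result[k] = left[i++];
        } else {
            result[k] = right[j++];
        }
    }
    return result;
}
```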
### Runtime
**Best Case & Worst Case:** $\Theta(N \log N)$
The algorithm always divides the array in half and recurses on both halves, so it takes $\Theta(N \log N)$ time no matter what. It takes $\log N$ levels of recursion to divide the entire array into subarrays of length 1, and each level requires $\Theta(N)$ total work to merge.
**Stable? Yes**
Merge sort is made stable by being careful during the merging step of the algorithm. When deciding between two equal elements, one from the left half and one from the right half, pick the one from the left half first (the `left[i] <= right[j]` tie-break in the sketch above).
## Heap Sort
Heap sort places all the elements of the array into a **max heap**. It then removes the elements one by one from the heap and places them into an array.
![](https://i.imgur.com/fi2uJ94.gif)
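A short sketch using `java.util.PriorityQueue` as a max heap (a textbook heapsort instead heapifies the array in place; this version trades that for brevity):
```java
import java.util.Collections;
import java.util.PriorityQueue;

public static void heapSort(int[] arr) {
    // reverseOrder() turns Java's min-heap PriorityQueue into a max heap.
    PriorityQueue<Integer> maxHeap = new PriorityQueue<>(Collections.reverseOrder());
    for (int x : arr) {
        maxHeap.add(x);
    }
    // Repeatedly remove the max and fill the array from the back.
    for (int i = arr.length - 1; i >= 0; i--) {
        arr[i] = maxHeap.poll();
    }
}
```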
### Runtime
**Best Case:** $\Theta(N)$
Say that all the elements in the input array are equal. In this case, creating the heap only takes $\Theta(N)$ time, since there is no bubbling-down to be done. Also, removing from the heap takes constant time for the same reason. Since we remove $N$ elements, and creating the heap takes $\Theta(N)$ time, the overall runtime is $\Theta(N)$.
**Worst Case:** $\Theta(N \log N)$
For a general array, each of the $N$ removals from the heap requires bubbling down, which costs $O(\log N)$ per removal, so the removals alone take $\Theta(N \log N)$ time; building the heap costs at most that much as well.
**Stable? Depends, but difficult to make stable**
Our implementation of heapsort is usually not stable (if that's hard to believe, it helps to run through a few examples).
## Quick Sort
The key idea here is **partitioning**. We start by picking a **pivot**, which is some element of our list. Then, in a single pass, we group the elements into those smaller than the pivot, those equal to it, and those greater than it. After this partitioning is done, the elements equal to the pivot are in the right place. Then, quicksort the "less than" and "greater than" subarrays.
![](https://i.imgur.com/tGplB7r.gif =600x360)
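A minimal sketch of the three-way partitioning version described above (names are ours; real implementations usually partition in place):
```java
import java.util.ArrayList;
import java.util.List;

public static List<Integer> quickSort(List<Integer> arr) {
    if (arr.size() <= 1) {
        return arr;
    }
    int pivot = arr.get(0);  // leftmost pivot; a random pivot avoids the worst case
    List<Integer> less = new ArrayList<>();
    List<Integer> equal = new ArrayList<>();
    List<Integer> greater = new ArrayList<>();
    for (int x : arr) {
        if (x < pivot) {
            less.add(x);
        } else if (x > pivot) {
            greater.add(x);
        } else {
            equal.add(x);
        }
    }
    // Elements equal to the pivot are already in their final place;
    // only the "less than" and "greater than" groups are recursed on.
    List<Integer> result = new ArrayList<>(quickSort(less));
    result.addAll(equal);
    result.addAll(quickSort(greater));
    return result;
}
```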
### Runtime
**Best Case:** $\Theta(N)$
In the best case, we have an array whose elements are all the same, such as [1, 1, 1, 1, 1]. When we choose a pivot and compare every element in the array to it, both the "greater than" and "less than" arrays end up empty, so there is nothing to recurse on. The runtime is therefore just the cost of comparing the pivot to every element, which is $\Theta(N)$. (This only works with a partitioning scheme that groups all elements equal to the pivot together so they are not recursed on.)
**Worst Case:** $\Theta(N^2)$
In the worst case, we choose a bad pivot that completely fills either the "greater than" or "less than" array and leaves the other one empty.
An example is an array that is already sorted, where we pick the leftmost element as the pivot every time. Consider the array [1, 2, 3, 4, 5]. If we choose the leftmost element as the pivot every time, the "less than" array will always be empty and the "greater than" array will hold all the remaining elements. We therefore iterate through approximately **N elements** in each of the **N recursive calls**, for $\Theta(N^2)$ work overall.
**Average Case:** $\Theta(N \log N)$
If we have a reasonably good pivot that splits the elements into roughly equal subarrays, and the array is full of distinct elements, we have about $\log N$ **levels of recursion** and iterate through all $N$ elements for comparisons at each level. This is the same analysis as mergesort!
**Stable? Depends**
It depends on the pivot choice and the partitioning scheme.
## Why QuickSort?
At first glance, Quicksort is oddly named: its worst case runtime isn't very nice, and its average runtime merely matches that of Mergesort. Why not just choose Mergesort over Quicksort every time, especially since Mergesort sorts stably?
1. Quicksort takes up less memory than Mergesort. Mergesort requires you to create new arrays to merge the elements, while Quicksort can sort in place.
2. It is very unlikely for Quicksort to ever hit its worst case runtime. For Quicksort to run in $\Theta(N^2)$ time, we must choose a pivot that dumps all the remaining elements into one subarray *every single time*. In fact, choosing the pivot randomly each time makes the worst case scenario extremely unlikely.
## Counting Sort
Counting sort is useful when we know ahead of time the set of possible keys (the alphabet) that our elements can take on, and that set is small.
### How it Works
Screenshotted from [CS 61BL Lab 28](https://cs61bl.org/su18/labs/lab28/):
> ![](https://i.imgur.com/XTzRYqc.png)
> ![](https://i.imgur.com/gpWk4Od.png)
> ![](https://i.imgur.com/hrIJGiK.png)
![](https://i.imgur.com/Xrk8bou.png)
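A minimal sketch for int keys in the range [0, K), mirroring the `counts` and `starts` arrays from the lab (variable names are ours):
```java
public static int[] countingSort(int[] arr, int K) {
    // counts[v] = number of occurrences of the key v.
    int[] counts = new int[K];
    for (int x : arr) {
        counts[x]++;
    }
    // starts[v] = index where the first occurrence of v belongs in the output.
    int[] starts = new int[K];
    for (int v = 1; v < K; v++) {
        starts[v] = starts[v - 1] + counts[v - 1];
    }
    // Place each element at its key's next free slot; processing the input
    // in order makes the sort stable.
    int[] sorted = new int[arr.length];
    for (int x : arr) {
        sorted[starts[x]++] = x;
    }
    return sorted;
}
```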
### Runtime
If $N$ is the number of elements in the list and $K$ is the size of the alphabet, then the runtime is $\Theta(N+K)$. Updating the `counts` array requires $\Theta(N)$ work iterating through the original array, and building the `starts` array requires $\Theta(K)$ work iterating through the `counts` array. Since these steps occur one after another, we simply *add* these runtimes together to get the final runtime.
## Radix Sort
The **radix** of a number system is defined to be the base of the system (the number of values that a digit can take on). For instance, binary numbers can be defined as *base-2* or *radix-2*.
### LSD Sort
Run counting sort based on the **least significant key**. Then, move to the next key on the left and run counting sort based on that key. Keep doing this until there are no more keys left.
![](https://i.imgur.com/lq7M0ov.png)
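A minimal sketch for equal-length ASCII strings (names are ours): one stable counting sort per character position, rightmost first.
```java
public static String[] lsdSort(String[] arr, int W) {
    String[] current = arr.clone();
    // Counting sort on each key position, starting from the least significant.
    for (int d = W - 1; d >= 0; d--) {
        current = countingSortByChar(current, d);
    }
    return current;
}

// Stable counting sort keyed on character position d (alphabet size 128 for ASCII).
private static String[] countingSortByChar(String[] arr, int d) {
    int K = 128;
    int[] counts = new int[K];
    for (String s : arr) {
        counts[s.charAt(d)]++;
    }
    int[] starts = new int[K];
    for (int v = 1; v < K; v++) {
        starts[v] = starts[v - 1] + counts[v - 1];
    }
    String[] sorted = new String[arr.length];
    for (String s : arr) {
        sorted[starts[s.charAt(d)]++] = s;
    }
    return sorted;
}
```
The stability of each pass is essential: it is what lets later (more significant) passes preserve the order established by earlier ones.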
**Runtime:** $\Theta(W(N+K))$
Assume $N$ is the number of elements in the list, $K$ is the size of the alphabet, and $W$ is the length of the longest element. Since we perform one counting sort (costing $\Theta(N+K)$) for each of the $W$ key positions, the total runtime is $\Theta(W(N+K))$.
### MSD Sort
Run counting sort based on the **most significant key**. Group all items with the same key into the same bucket. Then, run radix sort on the elements in each bucket on the next key on the right. Keep doing this until there are no more keys left.
![](https://i.imgur.com/0zzqPbj.png)
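A sketch under the same equal-length ASCII strings assumption (names are ours): counting sort on the current character, then recurse separately within each bucket. The initial call would be `msdSort(arr, 0, arr.length, 0, W)`.
```java
public static void msdSort(String[] arr, int lo, int hi, int d, int W) {
    // A bucket with one element, or with no key positions left, is done.
    if (hi - lo <= 1 || d >= W) {
        return;
    }
    int K = 128;
    // Stable counting sort of arr[lo..hi) keyed on character d.
    int[] counts = new int[K];
    for (int i = lo; i < hi; i++) {
        counts[arr[i].charAt(d)]++;
    }
    int[] starts = new int[K];
    for (int v = 1; v < K; v++) {
        starts[v] = starts[v - 1] + counts[v - 1];
    }
    String[] sorted = new String[hi - lo];
    for (int i = lo; i < hi; i++) {
        sorted[starts[arr[i].charAt(d)]++] = arr[i];
    }
    System.arraycopy(sorted, 0, arr, lo, hi - lo);
    // Recurse on each bucket of strings sharing the same character at position d.
    int bucketStart = lo;
    for (int v = 0; v < K; v++) {
        msdSort(arr, bucketStart, bucketStart + counts[v], d + 1, W);
        bucketStart += counts[v];
    }
}
```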
**Runtime:**
Assume $N$ is the number of elements in the list, $K$ is the size of the alphabet, and $W$ is the length of the longest element.
**Best Case:** $\Theta(N+K)$
This scenario occurs when the most significant keys of the elements are all unique. On the first run of counting sort, on the most significant digit, every element lands in its own bucket, which means the array is already sorted. We only had to perform one counting sort, so the runtime is that of counting sort.
**Worst Case:** $\Theta(W(N+K))$
The worst case scenario occurs when the majority of the elements are sorted into the same bucket as we move through the keys. Therefore, we would have the same runtime as LSD sort.
## Sort Searching
A really common question on the exam gives us several columns of numbers, each a snapshot of a partially sorted array, and asks which sort produced each column.
![](https://i.imgur.com/6x1vMjY.png)
### Strategy
The **best approach** to these search problems is to actively look through all of the columns for the signature of a particular sort, instead of examining each column one by one and trying to name its sort. The latter strategy is not as effective because many sorts look very similar to each other partway through.
Identify "easy to identify" sorts before progressing to more difficult ones. Then, use process of elimination to get the "difficult to identify" sorts and the ones you're unsure of.
### Easy To Identify
* **LSD Radix Sort:** Check if one column of **digits** is correctly sorted. Start from the right-most column of digits and move left.
* **MSD Radix Sort:** The **leftmost digits** should be sorted.
* **Selection Sort:** Check if a prefix of the list is sorted and contains the smallest numbers in the entire array.
### Moderately Difficult to Identify
* **Heap Sort:** Heap sorts usually rely on a **max heap**. Check that the end of the list is in sorted order and holds the largest numbers, and that the first number is the next-largest element (the root of the remaining heap).
### Difficult to Identify
* **Merge Sort:** Merge sort begins by sorting in groups of 2. Check if **pairs of numbers** are sorted in the right order.
* **Insertion Sort:** Check if the first part of the list contains exactly the numbers seen so far, in sorted order. This requires you to run through insertion sort a bit.
* **Quick Sort:** If the pivot is given, run through the unsorted column of numbers with quicksort and see if any other column matches. If no pivot is given, try process of elimination. I recommend looking for quicksort **last** because it's the most time-consuming sort to actively search for.
Try the strategies above to reach the answer below:
![](https://i.imgur.com/3NCpYfO.png)
## Resources and Credits
[VisuAlgo Sorting Visualizer](https://visualgo.net/en/sorting)
[Sorting Crib Sheet](https://d1b10bmlvqabco.cloudfront.net/attach/j9j0udrxjjp758/is6p7xcn6876ht/jgu01m3qenkc/sorting_crib_sheet.pdf): CSM Crib Sheet that covers sorts in class
Counting and Radix Sort Information & Screenshot credited to [CS 61BL Lab 28](https://cs61bl.org/su18/labs/lab28/#radix-sort)