# Sorting

# Summary

![](https://i.imgur.com/HRdfdg7.png)

# :dog: Insertion Sort

### Description
Iterates through the list, swapping each element backwards as far as necessary to keep the processed prefix sorted.

### Runtime
$O(n^2)$

### Additional Info
Insertion sort is good if the input is small or nearly sorted. The best case runtime is $\Theta(n)$, which occurs when the array is already (or nearly) sorted.

# :dog: Selection Sort

### Description
At each step, finds the smallest remaining element in the unsorted portion of the list and moves it to the end of the sorted portion.

### Runtime
$\Theta(n^2)$

# :dog: Merge Sort

### Description
Splits the list in half, applies merge sort to each half, and then merges the two sorted halves together in a zipper fashion.

### Runtime
$\Theta(n \log n)$

### Examples
At each step, merge sort divides the list into two halves, sorts each half recursively, and then merges the results.

![](https://i.imgur.com/B7Ou6rh.png)

Source: CS 61B presentation

### Additional Info
1. Merge sort requires extra storage. Allocating and deallocating the extra space used for merging increases the running time of the algorithm.
2. Unlike an array, a linked list lets us insert an item in the middle in O(1) extra space and O(1) time if we have a reference to the previous node. Therefore, the merge operation of merge sort can be implemented without extra space for linked lists.

# :dog: Quick Sort

### Description
Picks a pivot and uses Hoare partitioning to divide the list so that everything greater than the pivot is on its right and everything less than the pivot is on its left.

### Runtime
Average: $O(n \log n)$

Worst: $O(n^2)$

Quicksort degrades toward the worst case when the array is not shuffled and a bad pivot is chosen; it performs well when the pivot is chosen at random or the array is shuffled first. Empirically, quicksort is faster than merge sort.

# :dog: Heap Sort

### Description
Heapifies the array into a max heap, then repeatedly pops the largest element off and appends it to the end until there are no elements left in the heap. You can heapify by sinking nodes in reverse level order.

### Runtime
$O(n \log n)$

### Additional Info
After popping the max, we move the last element of the heap to the root and sink it: at each level we compare the left and right children and swap with the larger one until heap order is restored.

We can use a heap to find the median element without sorting the entire list. First, heapify the list, then pop N/2 elements off of the heap. The next element to be popped is the median.
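Below is a minimal Python sketch of heapify-by-sinking, heapsort, and the median trick described above. The function names (`sink`, `heapify`, `heap_sort`, `median_via_heap`) are illustrative placeholders, not from the original notes.

```python
def sink(a, i, n):
    """Sink a[i] down within the max heap a[0:n] until heap order is restored."""
    while 2 * i + 1 < n:
        child = 2 * i + 1                       # left child
        if child + 1 < n and a[child + 1] > a[child]:
            child += 1                          # right child is larger
        if a[i] >= a[child]:
            break
        a[i], a[child] = a[child], a[i]
        i = child


def heapify(a):
    """Build a max heap in place by sinking nodes in reverse level order."""
    for i in range(len(a) // 2 - 1, -1, -1):
        sink(a, i, len(a))


def heap_sort(a):
    """Heapify, then repeatedly swap the max (root) past the end of the shrinking heap."""
    heapify(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]             # move the current max to its final spot
        sink(a, 0, end)                         # restore heap order in a[0:end]
    return a


def median_via_heap(a):
    """Heapify, then pop len(a) // 2 elements; the next element to be popped is the median."""
    heapify(a)
    n = len(a)
    for end in range(n - 1, n - 1 - n // 2, -1):
        a[0], a[end] = a[end], a[0]             # pop the current max
        sink(a, 0, end)
    return a[0]                                 # the (n // 2 + 1)-th largest element
```

For example, `median_via_heap([7, 2, 9, 4, 5])` returns 5, the middle element of the sorted list; for even-length input this version returns the lower median.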
## Practice Questions

1. Why is merge sort better than quicksort?
   - Better worst-case running time.
   - Easier to sort a linked list with merge sort.
   - Merge sort is stable.
   - More parallelizable, because the halves of the list don't interact until the end.

2. Which sorts never compare the same two elements twice?

   Answer:
   - Insertion sort: each element is only compared with the earlier elements it moves past.
   - Merge sort: comparisons only happen between two groups being merged for the first time.
   - Quicksort: each element is compared to the pivot, and each element becomes a pivot at most once.

3. Is there a sorting method that runs in best case $\Theta(\log N)$ time for certain inputs?

   Answer: No. To sort, we have to look at every element at least once, so any sort takes at least $\Theta(N)$ time.

4. We are given two arrays with unique elements (bears and beds), and we have to reorder both arrays so that their values match up index by index.

   Answer: The solution modifies quicksort.
   1. Choose a pivot bear from the bears list.
   2. Partition the beds into three groups: those that are less than, equal to, and greater than the pivot bear.
   3. Because the beds and bears have unique sizes, exactly one bed is equal to the pivot bear; call it the pivot bed.
   4. Partition the bears into three groups: those less than, equal to, and greater than the pivot bed.
   5. Match the pivot bear with the pivot bed by adding them to the bears and beds lists at the same index (which is the same as appending them to the end of the output arrays).
   6. The first recursive call handles the beds and bears that are less than their respective pivots.
   7. The second recursive call handles the beds and bears that are greater than their respective pivots.
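Here is a minimal Python sketch of the matched-partition quicksort from question 4, assuming bears and beds are represented by their (unique) numeric sizes; `match_bears_and_beds` is a hypothetical helper name, not something from the original notes. Note that bears are only ever compared against beds and vice versa.

```python
import random


def match_bears_and_beds(bears, beds):
    """Reorder bears and beds (lists of unique sizes) so that bears[i] fits beds[i]."""
    if not bears:
        return [], []
    pivot_bear = random.choice(bears)
    # Partition the beds against the pivot bear.
    small_beds = [bed for bed in beds if bed < pivot_bear]
    pivot_bed = next(bed for bed in beds if bed == pivot_bear)  # exactly one, by uniqueness
    large_beds = [bed for bed in beds if bed > pivot_bear]
    # Partition the bears against the pivot bed (bears are only compared with beds).
    small_bears = [bear for bear in bears if bear < pivot_bed]
    large_bears = [bear for bear in bears if bear > pivot_bed]
    # Recurse on both sides, then place the matched pivots at the same index.
    left_bears, left_beds = match_bears_and_beds(small_bears, small_beds)
    right_bears, right_beds = match_bears_and_beds(large_bears, large_beds)
    return (left_bears + [pivot_bear] + right_bears,
            left_beds + [pivot_bed] + right_beds)
```

For example, `match_bears_and_beds([3, 1, 2], [2, 3, 1])` returns two lists in which the bear and bed at each index have the same size. The recursion mirrors quicksort, so the expected running time is $O(n \log n)$ with a random pivot.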