I guess this is the last SLOG update! It's been interesting keeping up a blog of all my thoughts on the new things we learned and the assignments. I found it rather difficult to keep updating, though, since sometimes I wouldn't know what to talk about, and I also had a tendency to forget to post regularly.
Without further ado, a few brief descriptions of the sorts:
Selection Sort
This sort basically works by finding the maximum or minimum (depending on whether you're sorting in descending or ascending order) and putting it in its correct place (if the maximum, then at the end; if the minimum, then at the start). This process repeats, each time excluding the elements that have already been "sorted"/put into their correct place. A minimal sketch follows the run-time notes below.
Best run-time complexity is: O(n^2)
Even if the list is already sorted, it still has to scan the whole unsorted portion for the minimum/maximum on every pass.
Worst run-time complexity is: O(n^2)
That is the performance on a completely unsorted list; selection sort does the same number of comparisons no matter how the list starts out.
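Here's a minimal sketch of the idea in Python (ascending order, selecting the minimum each pass); this is just an illustration, not the exact code from class:

    def selection_sort(lst):
        """Sort lst in place in ascending order."""
        for i in range(len(lst)):
            # Find the index of the smallest value in the unsorted part lst[i:].
            min_index = i
            for j in range(i + 1, len(lst)):
                if lst[j] < lst[min_index]:
                    min_index = j
            # Swap it into position i; everything before i is already sorted.
            lst[i], lst[min_index] = lst[min_index], lst[i]

The two nested loops are exactly why it's O(n^2) in every case: the inner loop always scans the whole unsorted portion.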
Quick Sort
This sort works by choosing a pivot element and then partitioning the rest of the values around it: values less than the pivot go on the left and values greater than the pivot go on the right. Then you apply the same divide-and-conquer step to everything left of the pivot and everything right of it. Once you get down to a list of size 0 or 1, it's assumed to be sorted. A sketch follows the run-time notes below.
Best run-time complexity is: O(n log n)
Choosing a pivot other than the leftmost or rightmost element, such as the middle element or a random one, avoids this worst case on already-sorted input. Strictly speaking, no fixed pivot choice guarantees O(n log n) on every input, but a random pivot gives O(n log n) in expectation.
Worst run-time complexity is: O(n^2)
If the pivot repeatedly turns out to be the smallest or largest value in its sublist, each partition produces a list of 0 elements and a list with all the remaining elements, so the recursion goes n levels deep. This happens in particular on already-sorted lists when the pivot is the first or last element.
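Here's a rough sketch in Python, using the middle element as the pivot and building new lists for simplicity (an in-place version would partition within the original list instead):

    def quick_sort(lst):
        """Return a new list with the items of lst in ascending order."""
        if len(lst) <= 1:       # lists of size 0 or 1 are assumed sorted
            return lst
        mid = len(lst) // 2
        pivot = lst[mid]        # middle pivot, to dodge the sorted-list worst case
        rest = lst[:mid] + lst[mid + 1:]
        left = [x for x in rest if x <= pivot]    # values <= pivot go left
        right = [x for x in rest if x > pivot]    # values > pivot go right
        return quick_sort(left) + [pivot] + quick_sort(right)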
Merge Sort
This sort splits the list in the middle, then keeps splitting the left and right halves until only lists of one element are left (since those are assumed to be sorted). Then it repeatedly merges pairs of sublists, comparing their front elements, until only one sorted list remains. A sketch follows the run-time notes below.
Best run-time complexity is: O(n) (only with a sortedness check)
Plain merge sort splits and merges no matter what, so it's O(n log n) even on a sorted list; but if you first scan the list to check whether it's already sorted, that check takes O(n), and when it succeeds the log2(n) levels of splitting aren't necessary.
Worst run-time complexity is: O(n log n)
Any unsorted list. Merge sort also uses O(n) extra memory to store the two halves while merging, though.
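A sketch of merge sort in Python, again just to illustrate the idea:

    def merge(left, right):
        """Merge two sorted lists into one sorted list."""
        result = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                result.append(left[i])
                i += 1
            else:
                result.append(right[j])
                j += 1
        # One half is exhausted; the remainder of the other is already sorted.
        return result + left[i:] + right[j:]

    def merge_sort(lst):
        """Return a new list with the items of lst in ascending order."""
        if len(lst) <= 1:       # a list of 0 or 1 elements is sorted
            return lst
        mid = len(lst) // 2
        return merge(merge_sort(lst[:mid]), merge_sort(lst[mid:]))

The extra memory shows up in merge: the result list (plus the sliced halves) takes O(n) space on top of the input.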
From the tutorial, we can see how each sort compares to the others by looking at the graphs:
It's interesting to note the efficiency of bubble sort and timsort (list.sort), and how much the efficiency of selection sort and insertion sort depends on the input.
Hint: Looking at the profile comparisons in tutorial 11 shows that one of the calls that took up quite a huge chunk of time was getting the length of the list. This is bound to hurt for enormous lists, so one way to optimize is to cut down the number of calls to len().
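For example (count_smaller is a made-up function, just to show the pattern), you can compute the length once before the loop instead of calling len() on every iteration:

    def count_smaller(lst, value):
        """Count how many items of lst are less than value."""
        n = len(lst)          # compute the length once, outside the loop
        count = 0
        i = 0
        while i < n:          # instead of: while i < len(lst)
            if lst[i] < value:
                count += 1
            i += 1
        return count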