We looked into some of the most common sorting algorithms: Bubble Sort, Selection Sort, Insertion Sort, Merge Sort and Quick Sort. Tim Sort, the algorithm behind Python's built-in sort method, was also mentioned. We compared these algorithms' efficiencies with each other, using the big-oh notation of their worst cases.
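For reference, here's what using that built-in sort looks like in Python (just a tiny sketch, nothing fancy):

```python
# Python's list.sort() and sorted() both use Timsort under the hood
lst = [4, 1, 3, 2]
lst.sort()             # sorts the list in place
print(lst)             # [1, 2, 3, 4]
print(sorted("cab"))   # sorted() returns a new list: ['a', 'b', 'c']
```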
The big-oh notation is a short way of stating how many steps an algorithm takes to run. It is written as O(n), O(n^3), O(log n), etc., where n is the size of the input; in our case (sorting), n is the length of the input list. We determine the big-oh of an algorithm from the number of steps it takes on an input of size n. For example, a certain algorithm might take 2n^2 + 32n + 54 steps on an input of size n. This algorithm has the big-oh notation O(n^2). As you can see, we disregard the coefficients and the constants and keep only the leading term, since everything other than the leading term becomes insignificant as n gets larger and larger.
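Here's a quick sketch in Python that shows why dropping everything but the leading term is fine (the 2n^2 + 32n + 54 step count is the hypothetical one from above, not a real algorithm):

```python
# Step count of our hypothetical algorithm: 2n^2 + 32n + 54
def steps(n):
    return 2 * n**2 + 32 * n + 54

# As n grows, steps(n) / n^2 settles toward the coefficient 2,
# so the 32n + 54 part becomes insignificant -- hence O(n^2).
for n in [10, 100, 1000, 10000]:
    print(n, steps(n) / n**2)
```

Running this prints ratios of roughly 5.74, 2.33, 2.03, 2.003: the lower-order terms fade away exactly as promised.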
I won't go through all of the sorting algorithms and work out the big-oh notation of each one's worst case, because that would make a looong SLOG post. I'll just give the answers (btw you can find a cheat sheet for these and more here).
Worst case runtime of certain sorting algorithms:
Bubble: O(n^2)
Selection: O(n^2)
Insertion: O(n^2)
Merge: O(n log(n))
Quick: O(n^2)
Yeah, I know they mostly look the same, but keep in mind that these are the worst-case runtimes. In the average case, Merge Sort and Quick Sort are much more efficient than the others (O(n log(n))).
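To make that O(n log(n)) concrete, here's a minimal Merge Sort sketch in Python (my own write-up of the standard algorithm, so the names are just mine): the list gets split in half about log(n) times, and each level of merging does about n work, which is where the n log(n) comes from.

```python
def merge_sort(lst):
    """Return a sorted copy of lst using Merge Sort."""
    if len(lst) <= 1:                 # base case: already sorted
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])      # sort each half recursively
    right = merge_sort(lst[mid:])
    # merge: repeatedly take the smaller front element of the two halves
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # one of these two is already empty
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))   # [1, 2, 5, 7, 9]
```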
Even Barack Obama has some insight on the subject; check it out here: