
If you’ve ever taken a programming course, even a beginner one, or if you have ever tried to program anything more complex than a simple “Hello World” program, you most likely have had to think about both data structures and the algorithms that work on those data structures. Sorting algorithms are among the first types of algorithms studied.

Sometimes it’s difficult to know which sorting algorithm is best for a specific use case, and the answer can also depend on what type of data structure you’re working with. That’s what this article is about: helping you understand which of the most common sorting algorithms may be best to use in different scenarios.

Among the most common sorting algorithms are:

- Insertion Sort
- Bubble Sort
- Selection Sort
- Merge Sort
- Quick Sort

Let’s go over each of these in a little detail to see the differences between them.

**Insertion Sort**

Insertion sort is one of the easier algorithms to implement. The summary description of how this algorithm works is this: *Iterate over each item in the data structure (most of the time an array or vector), starting with the second element. Compare the current element’s value to the element before it. While the preceding element’s value is greater than the current element’s value, shift that element up one position in the set, and keep doing so until you reach an element whose value is not greater than the current element’s value. Then place the current element’s value in that spot.*


**Best Times To Use This Algorithm**

- When the data set is relatively small
- When items in the data set are already somewhat sorted

*Note: Some more advanced algorithms that are much better at sorting large data sets will use insertion sort once they have recursively broken the data set down into smaller data sets.*

**Running Time Complexity**

**Worst-case performance: O(n²)** — So if you have 100 elements in your array, at worst this algorithm will do on the order of 10,000 comparisons. This happens when your data set is sorted in exactly the opposite order from how you wish to sort it.

**Average-case performance: O(n²)** — The average case won’t be quite 10,000 comparisons, but it still grows quadratically, which is slower than O(n) or O(n log n) time.

**Best-case performance: O(n)** — The best case occurs when the data set is already sorted. The algorithm then only needs to iterate over the array once, doing one comparison per element.

**Example code snippet (JavaScript):**

```javascript
function insertionSort(list) {
  const len = list.length;
  for (let i = 1; i < len; i++) {
    const tmp = list[i];
    let j = i - 1;
    // Shift larger elements one position to the right
    while (j >= 0 && list[j] > tmp) {
      list[j + 1] = list[j];
      j--;
    }
    // Insert the current element into its sorted position
    list[j + 1] = tmp;
  }
}
```
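To see the best- and worst-case claims concretely, here is a small sketch (the `countComparisons` helper is hypothetical, added here just for illustration) that counts how many comparisons an insertion sort performs on an already-sorted versus a reverse-sorted 100-element array:

```javascript
// Hypothetical helper for illustration: insertion sort that counts comparisons.
function countComparisons(list) {
  let comparisons = 0;
  for (let i = 1; i < list.length; i++) {
    const tmp = list[i];
    let j = i - 1;
    while (j >= 0) {
      comparisons++;             // one comparison per test against tmp
      if (list[j] <= tmp) break; // stop shifting once the spot is found
      list[j + 1] = list[j];
      j--;
    }
    list[j + 1] = tmp;
  }
  return comparisons;
}

const n = 100;
const sorted = Array.from({ length: n }, (_, k) => k);
const reversed = Array.from({ length: n }, (_, k) => n - k);

console.log(countComparisons(sorted));   // 99   — roughly n, the O(n) best case
console.log(countComparisons(reversed)); // 4950 — n(n-1)/2, the O(n²) worst case
```

Note that even the worst case is about n²/2 comparisons rather than a full n², which is why the 10,000 figure above is an order-of-magnitude estimate.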

**Bubble Sort**

Bubble sort is another algorithm that’s relatively easy to implement. The summary description of how this algorithm works is this: *Compare adjacent pairs of elements, swapping them if the first element in the pair is greater than the second element in the pair. For example, the pair [8, 3] becomes [3, 8] after the comparison. Continue doing this pair by pair through the data set until you reach the last pair of elements. For each pass through the data set, there will be n − x − 1 comparisons, where x is the number of the current pass (0-based); after each pass, the largest remaining element has “bubbled up” to its final position.*


**Best Times To Use This Algorithm**

- When the data set is relatively small
- When items in the data set are already somewhat sorted

*Note: This algorithm is not advised for any large data set, especially one whose elements are nearly sorted in reverse order.*

**Running Time Complexity**

**Worst-case performance: O(n²)** — So if you have 100 elements in your array, at worst this algorithm will do on the order of 10,000 comparisons. This happens when your data set is sorted in exactly the opposite order from how you wish to sort it.

**Average-case performance: O(n²)** — The average case won’t be quite 10,000 comparisons, but it still grows quadratically, which is slower than O(n) or O(n log n) time.

**Best-case performance: O(n)** — The best case occurs when the data set is already sorted. Note that reaching O(n) requires the common optimization of stopping early when a pass makes no swaps; without it, bubble sort does O(n²) comparisons even on sorted input.

**Example code snippet (JavaScript):**

```javascript
function bubbleSort(list) {
  const length = list.length;
  for (let i = 0; i < length; i++) {
    // After each pass, the last i elements are already in place
    for (let j = 0; j < length - i - 1; j++) {
      if (list[j] > list[j + 1]) {
        // Swap adjacent elements that are out of order
        const tmp = list[j];
        list[j] = list[j + 1];
        list[j + 1] = tmp;
      }
    }
  }
}
```
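As a sketch of the early-exit idea, here is a variant (the name `bubbleSortEarlyExit` is my own, for illustration) that tracks whether a pass performed any swaps and stops as soon as a full pass makes none — this is what gives bubble sort its O(n) best case on already-sorted input:

```javascript
// Sketch: bubble sort that exits early once a pass makes no swaps.
function bubbleSortEarlyExit(list) {
  const length = list.length;
  for (let i = 0; i < length; i++) {
    let swapped = false;
    for (let j = 0; j < length - i - 1; j++) {
      if (list[j] > list[j + 1]) {
        const tmp = list[j];
        list[j] = list[j + 1];
        list[j + 1] = tmp;
        swapped = true;
      }
    }
    if (!swapped) break; // no swaps: the list is sorted, stop early
  }
  return list;
}

console.log(bubbleSortEarlyExit([5, 1, 4, 2, 8])); // [1, 2, 4, 5, 8]
```

On input that is already sorted, the first pass makes no swaps and the function returns after a single O(n) scan.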

**Selection Sort**

Selection sort is another algorithm that’s relatively easy to implement. The summary description of how this algorithm works is this: *Iterate over the data set, starting with the first element. Keep a reference to that position, then scan all the remaining elements, tracking the index of the lowest value seen during the scan. When the scan finishes, swap that lowest value with the current element. On each iteration, the starting index is one position higher in the set than on the previous iteration. Once the index reaches the last element in the data set, the set is sorted.*


**Best Times To Use This Algorithm**

- When the data set is relatively small

*Note: This algorithm is not advised on any large data sets. It is very similar to Insertion Sort in many ways, but Insertion Sort would still be more optimal as it has the possibility of fewer comparisons than Selection Sort. However, Selection Sort does have the advantage of fewer writes/swaps than Insertion Sort.*

**Running Time Complexity**

**Worst-case performance: O(n²)** — The worst case is the same as the average and best cases. This algorithm always does O(n²) comparisons to make sure everything is in order, which is one disadvantage of this particular algorithm.

**Average-case performance: O(n²)** — Same as the worst and best cases.

**Best-case performance: O(n²)** — Same as the worst and average cases; even a fully sorted data set requires every comparison.

**Example code snippet (JavaScript):**

```javascript
function selectionSort(list) {
  const length = list.length;
  for (let i = 0; i < length - 1; i++) {
    // Find the index of the smallest remaining element
    let min = i;
    for (let j = i + 1; j < length; j++) {
      if (list[j] < list[min]) {
        min = j;
      }
    }
    if (min !== i) {
      // Swap the numbers
      const tmp = list[i];
      list[i] = list[min];
      list[min] = tmp;
    }
  }
}
```
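To illustrate the fewer-writes advantage mentioned in the note above, here is a sketch (the `countSelectionSwaps` helper is hypothetical, added for illustration) that counts how many swaps selection sort performs on a reverse-sorted 100-element array:

```javascript
// Hypothetical helper for illustration: selection sort that counts swaps.
function countSelectionSwaps(list) {
  let swaps = 0;
  for (let i = 0; i < list.length - 1; i++) {
    let min = i;
    for (let j = i + 1; j < list.length; j++) {
      if (list[j] < list[min]) min = j;
    }
    if (min !== i) {
      const tmp = list[i];
      list[i] = list[min];
      list[min] = tmp;
      swaps++;
    }
  }
  return swaps;
}

// Selection sort needs at most n - 1 swaps; on this reversed input, insertion
// sort would perform n(n-1)/2 = 4950 element shifts.
const reversed = Array.from({ length: 100 }, (_, k) => 100 - k);
console.log(countSelectionSwaps(reversed)); // 50 — far fewer writes than 4950
```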

## Merge Sort

Merge sort is a more advanced algorithm that’s not quite as easy to implement as the algorithms above. The summary description of how this algorithm works is this: *Keep splitting the list in half until it is down to the smallest possible lists, a single element each. Then merge each list with the list adjacent to it, sorting the elements as the two lists are combined, and repeat over the next, larger pairs of lists. Keep doing this until the whole list is sorted. Merge sort is one of the sorting algorithms that normally uses recursion, solving the bigger problem by breaking it down into smaller sub-problems; algorithms of this kind are known as divide-and-conquer algorithms.*


**Best Times To Use This Algorithm**

- Larger data sets
- When there are no tight memory or storage constraints; the most common implementations of merge sort use a secondary list, requiring the same amount of space as the original list, to help with merging
- When items in the data set can be sequentially accessed

**Running Time Complexity**

**Worst-case performance: O(n log n)** — The worst case is the same as the average and best cases.

**Average-case performance: O(n log n)** — Same as the worst and best cases.

**Best-case performance: O(n log n)** — Same as the worst and average cases.

**Example code snippet (JavaScript)**

```javascript
function mergeSort(arr) {
  if (arr.length < 2) {
    return arr;
  }
  const mid = Math.floor(arr.length / 2);
  const subLeft = mergeSort(arr.slice(0, mid));
  const subRight = mergeSort(arr.slice(mid));
  return merge(subLeft, subRight);
}

function merge(left, right) {
  const result = [];
  // Repeatedly take the smaller head element from either list
  while (left.length > 0 && right.length > 0) {
    result.push(left[0] < right[0] ? left.shift() : right.shift());
  }
  // One list is exhausted; append whatever remains of the other
  return result.concat(left.length ? left : right);
}
```
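A merge that consumes its inputs with `shift()` pays an extra cost per call on most engines, since `shift()` moves every remaining element. A common alternative, sketched here under the same function names, merges by index without mutating the sublists:

```javascript
function mergeSort(arr) {
  if (arr.length < 2) return arr;
  const mid = Math.floor(arr.length / 2);
  return merge(mergeSort(arr.slice(0, mid)), mergeSort(arr.slice(mid)));
}

// Index-based merge: O(1) per step, and the input sublists are left intact.
function merge(left, right) {
  const result = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    result.push(left[i] < right[j] ? left[i++] : right[j++]);
  }
  // Append whatever remains of either sublist
  return result.concat(left.slice(i), right.slice(j));
}

console.log(mergeSort([38, 27, 43, 3, 9, 82, 10])); // [3, 9, 10, 27, 38, 43, 82]
```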

## Quicksort

Quicksort is a more advanced algorithm that’s not quite as easy to implement as the simpler algorithms at the start of this article. Like merge sort, it uses the divide-and-conquer approach to sorting. The summary description of how this algorithm works is this: *Pick an element from the data set, called the pivot; this is the point where the data set is divided. Move all elements in the data set that are less than the pivot value to come before the pivot, and all elements greater than it to come after. Then recursively apply the same steps to each of the partitions created by the previous step.*


**Best Times To Use This Algorithm**

- Large data sets
- When the ordering of equal elements is not important. Quicksort is not a stable algorithm, which means that once the data set is ordered, elements with equal values are not guaranteed to appear in the same relative order as in the initial data set.

**Running Time Complexity**

**Worst-case performance: O(n²)** — For quicksort, n² running time is rare; it occurs when the chosen pivots repeatedly produce badly unbalanced partitions (for example, always picking the first element of an already-sorted list). With a sensible pivot strategy, this should not be very worrisome.

**Average-case performance: O(n log n)** — The average case is the same as the best case.

**Best-case performance: O(n log n)** — The best case is the same as the average case.
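For symmetry with the earlier sections, here is a minimal quicksort sketch. This is a simple, not-in-place version that uses extra arrays for the partitions and takes the last element as the pivot; production implementations usually partition in place and choose pivots more carefully:

```javascript
function quickSort(list) {
  if (list.length < 2) {
    return list;
  }
  const pivot = list[list.length - 1]; // simple pivot choice: last element
  const less = [];
  const greaterOrEqual = [];
  // Partition everything except the pivot itself
  for (let i = 0; i < list.length - 1; i++) {
    if (list[i] < pivot) {
      less.push(list[i]);
    } else {
      greaterOrEqual.push(list[i]);
    }
  }
  // Recursively sort each partition, then join around the pivot
  return quickSort(less).concat([pivot], quickSort(greaterOrEqual));
}

console.log(quickSort([10, 7, 8, 9, 1, 5])); // [1, 5, 7, 8, 9, 10]
```

Choosing the last element as pivot keeps the sketch short, but it is exactly the choice that triggers the O(n²) worst case on already-sorted input described above.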

## Conclusion

Now that you are familiar with each of these algorithms, go and write some great performing software that impresses people!