Quicksort is one of the most efficient and most commonly used sorting algorithms. Many language standard libraries use quicksort, or a variant of it, in their built-in sort implementations. The name comes from the fact that, in practice, it sorts data faster than most commonly available sorting algorithms, and like merge sort, it follows the divide-and-conquer principle.
Quicksort, in particular, is an interesting one, though it takes some time to get your head around it. Quicksort breaks down the problem of sorting the complete array into smaller problems by splitting the array into smaller subarrays. This technique of splitting a bigger problem into smaller ones is called the divide-and-conquer approach.
Divide part — Divide the problem into two smaller sub-problems by partitioning the array X[l…r] into two smaller subarrays around some pivot value in the array. The partition algorithm returns pivotIndex (the index the pivot occupies in the sorted array) after the partition.
Left Subproblem — Sorting array X from l to pivotIndex - 1
Right Subproblem — Sorting array X from pivotIndex + 1 to r
Conquer Part — Now, we solve both sub-problems, i.e., sort both of the smaller subarrays recursively. We need not worry about the solutions to the sub-problems: recursion will do the rest of the work for us.
Combine Part — There is no combine step: after sorting both smaller subarrays, the whole array is sorted automatically. Why? Think about it: the partition step already placed the pivot in its final position, with every element to its left no greater than it and every element to its right no smaller, so nothing needs to be merged.
quickSort(array, leftmostIndex, rightmostIndex)
    if (leftmostIndex < rightmostIndex)
        pivotIndex <- partition(array, leftmostIndex, rightmostIndex)
        quickSort(array, leftmostIndex, pivotIndex - 1)
        quickSort(array, pivotIndex + 1, rightmostIndex)
partition(array, leftmostIndex, rightmostIndex)
    pivotElement <- array[rightmostIndex]
    storeIndex <- leftmostIndex - 1
    for i <- leftmostIndex to rightmostIndex - 1
        if array[i] < pivotElement
            storeIndex <- storeIndex + 1
            swap array[i] and array[storeIndex]
    swap array[rightmostIndex] and array[storeIndex + 1]
    return storeIndex + 1
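The partition step above translates directly into Python. Here is a minimal sketch of the Lomuto scheme described by the pseudocode, with a small worked example (the sample data is made up for illustration):

```python
def partition(array, leftmost_index, rightmost_index):
    """Lomuto partition: uses the rightmost element as the pivot and
    returns the pivot's final (sorted) index."""
    pivot_element = array[rightmost_index]
    store_index = leftmost_index - 1
    for i in range(leftmost_index, rightmost_index):
        if array[i] < pivot_element:
            store_index += 1
            array[i], array[store_index] = array[store_index], array[i]
    # Place the pivot just after the last element smaller than it.
    array[store_index + 1], array[rightmost_index] = \
        array[rightmost_index], array[store_index + 1]
    return store_index + 1

data = [8, 3, 5, 1, 4]
pivot_index = partition(data, 0, len(data) - 1)
# data is now [3, 1, 4, 8, 5] and pivot_index == 2:
# the pivot 4 sits at its final position, smaller elements on its
# left, larger ones on its right.
```

Note that the subarrays on either side of the pivot are not themselves sorted yet; the recursive calls take care of that.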
Characteristics of Quicksort
The performance of quicksort is on the order of n log n in most cases. However, if the list of elements is already sorted or nearly sorted and the pivot is chosen naively (for example, always the last element), it takes on the order of n² comparisons to sort the array.
Quicksort has a space complexity of O(log n), even in the worst case, when it is carefully implemented such that in-place partitioning is used; this requires only O(1) extra space. After partitioning, the partition with the fewest elements is (recursively) sorted first, requiring at most O(log n) stack space. The other partition is then sorted using tail recursion or iteration.
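The smaller-partition-first strategy described above can be sketched in Python as follows (function and variable names here are my own; this is an illustrative sketch, not a canonical implementation):

```python
def partition(array, lo, hi):
    # Lomuto scheme: rightmost element as the pivot
    pivot = array[hi]
    store = lo - 1
    for i in range(lo, hi):
        if array[i] < pivot:
            store += 1
            array[i], array[store] = array[store], array[i]
    array[store + 1], array[hi] = array[hi], array[store + 1]
    return store + 1

def quicksort(array, lo, hi):
    """Recurse only into the smaller partition and loop over the larger
    one, keeping the recursion depth at O(log n) even in the worst case."""
    while lo < hi:
        p = partition(array, lo, hi)
        if p - lo < hi - p:                 # left side is smaller
            quicksort(array, lo, p - 1)     # recurse into it...
            lo = p + 1                      # ...then iterate over the right
        else:
            quicksort(array, p + 1, hi)
            hi = p - 1

data = [5, 2, 9, 1, 5, 6]
quicksort(data, 0, len(data) - 1)
# data is now [1, 2, 5, 5, 6, 9]
```

Because the recursive call always handles the partition with at most half the elements, the call stack can never grow deeper than log₂ n frames, regardless of how unlucky the pivots are.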
In-place algorithm: As we have seen above, quicksort practically needs more than constant O(1) space; it needs memory on the order of O(log n) for the recursion stack, which disqualifies it as in-place under the strictest definition. However, quicksort is conventionally considered to be in-place: it sorts the elements within the array itself, with at most a constant number of them held outside the array at any given time.
Unstable: Quicksort has one big difference from merge sort, and it ends up being a key point between these otherwise similar algorithms: quicksort is an unstable algorithm, meaning it is not guaranteed to preserve the order of equal elements as it reorders; two elements that compare equal in the unsorted array could show up in reversed order in the sorted one! Stability is often what causes people to choose merge sort over quicksort, and it is the one area where merge sort is the clear winner.
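A quick way to see the instability is to quicksort records by key and watch two equal keys trade places. The sketch below uses the Lomuto partition on made-up (key, label) pairs, comparing only the key:

```python
def quicksort(items, lo, hi):
    """In-place quicksort of (key, label) pairs, comparing keys only."""
    if lo < hi:
        pivot = items[hi][0]          # rightmost key as the pivot
        store = lo - 1
        for i in range(lo, hi):
            if items[i][0] < pivot:
                store += 1
                items[i], items[store] = items[store], items[i]
        items[store + 1], items[hi] = items[hi], items[store + 1]
        quicksort(items, lo, store)           # left of the pivot
        quicksort(items, store + 2, hi)       # right of the pivot

records = [(2, 'a'), (1, 'x'), (2, 'b'), (1, 'y')]   # 'x' appears before 'y'
quicksort(records, 0, len(records) - 1)
# records is now [(1, 'y'), (1, 'x'), (2, 'a'), (2, 'b')]:
# the two key-1 records come out as 'y' before 'x', so their
# original relative order has been lost.
```

A stable sort such as merge sort (or Python's built-in `sorted`, which is stable) would instead keep 'x' before 'y'.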
Applications of Quick Sort
Before diving into any algorithm, it is worth understanding its real-world applications. Quicksort provides a fast and methodical approach to sorting any list of items. The following are some of the applications where quicksort is used.
Commercial computing: Used in various government and private organizations to sort data such as accounts or profiles by name or ID, transactions by time or location, files by name or creation date, and so on.
Numerical computations: Many efficiently implemented numerical algorithms use priority queues, and in turn sorting, in their calculations.
Information search: Sorted data can be searched far more efficiently (for example, with binary search), and quicksort is a fast way to get the data sorted in the first place.
Implementation in Python
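The code for this section appears to be missing from the article, so here is a minimal Python implementation matching the pseudocode above (Lomuto partition with the rightmost element as the pivot; the sample data is made up for illustration):

```python
def partition(array, leftmost_index, rightmost_index):
    pivot_element = array[rightmost_index]   # rightmost element is the pivot
    store_index = leftmost_index - 1
    for i in range(leftmost_index, rightmost_index):
        if array[i] < pivot_element:
            store_index += 1
            array[i], array[store_index] = array[store_index], array[i]
    # Move the pivot into its final sorted position.
    array[store_index + 1], array[rightmost_index] = \
        array[rightmost_index], array[store_index + 1]
    return store_index + 1

def quick_sort(array, leftmost_index, rightmost_index):
    if leftmost_index < rightmost_index:
        pivot_index = partition(array, leftmost_index, rightmost_index)
        quick_sort(array, leftmost_index, pivot_index - 1)   # left subproblem
        quick_sort(array, pivot_index + 1, rightmost_index)  # right subproblem

data = [8, 7, 2, 1, 0, 9, 6]
quick_sort(data, 0, len(data) - 1)
print(data)  # [0, 1, 2, 6, 7, 8, 9]
```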
Implementation in C++
Worst Case Complexity [Big-O]: O(n²)
It occurs when the pivot element picked is either the greatest or the smallest element. This leads to the case in which the pivot lies at an extreme end of the sorted array: one sub-array is always empty, and the other contains n - 1 elements, so quicksort recurses on a problem that is only one element smaller each time. The algorithm performs much better when the pivot lands near the middle and splits the array evenly.
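This worst case is easy to observe by counting comparisons. In the small experiment below (the counting wrapper is my own), feeding an already-sorted list to a rightmost-pivot quicksort yields exactly n(n-1)/2 comparisons, while a shuffled list needs far fewer:

```python
import random

def quick_sort_counting(array, lo, hi):
    """Quicksort (Lomuto partition, rightmost pivot) that returns the
    number of element comparisons it performed."""
    if lo >= hi:
        return 0
    pivot = array[hi]
    store = lo - 1
    comparisons = 0
    for i in range(lo, hi):
        comparisons += 1
        if array[i] < pivot:
            store += 1
            array[i], array[store] = array[store], array[i]
    array[store + 1], array[hi] = array[hi], array[store + 1]
    p = store + 1
    return (comparisons
            + quick_sort_counting(array, lo, p - 1)
            + quick_sort_counting(array, p + 1, hi))

n = 100
already_sorted = list(range(n))
shuffled = list(range(n))
random.seed(1)
random.shuffle(shuffled)

worst = quick_sort_counting(already_sorted, 0, n - 1)
typical = quick_sort_counting(shuffled, 0, n - 1)
print(worst)    # 4950 comparisons: n*(n-1)/2, the quadratic worst case
print(typical)  # far fewer, roughly proportional to n*log2(n)
```

On the sorted input, every partition peels off just the pivot, so the subproblem sizes go n, n-1, n-2, …, giving the quadratic total.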
Best Case Complexity [Big-omega]: O(n*log n)
It occurs when the pivot element is always the middle element, or near the middle element, of each subarray.
Average Case Complexity [Big-theta]: O(n*log n)
It occurs when neither of the above conditions holds, i.e., for typical, random pivot positions.
Space Complexity: The space complexity for quicksort is O(log n).
With this article at Logicmojo, you should have a complete idea of how to analyze the Quicksort algorithm.
Good Luck & Happy Learning!!