Quick Sort
QuickSort is also a Divide and Conquer algorithm, with a performance of $O(n \log(n))$; in the worst case, however, this becomes $O(n^2)$.
We start by picking an element called the pivot, then we rearrange the array so that every element smaller than the pivot ends up on the left of the pivot and every element larger than the pivot ends up on the right. This operation is called partitioning.
Now we just apply the above steps recursively to each subarray of elements.
Take the following array: [9, 0, 8, 3, 5, 4, 2, 1, 6, 7]
We pick our pivot: the middle element (don't forget the rounding, of course), which here is the value 5.
We start with one index at 0 and one at arraySize - 1, and we compare the values against the pivot, swapping when needed. So here we have 9 and 7: 9 > 5 < 7, so we move the right index one step inward, because 7 is already on its correct side.
Again 9 > 5 < 6, so we move the right index inward once more.
We now see that 9 > 5 > 1, which means we should swap 1 and 9. We can then move both the left and the right index inward.
0 < 5 > 2, increment left side.
8 > 5 > 2, swap and move both indices inward.
3 < 5 > 4, increment the left index. The left index now lands on the pivot itself, so this is the last step.
5 > 4, swap
We now have two subarrays, a left one and a right one; we run this method recursively on these two subarrays.
In the end we will have our sorted array: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
For the implementation, check paragraph §6.4.
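To make the walkthrough concrete, here is a minimal C++ sketch of this scheme (middle element as pivot, two indices moving inward); it is only an illustration, not necessarily the implementation from §6.4.

```cpp
#include <utility>
#include <vector>

// Minimal sketch: sorts arr[left..right] in place, using the middle element
// as the pivot and two indices that move inward, as in the walkthrough.
void quickSort(std::vector<int>& arr, int left, int right) {
    if (left >= right) return;

    int pivot = arr[left + (right - left) / 2];
    int i = left;
    int j = right;

    // Partition step: everything smaller than the pivot ends up on the
    // left side, everything larger on the right side.
    while (i <= j) {
        while (arr[i] < pivot) ++i;  // already on its correct side
        while (arr[j] > pivot) --j;  // already on its correct side
        if (i <= j) {
            std::swap(arr[i], arr[j]);
            ++i;
            --j;
        }
    }

    // Recursion step: apply the same procedure to both subarrays.
    quickSort(arr, left, j);
    quickSort(arr, i, right);
}
```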
Advantages
Fast overall
Disadvantages
Not stable
$O(lg(n))$ extra space
$\Theta(n^2)$ performance in the worst case
Not adaptive
| Worst Case | Average Case | Best Case |
| --- | --- | --- |
| $O(n^2)$ | $O(n \log n)$ | $O(n \log n)$ |
Note that the performance goes towards $O(n)$ when using a three-way partition on inputs with many equal keys.
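As a sketch of what such a three-way partition can look like (the classic Dutch national flag scheme; the name threeWayQuickSort is only illustrative):

```cpp
#include <utility>
#include <vector>

// Sketch of a three-way partition: splits arr[left..right] into regions
// < pivot, == pivot and > pivot in a single pass. With many equal keys the
// middle region grows large, so very little is left to recurse on.
void threeWayQuickSort(std::vector<int>& arr, int left, int right) {
    if (left >= right) return;

    int pivot = arr[left + (right - left) / 2];
    int lt = left;    // arr[left..lt-1]   < pivot
    int i = left;     // arr[lt..i-1]     == pivot
    int gt = right;   // arr[gt+1..right]  > pivot

    while (i <= gt) {
        if (arr[i] < pivot)      std::swap(arr[lt++], arr[i++]);
        else if (arr[i] > pivot) std::swap(arr[i], arr[gt--]);
        else                     ++i;
    }

    // Elements equal to the pivot are already in their final position.
    threeWayQuickSort(arr, left, lt - 1);
    threeWayQuickSort(arr, gt + 1, right);
}
```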
Both parts are the same size
$T(n) = cn + 2T(\frac{n}{2})$
which translates into:
$T(n) = O(n*lg(n))$
Happens when the pivot ends up in the middle of the elements
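Unrolling this recurrence (assuming for simplicity that $n$ is a power of two) shows where the $n \lg(n)$ comes from: every level of the recursion costs $cn$ in total and there are $\lg(n)$ levels.

$$T(n) = cn + 2T\left(\tfrac{n}{2}\right) = 2cn + 4T\left(\tfrac{n}{4}\right) = \dots = cn\lg(n) + nT(1) = O(n\lg(n))$$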
The average case is the tricky part of the analysis for quick sort. Instead of the previous point, where the pivot lands exactly in the middle, we now assume the pivot is equally likely to end up at any position, and we average the execution time over all of those positions.
Because most terms in that average appear twice, it can be reduced to a single sum of $T(i)$ terms.
That is a hard sum, but because this is a full-history recurrence we can remove it: multiply the formula by $n$, write down the same formula for $T(n - 1)$, and subtract the two.
After removing the constant $c$, only a simple recurrence remains.
Now we divide both sides by $n(n + 1)$ and apply the classic telescoping technique.
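Written out, these steps look roughly like this (assuming every pivot position is equally likely, and taking $c = 1$ at the point where the constant is removed):

$$
\begin{aligned}
T(n) &= cn + \frac{2}{n}\sum_{i=0}^{n-1} T(i) \\
nT(n) - (n-1)T(n-1) &= c\left(n^2 - (n-1)^2\right) + 2T(n-1) \\
nT(n) &= (n+1)T(n-1) + c(2n-1) \approx (n+1)T(n-1) + 2n \\
\frac{T(n)}{n+1} &\approx \frac{T(n-1)}{n} + \frac{2}{n+1} = \frac{T(1)}{2} + \sum_{k=3}^{n+1} \frac{2}{k}
\end{aligned}
$$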
The last sum is essentially the harmonic number, which has the value $\ln(n) + O(1)$. This results in $O(n \lg(n))$.
One of the two parts is just one element
Count everything together: each recursive call only peels off a single element, so we pay a full partition pass at every one of the $n$ levels.
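With $cn$ as the cost of a single partition pass, those costs add up to:

$$T(n) = cn + T(n-1) = cn + c(n-1) + T(n-2) = \dots = c\sum_{k=1}^{n} k = c\,\frac{n(n+1)}{2}$$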
This results in our performance becoming $O(n^2)$, which is just as bad as insertion sort. We can avoid this problem by choosing a pivot that splits the array more evenly (for example a random element or the median of three).
Quick Sort has two main steps: the partition step and the recursion step.
In the partition step we put every element smaller than the pivot on the left and every element larger than the pivot on the right.
Then we recursively call our Quick Sort method until the array is sorted.
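As a small usage sketch, assuming the quickSort function from the earlier sketch and the array from the walkthrough:

```cpp
#include <iostream>
#include <vector>

// Declared here; the definition is the quickSort sketch shown earlier.
void quickSort(std::vector<int>& arr, int left, int right);

int main() {
    std::vector<int> values = {9, 0, 8, 3, 5, 4, 2, 1, 6, 7};

    quickSort(values, 0, static_cast<int>(values.size()) - 1);

    for (int v : values) std::cout << v << ' ';  // prints 0 1 2 3 4 5 6 7 8 9
    std::cout << '\n';
}
```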
| Input | 100 | 1.000 | 10.000 | 100.000 | 1.000.000 |
| --- | --- | --- | --- | --- | --- |
| Random Elements | 8e-06 | 0.000105 | 0.001301 | 0.015583 | 0.179599 |
| Ascending Elements | 3e-06 | 2.6e-05 | 0.000329 | 0.004379 | 0.043418 |
| Descending Elements | 3e-06 | 2.7e-05 | 0.000354 | 0.004304 | 0.046504 |
QuickSort is fun to implement and is really fast when we do not hit the worst-case scenario; because we are able to avoid that scenario, it is a commonly used sorting algorithm. It is so common that the C standard library provides it as qsort.
There is also a very famous variant of QuickSort: the dual-pivot QuickSort proposed in 2009, which is faster in practice and has been used in Java since Java 7.